Medical University of South Carolina Grads Certified in Just Culture

View the original article here.

Some CHP grads go through pilot workforce accountability program

By: Heather Woolwine
May 24, 2017

CHARLESTON, SC – Health care administration professionals, who must carefully consider how they evaluate human errors and at-risk behaviors in health care practice by those they manage, often struggle with knowing the best way to encourage transparency and accountability for those errors. To address this knowledge gap in training for health administrators about to hit the workforce, the MUSC College of Health Professions (CHP) piloted a new curriculum approach for this year’s Master of Health Administration (MHA) graduating class that incorporated the Just Culture system of workplace accountability for high-consequence industries. It is the only program in the country to date to offer this additional training and research opportunity to its MHA students.

“Over the years, I have asked former graduate students what they felt unprepared for when they got into the real world of leading individuals within a health care entity,” said Tom Crawford, Ph.D., MHA program assistant professor.  “The answer was the delicate interaction with their employees when things did not go as planned.” To that end, Crawford and his colleagues constructed a curriculum using “Dave’s Subs: A Novel Story About Workplace Accountability” and partnered with the book’s author and Outcome Engenuity principal, David Marx, for a research and education project to develop and refine academic materials and testing related to Just Culture.

Recognized by industry leaders as the “father of the Just Culture movement,” Marx said that Just Culture is about differentiating human errors and at-risk behaviors from more culpable and reckless choices that providers may make in the course of caring for patients. “It works to move away from judging employees based upon an unfortunate outcome, putting more emphasis on the quality of their choices.  In doing this, we create a more accountable, open, learning culture within an organization – which in turn leads to better outcomes,” he said.

MHA students were given the opportunity to take the Just Culture Certification exam and provided feedback on the exam and curriculum throughout the educational partnership. Student Parker Rhoden, who recently passed the exam, said, “The Just Culture certification provided me with a framework to effectively handle difficult human resources decisions, and will be extremely valuable to my career in health administration. This really is an immediate benefit for those of us entering the workplace.”

Jami DelliFraine, Ph.D., CHP Department of Healthcare Leadership and Management chairwoman, echoed Rhoden’s comments. “We see this as an opportunity to bring the incredibly important message of Just Culture to tomorrow’s health care leaders, and through our continuing education program to leaders within the broader community,” she said.

A Call to Action for Every Healthcare Leader in America

By Stephen G. Jones, MD

Strange. Our healthcare profession has among its ranks some of the brightest, most talented, highly-trained individuals to be found. We work in an environment utilizing cutting-edge science and technologies. Yet, we continue to fail at an alarming level in our pledged sacred mission to do no harm. Why?

As a profession, healthcare is often compared to the so-called high-reliability professions of aviation and nuclear power; the question then posed is: why aren’t we in healthcare as good? Could it be that those industries were lucky enough to recruit all the exceptionally talented and gifted people, leaving healthcare with the B-list professionals? Of course not. Perhaps the individuals who work in aviation and nuclear power just care more about their work? Not a chance.

So why, 18 years after the release of the Institute of Medicine’s report To Err Is Human, do we find ourselves as healthcare leaders asking the same questions?

Part of the answer may lie in the Joint Commission’s most recent Sentinel Event Alert, “The Essential Role of Leadership in Developing a Safety Culture” (Issue 57, March 1, 2017).

As a practicing physician with more years of experience than I care to admit, I have been witness to some remarkable changes in our field – many good, some not so good. The Joint Commission most certainly falls into the good category. Over the years, the Joint Commission has continued to evolve as an organization and build on its mission.

The Joint Commission has endeavored to work as more of a supportive partner to hospitals as opposed to regulatory policemen. More than ever, they seek best practices over citations, learning over blame, transparency over hiding. It can be argued that the Joint Commission now holds healthcare organizations to an even higher standard than it did in the past. The goal did not change, the approach did. Is there a lesson here for us?

The Joint Commission chose to build on their own success. In other words, the leadership of the Joint Commission did exactly what they are now fervently pleading with our own healthcare leadership to do:

• Develop and embed a culture of openness that is accountable and just

• Stop punishing well-intentioned employees who make an error; instead, support them and see the error not as an opportunity to blame, but as an opportunity to learn (partner vs. police)

• Foster an open and safe reporting system where everyone is encouraged to speak up without fear of reprisal

• Create an environment of shared learning that focuses on good system design and helping employees make better behavioral choices in a challenging environment

To be sure, the vast majority of very capable and gifted hospital leaders in our country don’t need a lesson from anyone, and certainly not me, on what’s important when it comes to patient care. And to be fair, hospital leaders today, unlike any time in the past, are faced with seemingly insurmountable and never-ending challenges. They operate in a volatile, uncertain, litigious environment often driven by external forces outside their control: hospital boards that expect (among other things) a beautiful environment, near perfect outcomes, a positive operating margin, and wonderful patient and employee satisfaction scores.

Nevertheless, this pivotal release by the Joint Commission is calling upon every healthcare leader to prioritize their efforts on developing a culture of safety, and utilize resources that drive, support, and maintain such a culture.

Building a true culture of safety, a Just Culture, is not easy. In fact it’s downright hard. It doesn’t happen overnight and doesn’t happen without passion and commitment from the very top leadership of any organization. And it certainly doesn’t happen by simply declaring you have a Just Culture and writing a policy around it. A Just Culture needs to be built from the ground up, with a model of systematic learning, and in an environment that wraps itself in the right system of justice.

I urge every hospital leader in America to read, share, and discuss this important Sentinel Event Alert with their extended leadership and embrace its recommendations. I encourage you to reach out to the many resources available to help you on this important journey.

The Joint Commission has provided a clear direction, but more importantly, a compelling challenge for establishing a culture of safety. More than ever, our patients, their families, our staff, all of us, need our top healthcare leaders to embrace this challenge. This country needs our healthcare leaders to lead.

Stephen G. Jones, MD

Medical Director of Safety

Yale New Haven Health System

ISMP Presents Lifetime Achievement Award to David Marx

At its Annual Cheers Awards Dinner on December 6, 2016, ISMP presented a Lifetime Achievement Award to David Marx, CEO of Outcome Engenuity, for his ongoing contributions to healthcare patient safety initiatives and his significant impact on safe medication practices. Recognizing a career spanning three decades, ISMP hailed David Marx as a true pioneer in the safety world. Through the integration of systems engineering, human factors, and the law, David has built working environments that are more resistant to human error and changed the paradigm for how we manage individuals involved in medication errors and other types of failed outcomes.

David has authored a Patient Safety Guide for the National Institutes of Health, advised the US Agency for Healthcare Research and Quality on safety issues, created the Five Rules of Causation for the FAA, and led an external team benchmarking NASA’s Space Shuttle processing. Additionally, he has authored two books on workplace accountability: Whack-a-Mole: The Price We Pay For Expecting Perfection and Dave’s Subs: A Novel Story About Workplace Accountability. In 2005, ISMP awarded David an individual Cheers Award for his development of the Just Culture model. Incorporating lessons learned from aviation, aerospace, transportation, healthcare and other high-risk industries, David continues his efforts to help workplaces achieve highly reliable outcomes through his development of human factors risk modeling methods and as the father of the Just Culture accountability model.

Outcome Engenuity is honored to be recognized by Michael Cohen, founder and CEO of ISMP, a man whose own lifetime achievements have made medication practice safer for all through his tireless work at ISMP.

Pointing the Finger Is a Human Trait: We Must Learn to Do It Well

President Harry S. Truman is shown at his desk at the White House signing a proclamation declaring a national emergency. December 16, 1950. Acme. (USIA) NARA FILE #: 306-PS-50-16807 WAR & CONFLICT BOOK #: 1372

Harry Truman famously kept a sign on his desk in the Oval Office reading “The Buck Stops Here.”  It was an overt declaration that he took ultimate responsibility for every choice his administration made, for everything they did or failed to do.  He assumed the blame.  And that’s an admirable trait in a leader—it builds trust and wins the respect of team members to know their leader is willing to take the blame for mistakes the team makes.  It goes back to justice: no one wants to be blamed for something for which they aren’t personally at fault.  Because whenever something goes wrong, there must be someone to blame, someone to be punished if necessary.  That’s a basic fact of human nature.  People think in terms of cause and effect: if something bad happened, someone or something must have screwed up to cause it.  But admirable as it is for a leader to assume responsibility for everything, blaming the person in charge isn’t enough.  Nor is just blaming the person at the point of failure, or picking a random scapegoat.  Simplistic ways of apportioning responsibility for mistakes, without in-depth analysis, allow us to gloss over the necessary response to failures: to figure out what actually went wrong, and how to fix it.  Pointing the finger is a human trait, but if leaders want their organizations to learn and improve, they must learn to do it well.

The key aspect of a learning culture is the ability to receive feedback that allows leaders to improve systems.  But the ability to improve systems is dependent entirely on the quality of the feedback leaders receive: they can only fix problems if they know what actually caused the problem in the first place.  Which means that they must have a system in place that allows them to gather accurate data and feedback.  Such a system, then, requires a delicate balance.  First, it must ensure that employees trust they will be treated fairly and justly, or they will not honestly report mistakes and areas for improvement.  If there is no sense of justice, there can be no learning culture, because information will not be reported for fear of being treated unjustly.  But the system must also be able to accurately identify the root causes of problems—to point the finger at the right person or people or systemic failure—and hold those responsible accountable.  A “no blame” culture is just as problematic as a strictly punitive culture in terms of learning and improvement.

Only when these two aspects are in place (accurate investigation and accountability combined with a sense of justice and fairness) can leaders learn from mistakes and improve the systems that bring them about. Only if they can identify the person responsible for an error (pointing the finger accurately) can they then identify whether it was indeed a simple error, a risky choice due to an individual or systemic drift from procedural compliance, or a reckless (or even intentionally harmful) action. And only when that has been identified can they then determine whether there were any systemic performance-shaping factors that may have led to the error, factors that can be improved to reduce the likelihood of such an error in the future. Or, if the investigation reveals no such systemic factors were at play, they can decide the appropriate just response (consoling, coaching, retraining, punitive action, etc.). But this response, taking the appropriate action toward the responsible individual(s) and possibly identifying and correcting systemic problems, can only occur if leaders manage to get the first part right and point the finger well. Letting the team leader take the blame may win him or her the respect of the team, but it does nothing for organizational learning and improvement.

Learning to point the finger accurately, to identify the root causes of problems and respond to them appropriately, is not only required for justice and employee trust and morale.  It is a sound business decision.  High-quality systems of investigation and accountability like the Just Culture (Workplace Accountability) Model are an investment in organizational learning and improvement.  When something goes wrong, it is natural to want to point the finger.  But leaders must learn to do it well if they want to make their organizations better.


Aaron Haskins, Outcome Engenuity Advisor.

Sharpening the focus on medical errors

Hospital works to build culture where reporting mistakes is celebrated

This article is a re-post. See the original article here.

— At 6:40 a.m. on a recent day, several dozen nurses, doctors, technicians and other workers gathered in the second-floor surgical center at Sharp Memorial Hospital in San Diego for the daily safety huddle.

All wearing scrubs, their heads covered with blue surgical caps, they held copies of the day’s schedule, listening as surgical care manager Sam Minero pointed out the patients whose circumstances called for a little more awareness.

There were patients with latex allergies in rooms one, five and 10.

“Anything we use to care for those patients must be latex free,” Minero reminded, drawing nods from the gathering.

The patient in room nine had a VRE infection. Over in 17, it was methicillin-resistant Staphylococcus aureus.

Half an hour later, across the medical campus in the outpatient pavilion, Dr. Michael Keefe led his surgical team through a quick timeout, repeating his patient’s name and age and the specific procedure to be performed, noting an allergy to penicillin. All of these items and more had already been gone over with the patient and checked upon arrival in the operating room. But hospital policy calls for the surgeon to run through those items one more time before asking for the scalpel. Only after asking each member of the team if they had a question did the surgeon say “we are set to proceed.”

These procedures inside and outside the OR have been in place at the Kearny Mesa facility for many years, but Sharp has been working over the last two years to make them more meaningful.

The focus on deepening the hospital’s safety culture tightened in 2012, when a surgical team mistakenly removed patient Paul Kibbett’s healthy left kidney even though a cancerous tumor had been discovered on the right. Ultimately, both organs were removed, and the patient was forced to rely on dialysis for the rest of his life. In addition to a lawsuit, Sharp received a $100,000 fine and bad publicity, courtesy of the state’s immediate jeopardy system.

The event caused Sharp, the region’s largest health system with four acute-care hospitals in San Diego County, to do some soul searching.

After all, it was not like the hospital was ignoring standard safety procedures. Hospitals across the nation have had an intense focus on error prevention for more than a decade, mandating new fail-safes often adopted from industries such as aviation and nuclear power where one slip-up can cost many lives.

Checklists, which pilots routinely use to make sure they don’t miss critical steps in the complex task of preparing an airplane for flight, have become common in operating rooms. Some hospitals, including Sharp, have now begun using wireless electronic tracking systems for surgical sponges that allow surgeons to detect a left-behind item even if it is not visible in a patient’s body cavity.

However, there are still plenty of ways things can go wrong. In the Sharp kidney case, the surgeon decided to move forward without having confirmatory X-rays, taken at another facility, up on the digital screen in the operating room for visualization by the whole team despite the fact that hospital policy clearly stated that X-ray verification is required in cases where there could be left-right confusion.

The incident highlighted a simple truth: Rules and procedures only work if all of the people involved actually follow them every single time.

Safety, then, is just as much about a hospital’s culture as it is about having the right policies, procedures, technology and personnel in place.

When someone decides to skip a step, someone else needs to spot that behavior and call it out.

Dr. Gerald Hickson, immediate past chair of the National Patient Safety Foundation and a quality and safety executive at Vanderbilt University Medical Center in Nashville, Tenn., said the true work in increasing hospital safety is changing culture.

“It requires people, process and technology. What has so often happened is, when somebody attempts to put in new procedures like checklists or universal timeouts, surgeon X looks up and says, ‘I’ve been operating for 20 years. I’ve never operated on the wrong side. We don’t have time for this. Move on,’” Hickson said. “That has been, in my view, one of the big factors slowing down the safety movement. Culture trumps everything.”

Sharp seems to agree.

Its latest moves, said hospital medical director Dr. Geoffrey Stiles, have been as much about the way workers collaborate and hold each other accountable as they have been about creating redundant safety policies for high-risk operations.

It has been important, he said, for doctors to understand that a sterling history of safety does not necessarily mean an error-free future.

“One of the big pieces to get them engaged was for them to realize their vulnerability. They were all, for the most part, saying, ‘Yeah, I’m safe, I’m good,’ but, when the American Academy of Orthopedics comes out and says that an orthopedic surgeon has a one-in-four chance of doing a wrong-site surgery sometime in their career, it’s like, ‘OK, I don’t want to be in that 25 percent,’” Stiles said.

Sharp has tried, the director added, to emphasize to its caregivers that reporting errors is a good thing, even creating a “great catch” award complete with a catcher’s mitt trophy and company-wide recognition.

Sharp has also implemented the TeamSTEPPS program created by the U.S. Department of Defense and the Agency for Healthcare Research and Quality, which is designed to improve teamwork.

Surgical nursing manager Michele McCluer went through the training and said its emphasis on encouraging all caregivers to speak up if they see a problem is its most valuable feature.

But encouragement, she noted, is actually not the most critical element of success. Employees, she said, need to know that the corner office has their backs.

“Knowing that we will be backed up by our leadership and our management team is very important,” McCluer said.

Changing culture can force some difficult conversations, and no one knows that better than Dr. Tom Karagianes, the soft-spoken medical director of Sharp Memorial’s outpatient surgery pavilion.

He said that, historically, hospital workers have made allowances for disruptive behavior by doctors. That, he said, is a mistake. Though it may be uncomfortable, it is important for management to back its staff by sharing with doctors that the way they talk to their coworkers has real safety implications. It is more likely, for example, that employees will rush through safety checks without paying proper attention if they feel uncomfortable, rushed or anxious.

“A physician who is borderline disruptive throws everybody’s game off,” Karagianes said.

So far Sharp says these changes to culture have had a positive effect. The health care system has not had a “wrong site” surgery since the kidney removal mixup.

Stiles said the new focus on speaking up has occasionally brought disagreements to his attention.

“We have had some that have said, ‘Yeah, yeah, let’s go,’ and we say, ‘No, we have to do the timeout.’ The staff won’t give them the knife. If need be, they’ll escalate it to me or Tom,” Stiles said.


Just culture can improve safety

Analysis published by International Air Transport Association (IATA) - Airlines International.

February 3 2016

"It is only natural that people and organizations would be less willing to report their errors and other safety issues if they are afraid of punishment or even prosecution"

Just Culture achieved prominent recognition in the European Union (EU) last month, and new provisions calling for the protection of safety-related information are anticipated to be adopted by the ICAO Council very soon. But it is what you do with the tremendous amount of data that a just culture enables to be captured, through various mandatory and voluntary reporting systems, that magnifies its positive effect.

“The new EU regulation is about encouraging aviation personnel to tell their employer when things aren’t working well. It isn’t always that someone has made a mistake, it’s that something hasn’t worked out as expected on this occasion. They need to feel that they are being supported by their employer and that this information is useful and will be used to improve things,” said U.K. Civil Aviation Authority’s Performance Based Regulation Safety Data Lead, Sean Parker.

Beyond Europe, global standards beckon for Just Culture [See box Explaining Just Culture]. In October, ICAO member states filed their responses to proposals that include the addition of Safety Culture to Annex 19 of the Chicago Convention. Safety Culture is a broader concept, of which Just Culture is a part. Just Culture enables a Safety Culture to exist. Following the anticipated final approval in March 2016, ICAO member states could be required to adopt Safety Culture through the amended Annex in November. Experts foresee a 2018-2020 timeframe for Safety Culture’s incorporation into the domestic regulation of the 190 ICAO member states.

Safety Culture, and the need to protect safety data and safety information collected for the purpose of maintaining or improving safety, was a notable theme at the second ICAO High Level Safety Conference, held earlier in 2015 in Montréal. It was agreed that quick progress in this regard is critical for the improvement of aviation safety.

“It is only natural that people and organizations would be less willing to report their errors and other safety issues if they are afraid of punishment or even prosecution,” noted Gilberto Lopez Meyer, IATA Senior Vice President, Safety and Flight Operations. “These protections are essential for the ongoing availability of safety data and safety information, and form the basis of a Just Culture.”

The adoption of Just Culture will not only widen the array of data sources that can feed into a company or industry-wide predictive tool, but also increase the quality of the data provided. It is the predictive data analysis that can deliver more than simply local improvement at an airport or maintenance hangar, which a conventional mandatory reporting system for the immediate line managers may do.

Such predictive analysis can also be useful for accident prevention and investigation. A diversity of data to analyse is good because accidents are, “always a confluence of a variety of different factors, which nobody would ever have guessed would have come together at the same time,” IATA General Counsel Jeffrey Shane said.

The quantity of data produced from mandatory and voluntary reporting systems cannot be overstated. Kenneth Quinn, partner at the legal firm Pillsbury Winthrop Shaw Pittman and head of its aviation practice, said: “You’re getting 10,000 bits of data from the widest possible variety of sources from airlines, as well as voluntary occurrence reports, [and] you’re getting it from repair stations.” And all of that can go into powerful computers.

“A big benefit is you can benchmark against other people,” Quinn said. “If you have five engine shutdowns over the course of a year and airline B has none then you have a higher than normal average of in-flight shutdowns, how are you monitoring things?”

Quinn points to the work the United States’ Federal Aviation Administration has done. All of that occurrence data, Quinn said, “goes into very powerful computers…and you take that and implement mitigation strategies to correct that; it’s having demonstrable safety benefits.” Because of the FAA’s work, Quinn explained, other authorities are examining the potential for such predictive technology based on mass data reporting.

At the European Commission’s Aerodays 2015 conference in London in October, the European Aviation Safety Agency talked about its big data safety project that will spend about 31 million euros from 2015 to 2017. The project will seek to demonstrate an ability to predict an unsafe situation.

The adoption of Safety Culture, and by extension Just Culture, by ICAO will, however, present a challenge to some member states, and access to all the diverse data that could make a difference could be hindered. “We recognize there are sovereign legal systems that regard an accident as, in the first instance, something that is a potential criminal act,” said Australia’s Civil Aviation Safety Authority’s Associate Director of Aviation Safety, Jonathan Aleck.

Australia is not a country that begins with a criminal investigation, Aleck highlighted. Its airlines have adopted Just Cultures and in its latest annual review for 2015, Qantas said: “We are proud of our strong, ‘just culture’ of reporting and our dedication to learning from our experiences. And we strive to maintain an environment that encourages trust and confidence in our people to report hazards and incidents and suggest safety improvements.”

Airlines in some nations do face a criminal investigation team, whether they have Just Cultures or not, according to Aleck. He said: “In some jurisdictions there is a strict program [of aviation regulation]. The idea of just culture doesn’t fit well with those kinds of regimes, where the first people on the scene are often criminal investigators.”

Before the likely cause of the 31 October crash of Metrojet flight 9268 had been identified as a bomb, the Russian authorities’ initial announcement regarding the investigation was that it would begin as a criminal one. The concern is that people who know what led to an accident will say nothing for fear of prosecution, when their information could help stop potentially fatal incidents from occurring again.

Following the expected ICAO decision in March, Shane is positive about changes to national legislation where Just Culture is not already codified. He said: “The benefits can be demonstrated so powerfully that I expect [legislatures to adopt it in the next few years].”

The expansion of Just Culture has gained momentum. Those that have adopted it, or are adopting it now, find that it delivers new insights into how things go wrong. But questions remain as to how far nations whose instinct is to investigate possible criminal action first can succeed in gaining all the possible benefits from the additional information that becomes available.

Michael Comber, IATA Director of Member & External Relations with ICAO, sums it up simply. “It’s a tremendous advantage to have people come forward and speak because it’s the best way to prevent as well as figure out how [accidents] happened.” And that points us in the direction of improving safety.

Explaining Just Culture
Just Culture is defined as an open way of working in which employees are not punished for decisions taken in good faith and commensurate with their experience and training. Employees can report mistakes, their own or others’, knowing that the information will feed into the safety management system.

However, gross negligence, willful violations and destructive acts are not tolerated.

Just Culture has been required within the EU since November under regulation 376/2014, which also renews earlier mandatory reporting law. For Just Culture, the EU requires that organizations provide protection for the reporting staff member and for persons mentioned in the report, rules for confidentiality, and protection from employer reprisal.

Prior to the European law coming into effect, a European declaration in favor of Just Culture was published in October. The declaration is supported by Airports Council International, the European Regions Airline Association, the European Cockpit Association, Aircraft Engineers International, IATA, and other aviation organizations.

Alongside European efforts to implement Just Culture, this non-punitive reporting approach was included in the Australian Civil Aviation Safety Authority’s new regulatory philosophy, published in 2015. Prior to CASA’s action, the United States, New Zealand and the UK had their own rules in place.


Coaching, and Some Choice Four Letter Words

Written by Ellen McDermott, Oe Advisor

Forgive me as I get nostalgic for a moment and reflect back on my youthful days as a platoon leader in the Army.

I was a military police officer stationed in Germany, blessed to serve and command a platoon that was forty soldiers strong.  We were field support, which meant that our main focus was literally out “in the field” living out of our rucksacks training for combat support missions.  Many a day was spent getting muddy and cold, barely sleeping, eating bad food and drinking even worse coffee; and yet, this falls in that wonderful space of being the most stressful and yet the most enjoyable time of my life—all because of the dedicated soldiers with whom I served.

But even the best of us will drift.  And there came one very memorable and very long day in the field when drift became noticeable across the entire platoon—cutting corners, grumbling, moving slowly, and generally just not being the driven soldiers we aspired to be.  The frustration began to trickle up through the squad leaders to my platoon sergeant, and as the hours passed, even my temper started to flare.  And then it happened.  Standing in front of my platoon formation, I said a curse word.  Jaws dropped.

Obviously this was not the first time my soldiers had ever heard a leader curse.  In our world, a curse every other word was pretty much “the norm.”  But in the two years I served as their platoon leader, this was the first time anyone had heard me curse.  And when dismissed from formation, it was now my jaw that dropped, along with my platoon sergeant’s, as the soldiers moved forward with a renewed and vigorous sense of purpose, just from one tiny curse word.  It was so effective that from that day forward, my platoon sergeant would occasionally come to my office and ask, “Ma’am, this is one of those times. Would you please use a curse word?”

I share this story because I want to explain that while coaching is intended to be the highest form of accountability, it isn’t always going to be the hardest form of accountability.  We define coaching as “a values-supportive discussion with the employee on the need to engage in better behavioral choices.”  In later conversations, my very wise platoon sergeant explained to me that the reason that one curse word was so effective was because it was indeed that values-supportive moment. My platoon needed me to acknowledge just how tough that situation was, and by showing my very human moment of frustration, I also showed them that I was standing right there with them in the muck.

We’re very careful as Advisors to never tell anyone exactly how to do coaching, because only you can decide what’s appropriate for your organizational culture.  So am I saying go out and use cursing to get someone’s attention? Absolutely not!  In most workplaces that will be seen as punitive—clearly defeating the intent of coaching.  This was a good fit in this one unique organizational culture at that unique moment.  But I hope as Champions and leaders we don’t become so focused on the formal process that we lose sight of the intent of coaching—that very human moment of saying, “I see the situation, I’m standing here with you in this, and we can do better.”


Outcome Engenuity to Exhibit at the AONE 2016 Annual Meeting

Are you and your colleagues planning to join the thousands of nursing leaders at the American Organization of Nursing Executives (AONE) Annual Meeting 2016? We are. Outcome Engenuity (Oe) is set to be a part of the Exhibition at the AONE 2016 Annual Meeting. Visit our exhibition booth for more on David Marx's latest book, Dave's Subs: A Novel Story about Workplace Accountability. We will have copies on hand to give away. We will also be available for more information on Just Culture training and products.

The AONE 2016 Annual Meeting is scheduled to run March 30 - April 2, 2016, with the Exhibition being held March 31 - April 1.

We hope to see you there!

David Marx presents as core speaker at ISQua International Conference

In October, healthcare leaders from across the world gathered in Doha, Qatar at the International Society for Quality in Health Care (ISQua) International Conference to improve patient safety by sharing innovations and promoting new ideas.

As an international conference devoted to the improvement of patient safety, ISQua sought out the world’s chief healthcare experts to facilitate learning through presentations on their areas of expertise. Among them was Outcome Engenuity CEO and father of the Just Culture movement, David Marx, chosen for his extensive experience pioneering, developing and implementing Just Culture principles in healthcare as well as various other industries across the globe.

A study in the Journal of Patient Safety from 2014 reported that more than 400,000 patients die every year from preventable medical harm. Bearing these numbers in mind, health care organizations increasingly seek justice and accountability within their complex systems to better manage employee behavior, improve learning systems and produce better outcomes (and so improve patient safety). Consequently, this is a matter many health care leaders long to dive into, making Marx a prime candidate as a core speaker at the ISQua International Conference.

Recognizing that hospitals, nursing facilities and ambulatory care facilities all face the task of building a stronger culture of accountability within their organizations, Marx framed his presentation around the movement to create a more accountable culture within the workplace. Marx also discussed the necessity of building a strong reporting and investigative culture, as well as the task of managing behavioral choices, providing many insights for organizations striving to accomplish this task.



UPenn Law’s Quattrone Center receives $350,000 for deep just culture reviews

The Quattrone Center for the Fair Administration of Justice at the University of Pennsylvania Law School was recently awarded $350,000 in funding from the National Institute of Justice to conduct extensive reviews of error in Philadelphia’s criminal justice system using a just culture approach, according to a recent article from the University of Pennsylvania Law School. The funding will go toward the Philadelphia Event Review Team (PERT), which will be launched in early 2016. PERT will assemble major criminal justice agencies to deeply analyze cases with unintended outcomes in Philadelphia’s criminal justice system in order to identify, prioritize and implement reforms across the various agencies. Read the entire Penn Law article for the full story.



A Just Culture encourages the open reporting of errors, omissions or decisions without the possibility of punitive action, a concept widely embraced by regulatory agencies. This open reporting enhances safety regulation within the airline industry. An essential component for success is that the person(s) reporting have the right to do so in strict confidence. It is the erosion of this confidence that the Centre for Aviation (CAPA) speaks out against in its latest report, “Aviation safety vs the ‘prosecutorial imperative’: indiscriminate prosecutions erode safety culture.” The report examines the intersection between law and the safety of air travel, and what IATA General Counsel Jeff Shane aptly labels the “prosecutorial imperative”: the tendency of judges, prosecutors and trial lawyers to seek access to this material. Read the full report at CAPA's website.


Aviation safety vs the “prosecutorial imperative”. Indiscriminate prosecutions erode safety culture


This report contains extensive extracts from the Keynote Remarks of Jeff Shane, IATA General Counsel, to the Tort Trial & Insurance Practice Section of the American Bar Association Aviation and Space Law Committee National Program, in Washington, DC on 22-Oct-2015. Mr Shane addresses a key area of concern to those dedicated to applying lessons learned from airline accidents in the cause of improving air safety.

Major improvements in safety management have come with the advent of voluntary reporting systems, dating back to the 1970s. Mr Shane recounts that these systems have been encouraged by regulators in a number of countries as part of a non-punitive, “just culture” approach to safety regulation. There is an emerging consensus among regulators and airlines alike that a “just culture” approach yields greater benefits than a regime characterized by enforcement penalties. Essential to the success of such systems is that the information furnished through such systems be held in strict confidence.

However, Mr Shane was concerned at a persistent “prosecutorial imperative” - that judges, prosecutors, and trial lawyers often seek access to this material and, “in a growing number of cases, they have succeeded.” If this trend were to continue, says Mr Shane, “the essential flow of safety information would simply dry up” as those with valuable knowledge fear the legal consequences of sharing information.

About the intersection between law and the safety of air travel

2015 is turning out to be a record year for the airline industry. Speaking a couple of days ago at IATA’s World Passenger Symposium in Hamburg, our Director General, Tony Tyler – my boss – announced that for 2015, we expect an industry net profit of $29.3 billion on revenues of $727 billion, for a net profit margin of 4 percent, generating a return on invested capital of 7.5 percent. For the first time, we actually expect the industry on average to create value for its investors. It’s hardly a robust performance compared to other industries -- Apple earned $13.6 billion in the second quarter of this year alone for a 23.4 percent margin -- but for the airline industry, 4 percent is something to celebrate.

But my theme today isn’t the quest for elusive profits in commercial aviation. I want to talk instead about the intersection between law and the safety of air travel, with a focus on some interesting recent developments.

Safety Information Protection - and learning from accidents

A good way to start the discussion might be by reference to the Montreal Convention of 1999. My guess is that the people in this room know better than anyone what things were like before that treaty came into force. Under the old Warsaw/Hague regime, airlines had strict liability for mishaps, but the victims of an accident were entitled to no more than some absurdly low recovery amount, depending on the jurisdiction in which they were eligible to sue. Even in the United States, where airlines were compelled by regulators to increase the damages available through the treaty, the maximum recovery was $75,000 per passenger.

The only way claimants could break those limits was to prove in court that the carrier had been guilty of “willful misconduct” – a gross negligence, reckless endangerment test that engendered many years of costly litigation that was excruciating for claimants and defendants alike. By the mid-‘90s, the airline industry had had enough.

In 1996, through inter-carrier agreements brokered by IATA and the Air Transport Association of America – today’s A4A – the airlines waived the liability limits of Warsaw/Hague. The Montreal Convention of 1999 effectively ratified that waiver. Today, strict liability is still the centerpiece of the regime, but unless the airline can prove that the accident was not due to its own negligence – in other words, prove that it took all available measures to prevent the accident -- claimants are entitled to recover all provable economic damages. The net result is that, as long as a claim falls under MC99, there is no longer any reason to spend years fighting in court over whether the airline was guilty of “willful misconduct.”

Even in the very rare case where the airline successfully asserts the non-negligence defense, claimants are entitled to 113,100 special drawing rights, or about US$160,000 at current conversion rates. The Montreal Convention made the recovery process more humane, to be sure. But it had an even more important benefit. I know that the plaintiffs’ bar prides itself on using the evidentiary tools available in a trial to tease out important facts that might otherwise have gone undiscovered.

Thanks to the Montreal Convention of 1999, the litigation-driven incentive to construe the facts in ways most beneficial to one side or the other has largely gone away.

But we lawyers have a professional responsibility to construe those facts in ways most beneficial to our clients. Fact-finding thus can take a back seat to advocacy. Thanks to the Montreal Convention of 1999, the litigation-driven incentive to construe the facts in ways most beneficial to one side or the other has largely gone away.

After all, every accident, however regrettable, represents an opportunity to make flying safer, as long as we can find out what actually happened. We can say, thanks to these important developments in civil litigation over the past 20 years, that we now have a much better chance of exploiting that opportunity fully.

You should know that since 1996, when the airlines first waived the limits of liability under Warsaw, the fatal accident rate in commercial aviation has steadily declined. In fact, 2014 was the safest year we have ever had. Jet-hull loss per million sectors flown in 2014 was 0.23, the lowest on record. Whether or not you believe that the elimination of “willful misconduct” trials was a factor in that steady decline in accidents, at least we know that the reduction in such litigation had no adverse safety consequences.

The Prosecutorial Imperative vs. “Just Culture”

But there’s another worrisome impediment to learning from occasional mistakes. It is what I will call the prosecutorial imperative.

In too many jurisdictions, the instinct is to treat every accident as a possible crime. There is immediate tension between the technical accident investigators who simply want to find out what happened in the interest of making sure it doesn’t happen again, and the criminal investigators who want to determine whether the accident was attributable to culpable conduct and if so to punish that conduct. I don’t have to tell you what happens when the gendarmes put yellow tape around the scene of an accident and start quizzing witnesses.

Those closest to the event and with the most valuable information hire lawyers and are warned that anything they say may be used against them. Getting the facts becomes much harder.

the prosecutorial imperative can compromise in a fundamental way the over-arching safety ethic

Even more worrying is that the prosecutorial imperative can compromise in a fundamental way the over-arching safety ethic that has been so successfully embedded in the DNA of the aviation industry. I’m talking about the “just culture” approach that is widely treated within the industry as a sine qua non to optimal safety performance. The idea dates back to the 1970s, when the first voluntary reporting systems were established.

It is a simple concept: Companies and their employees are encouraged to report voluntarily any defect, any anomaly, any departure from the norm, anything that might compromise the safety of flight. The information and its source are held in strict confidence. And no punishment follows – either of the employee or of the company. No protection is accorded to criminal activity, of course, but short of that, the information remains sacrosanct.

The great thing about the just culture approach is not merely that it produces timely information that can save lives, but that the information is widely shared among those who can benefit from it. There are searchable online databases containing massive amounts of vitally important safety-related information. With today’s sophisticated analysis and artificial intelligence, it is possible to predict incipient dangerous conditions and remedy them well in advance of an actual system failure.

An Emerging Consensus: deficiencies are best addressed by a "just culture" approach

The value of just culture has been widely acknowledged by regulatory authorities. The FAA last June issued a new “compliance philosophy” (FAA Order 8000.373, June 26, 2015) that places new emphasis on non-punitive means of rectifying deviations from regulatory requirements when disclosed. Noting that some deviations arise from factors like flawed procedures, simple mistakes, lack of understanding, or diminished skills, the FAA believes that such deficiencies “can most effectively be corrected through root cause analysis and training, education or other appropriate improvements to procedures or training programs for regulated entities....” In other words, not through the imposition of penalties. The objective, quite clearly, is to encourage more voluntary reporting in the interest of ensuring that the safety management systems required of all airlines are working optimally.

“CASA embraces, and encourages the development throughout the aviation community of, a ‘just culture,’ in which people are not punished for actions, omissions, or decisions taken by them that are commensurate with their experience, qualifications and training.”

Just last month Australia’s Civil Aviation Safety Authority issued a new statement of regulatory philosophy that even more explicitly embraced the just culture approach. The agency wrote: “CASA embraces, and encourages the development throughout the aviation community of, a ‘just culture,’ in which people are not punished for actions, omissions, or decisions taken by them that are commensurate with their experience, qualifications and training.”

Earlier this month, the European Commission convened a meeting in Brussels to introduce a “European Corporate Just Culture Declaration.” The Declaration said: “It is acknowledged that, in an operational aviation industry environment, individuals, despite their training, expertise, experience, abilities and good will, may be faced with situations where the limits of human performance combined with unwanted and unpredictable systemic influences may lead to an undesirable outcome.”

There’s a bumper sticker that makes the same point in fewer words. It can be paraphrased as “Stuff happens.”

The declaration then continues: “Analysis of reported occurrences by organisations should focus on system performance and contributing factors first and not on apportioning blame and/or focus on individual responsibilities....”

Very clearly, there is an emerging consensus -- among regulatory agencies and the industry -- that encouraging voluntary disclosure of safety information is in everyone’s interest, and that the best way to do so is to apply non-punitive remedies to deficiencies that are voluntarily disclosed.

ICAO and Protection at the Global Level

we have seen too many cases in recent years in which judges, prosecutors, and plaintiffs’ attorneys have sought access to this vitally important safety information.

Despite this consensus, however, we have seen too many cases in recent years in which judges, prosecutors, and plaintiffs’ attorneys have sought access to this vitally important safety information. In a growing number of instances, they have succeeded. If that trend were to continue, you can be assured that the essential flow of safety information would simply dry up.

This danger is increasingly understood and it’s now an issue that’s being tackled globally, most importantly at ICAO.

Five years ago, an ICAO High-level Safety Conference recommended the development of new guidance – what ICAO calls “Standards and Recommended Practices” or “SARPs” – to be included in a new annex to the Chicago Convention devoted to safety management. The annexes to the Convention, as you probably know, are where the high-level principles enunciated in the treaty are turned into more specific and granular guidance. They aren’t self-executing; they have to be implemented through national laws and regulations in order to be effective, but that’s generally what happens. It happens because the quality of a government’s aviation safety oversight is measured by the extent to which it has implemented ICAO’s SARPs and other guidance.

The new SARPs envisioned five years ago were to spell out government responsibilities for the protection of safety information. The protection of information derived from accident investigations was already addressed to some extent in the accident investigation annex -- Annex 13. The new SARPs were intended to reinforce those protections and explicitly cover information reported via the safety management systems that are now a mandatory ingredient in airline operations – including, of course, the voluntary reporting I’ve been talking about. This new guidance will be included in the new safety management annex -- Annex 19. And lest there be any doubt, the protection contemplated is protection from prosecutors, judges, and yes, even trial lawyers.

Some of the most important provisions can be found listed under new “Principles of protection” proposed for Annex 19. The first principle is that “States shall ensure that safety data or safety information is not used for: a) disciplinary, civil, administrative and criminal proceedings against employees, operational personnel or organizations; b) disclosure to the public; or c) any purposes other than maintaining or improving safety; unless a principle of exception applies.”

The “principles of exception” are what you would expect – cases in which the conduct in question clearly crosses the line from an honest mistake into the area of reckless endangerment, gross negligence, willful misconduct, or whatever you want to call it -- conduct that would always be subject to prosecution under applicable national laws.

But the overarching idea, simply put, is that penalizing honest mistakes merely impedes the flow of valuable safety information and thereby actually increases the risk profile of the aviation sector.

ICAO is moving towards a basis for a standard global approach by end-2016

The new provisions were circulated to governments for a final review last July in something ICAO calls a “state letter.” Any further comments from the governments were due a week ago, by October 15. The next step will be a review by ICAO’s Air Navigation Commission with the intention of presenting the language to the ICAO Council – ICAO’s governing body -- for final approval next March. Nobody expects to hear any dissent. The new provisions will then become effective in November of next year.

There is still an open question as to when the new provisions will become applicable to governments – 2018 or 2020 are the options being discussed. As I indicated earlier, nothing in an ICAO annex is self-executing; to be effective and enforceable, the guidance has to be translated into national law by governments. My guess is that a great many governments won’t wait for the new language to become effective but will start their legislative processes working even sooner.

All of this is good news for the airlines, of course, but it is even better news for their customers – including you and me. Aviation is already the safest mode of transportation, and by a wide measure. But air traffic is predicted to double over the course of the next 20 years.

That means that we have an obligation to do all we can to make the remarkable safety management systems we rely upon today even better. The changes in law that I’ve discussed will be an essential element in that improvement.


Bringing Just Culture to the Streets


Making judgments without an understanding of the root cause(s) of a situation stifles both growth and a learning culture. It is absolutely necessary for us to remain impartial in our judgments until we can adequately discern the root cause(s) of the event.

Recently, Jeffrey Brown, a pastor who played a major role in the Boston Miracle, gave a TED Talk testifying to the power of this approach.

In the late 1980s, violence on the streets of Boston was increasing at an alarming rate, and by 1990 Boston’s annual homicide count had reached a peak of 152. But by 1999 that number had dropped to 31, thanks to key leaders within the Boston community. The Boston Miracle, simply stated, was this unprecedented 79 percent drop in the city’s homicides over the span of 10 years, from 1990 to 1999.

In his TED Talk, Brown shared that after a multitude of tragic events, he realized that it was not enough for him to build programs for at-risk youth. He began to seek out the youth actively involved in violence. He soon found himself walking the streets of Boston during the night hours, and by 1992 he and other area pastors had formed the Ten Point Coalition to combat youth violence in the streets of Boston.

Over time, these pastors developed relationships on the streets of Boston during the night hours. They discovered that the individuals whom many dismissed as cold and heartless were the exact opposite of their labels, and were simply trying to “make it on the streets.”  By not rushing to judgment, the pastors were able to engage with the youth and partner with them to change the culture on the streets.

But this journey took time.  It was only when these youths viewed the Ten Point Coalition and law enforcement as legitimate, fair and just that the culture on the streets began to change. This meant the Ten Point Coalition and law enforcement had to consistently take the time to discern what justice meant for each person involved, determining who needed to be helped, who needed to be coached and who needed to be punished. In turn, the Boston-area pastors were able to help the Boston police focus on the truly reckless and intentionally harmful behaviors.

This began the transformation of the street culture and cultivated an atmosphere ripe for justice. With cooperation at all levels, the Boston Miracle occurred, becoming a powerful testimony to the fruit of not rushing to judgment. Even now, others are inspired by the results of Boston’s street transformation in the 1990s. In fact, a group of Baltimore pastors has decided to devote the summer of 2015 to walking the streets of Baltimore at night in hopes of a similar cultural transformation.

Monica Lewinsky addresses the culture of shame

Culture is shaped by our behaviors, and both repetitive human errors and at-risk behaviors can be detrimental to the direction of an organization. For leaders, this is critical to note.

Why? It is simple. Leaders have the authority to shape the system and culture, ultimately determining the direction of an organization. What is allowed and voiced within the workplace either gives room for learning and growth, or squashes learning and growth.

In March 2015, Monica Lewinsky presented the TED Talk “The Price of Shame,” which focused on the effects of cyberbullying. Out of respect for Lewinsky, TED aimed to provide a safe place for her, as it was among her first public appearances in 10 years. However, comments deriding Lewinsky began almost immediately upon the posting of the talk (before the public would have had time to watch the full 20-minute TED Talk).

The very thing that Lewinsky was speaking up about was happening.

With that, three TED employees immediately took control of the situation by aggressively monitoring the comments: they purged the negative comments and replied to the positive ones, bringing the good to the top of the feed. After much deliberate work, the TED employees saw a shift within the public forum—the voices that uplifted, empowered and encouraged Lewinsky were prevailing, changing the forum’s content and culture altogether.

The public began to see what was, and was not, acceptable in the forum.

During her talk, Lewinsky encouraged listeners to become “upstanders,” defending those who are victims in the world’s steep culture of shame. Interestingly, the very call to action Lewinsky gave during her talk was manifesting within the forum—people were becoming “upstanders” for Lewinsky in the midst of a culture of shame.

One commenter wrote: “I am so inspired by her wisdom and courage. I cannot imagine the depths of despair she went through and wow look at the incredible message she is bringing to us now because she survived and is now thriving.”

Clearly, through this case, we see how the TED employees shaped—and ultimately shifted—the culture of the forum, helping to redefine its norms. The TED employees successfully amplified the voices that silenced shame and silenced the voices that encouraged it.

For leaders within the workplace, it is important to understand that by empowering individuals who give voice to desired outcomes, we shape the culture of the workplace in a positive manner, creating an open atmosphere that encourages the desired outcome and discourages the undesired one.

The Lewinsky case makes evident the power of unity to shift a culture; within the workplace, voices that uplift, encourage and empower each other to learn and grow can eventually impact the culture of the organization as a whole.

Australian aviation agency implements Just Culture


It’s simple: mistakes occur—and our response (or lack of response) to mistakes can either propel an organization forward or fully divert its direction. If a mistake goes unaddressed, it could develop into at-risk—and even reckless—behavior. Conversely, if a mistake is brought forward and properly investigated, there is an opportunity for learning and growth.

In an effort to develop a culture where mistakes are genuinely recognized as opportunities to learn and improve, Australia’s Civil Aviation Safety Authority (CASA) recently announced its decision to implement a “just culture” approach to aviation regulation.

"The advantage of a Just Culture approach is that it encourages people to be open and accountable about their mistakes, so there is a better reporting of errors and the ability to learn from them is enhanced,” CASA Director of Aviation Safety, Mark Skidmore, said in an interview with Australian Flying.

Skidmore stated that through this initiative, CASA wants individuals and organizations to understand the root causes of mistakes and how to reduce the likelihood of the same mistake occurring in the future. Moreover, through the implementation of Just Culture, CASA hopes to improve aviation systems as a whole through a commitment to accountability and transparency.

He further emphasized in a later article from Australian Flying the need for cooperation and accountability throughout the aviation community. Without it, the implementation of Just Culture will not impact Australian aviation.

In the latter article, Skidmore also emphasized the need for a structured system such as Just Culture. He explained that such structure creates an atmosphere for open reporting, helping to establish a culture of accountability and workplace transparency.

CASA’s decision to encourage and implement Just Culture throughout Australian aviation underscores the need for industry and organizational cooperation in the effective implementation of Just Culture. It also makes clear how important it is for organizational leaders to commit to properly managing and investigating the root causes of events in order to establish a culture of accountability and workplace transparency.

Learn how to implement Just Culture in your organization by visiting our Live Just Culture Course training page for upcoming Just Culture Certification Course dates and information, or search our Online training page for more resources.

Hospital duties hindered by lack of nursing staff safety

All too often nurses and other hospital staff are harmed at work while diligently performing their duty of providing care for patients. Last week, NPR News released the first of a series of four investigative pieces addressing this failure to protect employee safety within hospitals.

Every day, nurses face circumstances that hinder their ability to adequately meet the needs of their patients without harming themselves—specifically when it comes to their everyday duty of moving and lifting patients.

According to the NPR article, the emphasis hospitals place on a “culture of safety” for nurses falls far short of the emphasis placed on safety for patients—and understandably so, since the patient is the one who is ill or injured.

However, with nurses put in such precarious situations daily, the risks are high. In fact, the Bureau of Labor Statistics (BLS) reported more than 35,000 back and other injuries among nursing employees. In 2013, the BLS reported that orderlies and nursing assistants experience nearly triple the number of musculoskeletal injuries requiring time away from work as police officers, correctional officers and construction laborers.

The article also notes that hospitals are taking little to no aggressive action to address this and to protect their staff from lifting injuries—potentially leading to employees missing work or attempting to work through the pain, hindering the quality of patient care. According to the American Nurses Association (ANA), 10 states “require a comprehensive program in health care facilities” promoting nursing staff safety.

There appears to be a tension between the lack of protection for nursing staff and the duty those staff have to produce an outcome. Is Just Culture the fix to the problem?

For more information on establishing Just Culture, see our Live Training Courses page or our Online Course Training page.

Why the '5 Whys' are not enough for a good investigation


By: John Westphal


Event investigation is a tool within the reactive learning system that we use to extract learning from an event. One of the practices I have employed as a Six Sigma Black Belt and human factors investigator is the '5 Whys' technique. It is used in the Analyze phase of the Six Sigma DMAIC (Define, Measure, Analyze, Improve, Control) methodology, and it seeks to identify the root cause of an event.
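To make the technique concrete, the chain of questions can be pictured as a simple linear walk from the event back through successive causes. The scenario and code below are a minimal illustrative sketch of my own, not an example from any actual investigation:

```python
# Illustrative sketch (hypothetical scenario): a 5 Whys chain is a linear
# walk from the event back through successive causes to a root cause.

event = "Patient received the wrong infusion rate"
whys = [
    "The nurse misread the pump setting",                 # why 1
    "The pump display used an ambiguous unit label",      # why 2
    "The pump was left in a legacy display mode",         # why 3
    "Its configuration was never updated after install",  # why 4
    "No process existed to verify pump configuration",    # why 5 (root cause)
]

def trace(event, whys):
    """Format the causal chain from the event back to the root cause."""
    lines = [f"Event: {event}"]
    lines += [f"  Why {i}: {cause}" for i, cause in enumerate(whys, start=1)]
    return "\n".join(lines)

print(trace(event, whys))
```

Note that the chain is strictly linear: each answer becomes the subject of the next "why," which is precisely the limitation discussed below, since real events usually have branching and probabilistic causes.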

That said, I believe we as organizational leaders have become overly captivated with identifying the root cause. Many organizations have fallen into the trap of believing that if they find the root cause, they have found the one thing to fix to mitigate the risk of the event. In my experience, this leads us down a path of fixing one event at a time rather than addressing common cause failures, which are often further up the causal chain and give us the opportunity to address risk more holistically.

The '5 Whys' is a simple methodology that gives us a basic understanding of the initiating event (the root cause). But to extract all the learning from an event for more effective risk mitigation, we must employ additional methodologies, such as the rules of causation, to establish the cause-and-effect relationship; avoid negative descriptors that are subjective in nature; identify and explain the human errors and at-risk behaviors within the event; and admit only causal factors that carried a preexisting duty to act.

It is from this more sophisticated event analysis that we can filter the noise around the event and better understand the roles of human error, at-risk behavior, mechanical failure, and the environmental and cultural conditions that increased the likelihood of the event. Armed with this approach to reactive learning, we can then classify the failure: was it a design failure, a component failure, or a unique failure? That classification dictates the response to the event, safeguarding the organization from overreacting to single events.
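The idea that the classification, not the severity of the outcome, drives the response can be sketched as a simple lookup. The three categories come from the paragraph above; the response descriptions are my own paraphrase of a proportionate approach, not prescribed wording:

```python
# Illustrative sketch: the failure classification (from the article) maps to
# a proportionate response (my own paraphrase, assumed for illustration).

RESPONSES = {
    "design failure":    "Redesign the system or process; the risk is built in.",
    "component failure": "Fix or replace the failing part and monitor its rate.",
    "unique failure":    "Address the single event locally; do not overreact "
                         "with a system-wide change.",
}

def respond(classification):
    """Look up the proportionate response for a classified failure."""
    return RESPONSES[classification.lower()]

print(respond("Unique failure"))
```

The point of the mapping is the last entry: a unique failure does not justify redesigning the whole system, which is how classification safeguards the organization from overreacting to single events.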

Now that we have classified the failure and understand the direct and probabilistic causal links within the event, we are in a better position to conduct a systemic analysis across multiple events, searching for common cause failures.

It is at this point that we have converted our reactive learning system into a proactive one, breaking causal chains across multiple events and decreasing risk throughout our operational environment.

We can see that, although the '5 Whys' is a simple tool that gives us a basic understanding of an event, it is not enough on its own for a good investigation, one that converts the learning system from reactive to proactive and, eventually, predictive.

For more on event investigation, see our Live Courses information page or our Online Course information page.