A Call to Action for Every Healthcare Leader in America

By Stephen G. Jones, MD

Strange. Our healthcare profession has among its ranks some of the brightest, most talented, highly trained individuals to be found. We work in an environment utilizing cutting-edge science and technologies. Yet we continue to fail at an alarming rate in our sacred, pledged mission to do no harm. Why?

As a profession, healthcare is often compared to the so-called high-reliability industries of aviation and nuclear power; then the question is posed: why aren’t we in healthcare as good? Could it be that those industries were lucky enough to recruit all the exceptionally talented and gifted people, leaving healthcare with the B-list professionals? Of course not. Perhaps the individuals who work in aviation and nuclear power just care more about their work? Not a chance.

So why, 18 years after the release of the Institute of Medicine’s report To Err Is Human, do we find ourselves as healthcare leaders asking the same questions?

Part of the answer may lie in the Joint Commission’s most recent Sentinel Event Alert, “The Essential Role of Leadership in Developing a Safety Culture” (Issue 57, March 1, 2017).

As a practicing physician with more years of experience than I care to admit, I have been witness to some remarkable changes in our field – many good, some not so good. The Joint Commission most certainly falls into the good category. Over the years, the Joint Commission has continued to evolve as an organization and build on its mission.

The Joint Commission has endeavored to work as more of a supportive partner to hospitals than as a regulatory police force. More than ever, it seeks best practices over citations, learning over blame, transparency over hiding. It can be argued that the Joint Commission now holds healthcare organizations to an even higher standard than it did in the past. The goal did not change; the approach did. Is there a lesson here for us?

The Joint Commission chose to build on its own success. In other words, the leadership of the Joint Commission did exactly what it is now fervently pleading with our own healthcare leadership to do:

• Develop and embed a culture of openness that is accountable and just

• Stop punishing well-intentioned employees who make an error. Instead, support them, and see the error not as an opportunity to blame but as an opportunity to learn (partner vs. police)

• Foster an open and safe reporting system where everyone is encouraged to speak up without fear of reprisal

• Create an environment of shared learning that focuses on good system design and helping employees make better behavioral choices in a challenging environment

To be sure, the vast majority of very capable and gifted hospital leaders in our country don’t need a lesson from anyone, and certainly not me, on what’s important when it comes to patient care. And to be fair, hospital leaders today, unlike any time in the past, are faced with seemingly insurmountable and never-ending challenges. They operate in a volatile, uncertain, litigious environment often driven by external forces outside their control: hospital boards that expect (among other things) a beautiful environment, near perfect outcomes, a positive operating margin, and wonderful patient and employee satisfaction scores.

Nevertheless, this pivotal release by the Joint Commission calls upon every healthcare leader to prioritize developing a culture of safety and to utilize resources that drive, support, and maintain such a culture.

Building a true culture of safety, a Just Culture, is not easy. In fact it’s downright hard. It doesn’t happen overnight and doesn’t happen without passion and commitment from the very top leadership of any organization. And it certainly doesn’t happen by simply declaring you have a Just Culture and writing a policy around it. A Just Culture needs to be built from the ground up, with a model of systematic learning, and in an environment that wraps itself in the right system of justice.

I urge every hospital leader in America to read, share, and discuss this important Sentinel Event Alert with their extended leadership and embrace its recommendations. I encourage you to reach out to the many resources available to help you on this important journey.

The Joint Commission has provided a clear direction, but more importantly, a compelling challenge for establishing a culture of safety. More than ever, our patients, their families, our staff, all of us, need our top healthcare leaders to embrace this challenge. This country needs our healthcare leaders to lead.

Stephen G. Jones, MD

Medical Director of Safety

Yale New Haven Health System

Just culture can improve safety

Analysis published by the International Air Transport Association (IATA) in Airlines International.

February 3, 2016


Just Culture achieved prominent recognition in the European Union (EU) last month, and there are new provisions calling for the protection of safety-related information anticipated to be adopted by the ICAO Council very soon. But, it is what you do with the tremendous amount of data that a just culture enables to be captured, through various mandatory and voluntary reporting systems, that magnifies its positive effect.

“The new EU regulation is about encouraging aviation personnel to tell their employer when things aren’t working well. It isn’t always that someone has made a mistake, it’s that something hasn’t worked out as expected on this occasion. They need to feel that they are being supported by their employer and that this information is useful and will be used to improve things,” said U.K. Civil Aviation Authority’s Performance Based Regulation Safety Data Lead, Sean Parker.

Beyond Europe, global standards beckon for Just Culture [See box Explaining Just Culture]. In October, ICAO member states filed their responses to proposals that include the addition of Safety Culture to Annex 19 of the Chicago Convention. Safety Culture is a broader concept, of which Just Culture is a part; Just Culture enables a Safety Culture to exist. Following the anticipated final approval in March 2016, ICAO member states could be required to adopt Safety Culture through the amended Annex in November. Experts foresee a 2018-2020 timeframe for Safety Culture’s incorporation into the domestic regulation of the 190 ICAO member states.

Safety Culture, and the need to protect safety data and safety information collected for the purpose of maintaining or improving safety, was a notable theme at the second ICAO High Level Safety Conference, held earlier in 2015 in Montréal. It was agreed that quick progress in this regard is critical for the improvement of aviation safety.

“It is only natural that people and organizations would be less willing to report their errors and other safety issues if they are afraid of punishment or even prosecution,” noted Gilberto Lopez Meyer, IATA Senior Vice President, Safety and Flight Operations. “These protections are essential for the ongoing availability of safety data and safety information, and forms the basis of a Just Culture.”

The adoption of Just Culture will not only widen the array of data sources that can feed into a company- or industry-wide predictive tool but also increase the quality of the data provided. It is this predictive data analysis that can deliver more than the local improvement at an airport or maintenance hangar that a conventional mandatory reporting system to immediate line managers may achieve.

Such predictive analysis can also be useful for accident prevention and investigation. A diversity of data to analyse is good because accidents are, “always a confluence of a variety of different factors, which nobody would ever have guessed would have come together at the same time,” IATA General Counsel Jeffrey Shane said.

The quantity of data produced from mandatory and voluntary reporting systems cannot be overstated. Kenneth Quinn, partner and head of the aviation practice at the legal firm Pillsbury Winthrop Shaw Pittman, said: “You’re getting 10,000 bits of data from the widest possible variety of sources from airlines, as well as voluntary occurrence reports, [and] you’re getting it from repair stations.” And all of that can go into powerful computers.

“A big benefit is you can benchmark against other people,” Quinn said. “If you have five engine shutdowns over the course of a year and airline B has none then you have a higher than normal average of in-flight shutdowns, how are you monitoring things?”

Quinn points to the work the United States’ Federal Aviation Administration has done. All of that occurrence data, Quinn said, “goes into very powerful computers…and you take that and implement mitigation strategies to correct that; it’s having demonstrable safety benefits.” Because of the FAA’s work, Quinn explained, other authorities are examining the potential for predictive technology based on such mass data reporting.

At the European Commission’s Aerodays 2015 conference in London in October, the European Aviation Safety Agency talked about its big data safety project that will spend about 31 million euros from 2015 to 2017. The project will seek to demonstrate an ability to predict an unsafe situation.

The adoption of Safety Culture, and by extension Just Culture, by ICAO will, however, present a challenge to some member states, and access to all the diverse data that could make a difference could be hindered. “We recognize there are sovereign legal systems that regard an accident as, in the first instance, something that is a potential criminal act,” said Australia’s Civil Aviation Safety Authority’s Associate Director of Aviation Safety, Jonathan Aleck.

Australia is not a country that begins with a criminal investigation, Aleck highlighted. Its airlines have adopted Just Cultures and in its latest annual review for 2015, Qantas said: “We are proud of our strong, ‘just culture’ of reporting and our dedication to learning from our experiences. And we strive to maintain an environment that encourages trust and confidence in our people to report hazards and incidents and suggest safety improvements.”

In some nations, airlines do face a criminal investigation team, whether they have Just Cultures or not, according to Aleck. He said: “In some jurisdictions there is a strict program [of aviation regulation]. The idea of just culture doesn’t fit well with those kinds of regimes, where the first people on the scene are often criminal investigators.”

Before the likely cause of the 31 October crash of Metrojet flight 9268 had been identified as a bomb, the Russian authorities’ initial announcement was that their investigation would begin as a criminal one. The concern is that people who know what led to an accident will say nothing for fear of prosecution, when their information could help stop potentially fatal incidents from occurring again.

Shane is positive that, following the expected ICAO decision in March, national legislation will change where Just Culture is not already codified. He said: “The benefits can be demonstrated so powerfully that I expect [legislatures to adopt it in the next few years].”

The expansion of Just Culture has gained momentum. Those that have adopted it, or are adopting it, find that it delivers new insights into how things go wrong. But questions remain as to how far nations whose instinct is to investigate possible criminal action first can succeed in gaining all the possible benefits from the additional information that becomes available.

Michael Comber, IATA Director, Member & External Relations, ICAO, sums it up simply: “It’s a tremendous advantage to have people come forward and speak because it’s the best way to prevent as well as figure out how [accidents] happened.” And that points us in the direction of improving safety.

Explaining Just Culture
Just Culture is defined as an open way of working in which employees are not punished for decisions taken in good faith that are commensurate with their experience and training. Employees can report mistakes, by themselves or others, knowing that the information will feed into the safety management system.

However, gross negligence, willful violations and destructive acts are not tolerated.

Just Culture has been required within the EU since November under Regulation 376/2014, which also renews earlier mandatory reporting law. For Just Culture, the EU requires that organizations provide protection for the reporting staff member and for persons mentioned in the report, rules for confidentiality, and protection from reprisal by the employer.

Prior to the European law coming into effect, a European declaration in favor of Just Culture was published in October. The declaration is supported by Airports Council International, the European Regions Airline Association, the European Cockpit Association, Aircraft Engineers International, IATA, and other aviation organizations.

As well as European efforts to implement Just Culture, this non-punitive reporting system was included in the Australian Civil Aviation Safety Authority’s new regulatory philosophy published in 2015. Prior to Australia’s CASA action, the United States, New Zealand and the UK had their own rules in place.


Coaching, and Some Choice Four Letter Words

Written by Ellen McDermott, Oe Advisor

Forgive me as I get nostalgic for a moment and reflect back on my youthful days as a platoon leader in the Army.

I was a military police officer stationed in Germany, blessed to serve and command a platoon that was forty soldiers strong.  We were field support, which meant that our main focus was literally out “in the field” living out of our rucksacks training for combat support missions.  Many a day was spent getting muddy and cold, barely sleeping, eating bad food and drinking even worse coffee; and yet, this falls in that wonderful space of being the most stressful and yet the most enjoyable time of my life—all because of the dedicated soldiers with whom I served.

But even the best of us will drift.  And there came one very memorable and very long day in the field when drift became noticeable across the entire platoon—cutting corners, grumbling, moving slowly, and generally just not being the driven soldiers we aspire to be.  The frustration began to trickle up through the squad leaders to my platoon sergeant, and as the hours passed, even my temper started to flare.  And then it happened.  Standing in front of my platoon formation, I said a curse word.  Jaws dropped.

Obviously this was not the first time my soldiers had ever heard a leader curse.  In our world, a curse every other word was pretty much “the norm.”  But in the two years I served as their platoon leader, this was the first time anyone had heard me curse.  And when dismissed from formation, it was now my jaw that dropped, along with my platoon sergeant’s, as the soldiers moved forward with a renewed and vigorous sense of purpose—just from one tiny curse word.  It was so effective that from that day forward, my platoon sergeant would occasionally come to my office and ask, “Ma’am, this is one of those times. Would you please use a curse word?”

I share this story because I want to explain that while coaching is intended to be the highest form of accountability, it isn’t always going to be the hardest form of accountability.  We define coaching as “a values-supportive discussion with the employee on the need to engage in better behavioral choices.”  In later conversations, my very wise platoon sergeant explained to me that the reason that one curse word was so effective was because it was indeed that values-supportive moment. My platoon needed me to acknowledge just how tough that situation was, and by showing my very human moment of frustration, I also showed them that I was standing right there with them in the muck.

We’re very careful as Advisors to never tell anyone exactly how to do coaching, because only you can decide what’s appropriate for your organizational culture.  So am I saying go out and use cursing to get someone’s attention? Absolutely not!  In most workplaces that will be seen as punitive—clearly defeating the intent of coaching.  This was a good fit in this one unique organizational culture at that unique moment.  But I hope as Champions and leaders we don’t become so focused on the formal process that we lose sight of the intent of coaching—that very human moment of saying, “I see the situation, I’m standing here with you in this, and we can do better.”


Systemic Analysis – The Key to Effective Risk Mitigation

By: John Westphal, Senior Advisor

The Achilles heel of the high-consequence industries I have worked in over the years is the overreaction to a single event. In other words, we allow one event to drive systemic changes throughout the organization when that one event may not tell us enough about the risk that existed within that particular socio-technical system, or when the learning system lacked the sophistication to offer a robust enough view of the risk within the identified system.

The question then becomes: which is it, the sophistication of the learning system or the insignificance of the event? Generally, the failure resides in the learning system’s capability to fully explore the richness of the single event, to combine that learning with other event investigations, and to develop an accurate view of the inherent and system risk existing within the operation.

The described failure within the learning system generally occurs for two reasons. First, our single-event investigation struggles to define the appropriate cause and effect relationships that existed within the event. Often we insert non-causal data or non-duties into the event, causing such significant noise that we are unable to articulate the actual risk. Second, due to this failure in our single-event investigations, we become limited in our ability to do precise common cause failure analysis when looking across multiple events. Simply put, it becomes a “garbage in/garbage out” learning system.

In addition to the failures stated above, our learning systems often fail to correctly identify causes related to a class of events. In other words, the causal factor is not relevant to what occurred yesterday but is relevant to what may happen tomorrow or six months from now. A great example of this played out in the movie “Flight,” starring Denzel Washington. In the movie, Washington plays an airline captain with a significant drug and alcohol problem. In an aircraft emergency, Washington engages in extraordinary actions, saving hundreds of lives on board even though he was intoxicated while flying the aircraft. The question becomes: was his intoxication causal with regard to the loss of the aircraft? No. It was a mechanical failure that caused the loss of the aircraft. However, does a pilot who is willing to fly under the influence of alcohol and drugs represent a risk to the system? Yes!

At the end of the day, our learning systems must exhibit three layers of examination. First, we must be able to identify the relevant cause and effect relationships within a single event. Second, once those cause and effect relationships are identified, we must conduct the proper systemic analysis to identify risk mitigation strategies; a general rule of thumb is that 70% to 80% of our interventions should come from systemic analysis. Third, our learning system must also assess causal factors related to the class of event, working to move from a reactive to a proactive, and eventually a predictive, learning system.

Outcome Engenuity to Exhibit at the AONE 2016 Annual Meeting

Are you and your colleagues planning to join the thousands of nursing leaders at the American Organization of Nurse Executives (AONE) Annual Meeting 2016? We are. Outcome Engenuity (Oe) is set to be a part of the Exhibition at the AONE 2016 Annual Meeting. Visit our exhibition booth for more on David Marx's latest book, Dave's Subs: A Novel Story about Workplace Accountability. We will have copies on hand to give away, and we will also be available with more information on Just Culture training and products.

The AONE 2016 Annual Meeting is scheduled to run March 30 - April 2, 2016, with the Exhibition being from March 31 - April 1. For more information, visit: http://www.aone.org/annual-meeting/.

We hope to see you there!

David Marx presents as core speaker at ISQua International Conference

In October, healthcare leaders from across the world gathered in Doha, Qatar, at the International Society for Quality in Health Care (ISQua) International Conference to improve patient safety by sharing innovations and promoting new ideas.

As an international conference devoted to the improvement of patient safety, ISQua sought out the world’s chief healthcare experts to facilitate learning through presentations on their areas of expertise. Among them was Outcome Engenuity CEO and father of Just Culture David Marx, given his extensive experience pioneering, developing, and implementing Just Culture principles in healthcare as well as various other industries across the globe.

A study published in the Journal of Patient Safety in 2014 reported that more than 400,000 patients die every year from preventable medical harm. Bearing these numbers in mind, health care organizations increasingly seek justice and accountability within their complex systems to better manage employee behavior, improve learning systems and produce better outcomes (and so improve patient safety). Consequently, this is a matter many health care leaders long to dive into, making Marx a prime candidate as a core speaker at the ISQua International Conference.

Recognizing that hospitals, nursing facilities and ambulatory care facilities all face the task of building a stronger culture of accountability within their organizations, Marx framed his presentation around the movement to create a more accountable culture within the workplace. Marx also discussed the necessity of building a strong reporting and investigative culture, as well as the task of managing behavioral choices, providing many insights for organizations striving to accomplish this task:

[Slides from Marx's ISQua presentation]


UPenn Law’s Quattrone Center receives $350,000 for deep just culture reviews

The Quattrone Center for the Fair Administration of Justice at the University of Pennsylvania Law School was recently awarded $350,000 in funding from the National Institute of Justice to conduct extensive reviews of error in Philadelphia’s criminal justice system using a just culture approach, according to a recent article from the University of Pennsylvania Law School. The funding will go toward the Philadelphia Event Review Team (PERT), which will launch in early 2016. PERT will assemble major criminal justice agencies to deeply analyze cases with unintended outcomes in Philadelphia’s criminal justice system in order to identify, prioritize, and implement reforms across those agencies. Read the entire article on the Penn Law website.



A Just Culture encourages open reporting of errors, omissions, or decisions without the threat of punitive action, a concept widely embraced by regulatory agencies. This open reporting enhances safety within the airline industry. An essential condition for success is that those reporting can do so in strict confidence. It is this confidence that the Centre for Aviation (CAPA) defends in its latest report, Aviation safety vs the “prosecutorial imperative”. Indiscriminate prosecutions erode safety culture. The report examines the intersection between law and the safety of air travel and what IATA General Counsel Jeff Shane aptly labels the “prosecutorial imperative”: that judges, prosecutors, and trial lawyers often seek access to this material. Read the full report at CAPA's website.


Aviation safety vs the “prosecutorial imperative”. Indiscriminate prosecutions erode safety culture


This report contains extensive extracts from the Keynote Remarks of Jeff Shane, IATA General Counsel, to the Tort Trial & Insurance Practice Section of the American Bar Association Aviation and Space Law Committee National Program, in Washington, DC on 22-Oct-2015. Mr Shane addresses a key area of concern to those dedicated to applying lessons learned from airline accidents in the cause of improving air safety.

Major improvements in safety management have come with the advent of voluntary reporting systems, dating back to the 1970s. Mr Shane recounts that these systems have been encouraged by regulators in a number of countries as part of a non-punitive, “just culture” approach to safety regulation. There is an emerging consensus among regulators and airlines alike that a “just culture” approach yields greater benefits than a regime characterized by enforcement penalties. Essential to the success of such systems is that the information furnished through such systems be held in strict confidence.

However, Mr Shane was concerned at a persistent “prosecutorial imperative” - that judges, prosecutors, and trial lawyers often seek access to this material and, “in a growing number of cases, they have succeeded.” If this trend were to continue, says Mr Shane, “the essential flow of safety information would simply dry up” as those with valuable knowledge fear the legal consequences of sharing information.

About the intersection between law and the safety of air travel

2015 is turning out to be a record year for the airline industry. Speaking a couple of days ago at IATA’s World Passenger Symposium in Hamburg, our Director General, Tony Tyler – my boss – announced that for 2015, we expect an industry net profit of $29.3 billion on revenues of $727 billion, for a net profit margin of 4 percent, generating a return on invested capital of 7.5 percent. For the first time, we actually expect the industry on average to create value for its investors. It’s hardly a robust performance compared to other industries -- Apple earned $13.6 billion in the second quarter of this year alone for a 23.4 percent margin -- but for the airline industry, 4 percent is something to celebrate.

But my theme today isn’t the quest for elusive profits in commercial aviation. I want to talk instead about the intersection between law and the safety of air travel, with a focus on some interesting recent developments.

Safety Information Protection - and learning from accidents

A good way to start the discussion might be by reference to the Montreal Convention of 1999. My guess is that the people in this room know better than anyone what things were like before that treaty came into force. Under the old Warsaw/Hague regime, airlines had strict liability for mishaps, but the victims of an accident were entitled to no more than some absurdly low recovery amount, depending on the jurisdiction in which they were eligible to sue. Even in the United States, where airlines were compelled by regulators to increase the damages available through the treaty, the maximum recovery was $75,000 per passenger.

The only way claimants could break those limits was to prove in court that the carrier had been guilty of “willful misconduct” – a gross negligence, reckless endangerment test that engendered many years of costly litigation that was excruciating for claimants and defendants alike. By the mid-‘90s, the airline industry had had enough.

In 1996, through inter-carrier agreements brokered by IATA and the Air Transport Association of America – today’s A4A – the airlines waived the liability limits of Warsaw/Hague. The Montreal Convention of 1999 effectively ratified that waiver. Today, strict liability is still the centerpiece of the regime, but unless the airline can prove that the accident was not due to its own negligence – in other words, prove that it took all available measures to prevent the accident -- claimants are entitled to recover all provable economic damages. The net result is that, as long as a claim falls under MC99, there is no longer any reason to spend years fighting in court over whether the airline was guilty of “willful misconduct.”

Even in the very rare case where the airline successfully asserts the non-negligence defense, claimants are entitled to 113,100 special drawing rights, or about US$160,000 at current conversion rates.

The Montreal Convention made the recovery process more humane, to be sure. But it had an even more important benefit. I know that the plaintiffs’ bar prides itself on using the evidentiary tools available in a trial to tease out important facts that might otherwise have gone undiscovered.


But we lawyers have a professional responsibility to construe those facts in ways most beneficial to our clients. Fact-finding thus can take a back seat to advocacy. Thanks to the Montreal Convention of 1999, the litigation-driven incentive to construe the facts in ways most beneficial to one side or the other has largely gone away.

After all, every accident, however regrettable, represents an opportunity to make flying safer, as long as we can find out what actually happened. We can say, thanks to these important developments in civil litigation over the past 20 years, that we now have a much better chance of exploiting that opportunity fully.

You should know that since 1996, when the airlines first waived the limits of liability under Warsaw, the fatal accident rate in commercial aviation has steadily declined. In fact, 2014 was the safest year we have ever had. The jet hull-loss rate per million sectors flown in 2014 was 0.23, the lowest on record. Whether or not you believe that the elimination of “willful misconduct” trials was a factor in that steady decline in accidents, at least we know that the reduction in such litigation had no adverse safety consequences.

The Prosecutorial Imperative vs. “Just Culture”

But there’s another worrisome impediment to learning from occasional mistakes. It is what I will call the prosecutorial imperative.

In too many jurisdictions, the instinct is to treat every accident as a possible crime. There is immediate tension between the technical accident investigators who simply want to find out what happened in the interest of making sure it doesn’t happen again, and the criminal investigators who want to determine whether the accident was attributable to culpable conduct and if so to punish that conduct. I don’t have to tell you what happens when the gendarmes put yellow tape around the scene of an accident and start quizzing witnesses.

Those closest to the event and with the most valuable information hire lawyers and are warned that anything they say may be used against them. Getting the facts becomes much harder.


Even more worrying is that the prosecutorial imperative can compromise in a fundamental way the over-arching safety ethic that has been so successfully embedded in the DNA of the aviation industry. I’m talking about the “just culture” approach that is widely treated within the industry as a sine qua non to optimal safety performance. The idea dates back to the 1970s, when the first voluntary reporting systems were established.

It is a simple concept: Companies and their employees are encouraged to report voluntarily any defect, any anomaly, any departure from the norm, anything that might compromise the safety of flight. The information and its source are held in strict confidence. And no punishment follows – either of the employee or of the company. No protection is accorded to criminal activity, of course, but short of that, the information remains sacrosanct.

The great thing about the just culture approach is not merely that it produces timely information that can save lives, but that the information is widely shared among those who can benefit from it. There are searchable online databases containing massive amounts of vitally important safety-related information. With today’s sophisticated analysis and artificial intelligence, it is possible to predict incipient dangerous conditions and remedy them well in advance of an actual system failure.

An Emerging Consensus: deficiencies are best addressed by a "just culture" approach

The value of just culture has been widely acknowledged by regulatory authorities. The FAA last June issued a new “compliance philosophy” (FAA Order 8000.373, June 26, 2015) that places new emphasis on non-punitive means of rectifying deviations from regulatory requirements when disclosed. Noting that some deviations arise from factors like flawed procedures, simple mistakes, lack of understanding, or diminished skills, the FAA believes that such deficiencies “can most effectively be corrected through root cause analysis and training, education or other appropriate improvements to procedures or training programs for regulated entities....” In other words, not through the imposition of penalties. The objective, quite clearly, is to encourage more voluntary reporting in the interest of ensuring that the safety management systems required of all airlines are working optimally.

Just last month Australia’s Civil Aviation Safety Authority issued a new statement of regulatory philosophy that even more explicitly embraced the just culture approach. The agency wrote: “CASA embraces, and encourages the development throughout the aviation community of, a ‘just culture,’ in which people are not punished for actions, omissions, or decisions taken by them that are commensurate with their experience, qualifications and training.”

Earlier this month, the European Commission convened a meeting in Brussels to introduce a “European Corporate Just Culture Declaration.” The Declaration said: “It is acknowledged that, in an operational aviation industry environment, individuals, despite their training, expertise, experience, abilities and good will, may be faced with situations where the limits of human performance combined with unwanted and unpredictable systemic influences may lead to an undesirable outcome.”

There’s a bumper sticker that makes the same point in fewer words. It can be paraphrased as “Stuff happens.”

The declaration then continues: “Analysis of reported occurrences by organisations should focus on system performance and contributing factors first and not on apportioning blame and/or focus on individual responsibilities....”

Very clearly, there is an emerging consensus -- among regulatory agencies and the industry -- that encouraging voluntary disclosure of safety information is in everyone’s interest, and that the best way to do so is to apply non-punitive remedies to deficiencies that are voluntarily disclosed.

ICAO and Protection at the Global Level

Despite this consensus, however, we have seen too many cases in recent years in which judges, prosecutors, and plaintiffs’ attorneys have sought access to this vitally important safety information. In a growing number of instances, they have succeeded. If that trend were to continue, you can be assured that the essential flow of safety information would simply dry up.

This danger is increasingly understood and it’s now an issue that’s being tackled globally, most importantly at ICAO.

Five years ago, an ICAO High-level Safety Conference recommended the development of new guidance – what ICAO calls “Standards and Recommended Practices” or “SARPs” – to be included in a new annex to the Chicago Convention devoted to safety management. The annexes to the Convention, as you probably know, are where the high-level principles enunciated in the treaty are turned into more specific and granular guidance. They aren’t self-executing; they have to be implemented through national laws and regulations in order to be effective, but that’s generally what happens. It happens because the quality of a government’s aviation safety oversight is measured by the extent to which it has implemented ICAO’s SARPs and other guidance.

The new SARPs envisioned five years ago were to spell out government responsibilities for the protection of safety information. The protection of information derived from accident investigations was already addressed to some extent in the accident investigation annex -- Annex 13. The new SARPs were intended to reinforce those protections and explicitly cover information reported via the safety management systems that are now a mandatory ingredient in airline operations – including, of course, the voluntary reporting I’ve been talking about. This new guidance will be included in the new safety management annex -- Annex 19. And lest there be any doubt, the protection contemplated is protection from prosecutors, judges, and yes, even trial lawyers.

Some of the most important provisions can be found listed under new “Principles of protection” proposed for Annex 19. The first principle is that “States shall ensure that safety data or safety information is not used for: a) disciplinary, civil, administrative and criminal proceedings against employees, operational personnel or organizations; b) disclosure to the public; or c) any purposes other than maintaining or improving safety; unless a principle of exception applies.”

The “principles of exception” are what you would expect – cases in which the conduct in question clearly crosses the line from an honest mistake into the area of reckless endangerment, gross negligence, willful misconduct, or whatever you want to call it -- conduct that would always be subject to prosecution under applicable national laws.
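As a rough illustration only, the protection-and-exception logic reads like a simple predicate: the data is shielded from punitive uses unless the underlying conduct crosses into an exception category. The function name, category labels, and rule set below are hypothetical simplifications for this sketch, not actual ICAO or regulatory language.

```python
# Toy model of the Annex 19 "principles of protection" gate.
# All names and categories here are illustrative simplifications,
# not official ICAO definitions.

# Conduct that triggers a "principle of exception" and removes protection.
EXCEPTIONS = {"gross_negligence", "willful_misconduct", "reckless_endangerment"}

# Uses from which voluntarily reported safety data is normally shielded.
PROTECTED_USES = {"disciplinary", "civil", "administrative",
                  "criminal", "public_disclosure"}

def may_use_safety_data(purpose: str, conduct: str = "honest_mistake") -> bool:
    """Return True if safety data may be used for the given purpose.

    Protection covers honest mistakes; conduct falling under an
    exception category remains subject to normal legal process.
    """
    if conduct in EXCEPTIONS:
        return True  # exception applies; ordinary national law governs
    # Otherwise, the data may serve only to maintain or improve safety.
    return purpose not in PROTECTED_USES

print(may_use_safety_data("criminal"))                        # False
print(may_use_safety_data("criminal", "willful_misconduct"))  # True
print(may_use_safety_data("safety_improvement"))              # True
```

The design point the sketch captures is that the default is protection: punitive uses require an affirmative finding that an exception applies, not the other way around.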

But the overarching idea, simply put, is that penalizing honest mistakes merely impedes the flow of valuable safety information and thereby actually increases the risk profile of the aviation sector.

ICAO is moving toward a standard global approach by end-2016

The new provisions were circulated to governments for a final review last July in something ICAO calls a “state letter.” Any further comments from the governments were due a week ago, by October 15. The next step will be a review by ICAO’s Air Navigation Commission with the intention of presenting the language to the ICAO Council – ICAO’s governing body -- for final approval next March. Nobody expects to hear any dissent. The new provisions will then become effective in November of next year.

There is still an open question as to when the new provisions will become applicable to governments – 2018 or 2020 are the options being discussed. As I indicated earlier, nothing in an ICAO annex is self-executing; to be effective and enforceable, the guidance has to be translated into national law by governments. My guess is that a great many governments won’t wait for the new language to become effective but will start their legislative processes even sooner.

All of this is good news for the airlines, of course, but it is even better news for their customers – including you and me. Aviation is already the safest mode of transportation, and by a wide margin. But air traffic is predicted to double over the course of the next 20 years.

That means that we have an obligation to do all we can to make the remarkable safety management systems we rely upon today even better. The changes in law that I’ve discussed will be an essential element in that improvement.


Bringing Just Culture to the Streets


Making judgments without an understanding of the root cause(s) of a situation stifles both growth and a learning culture. It is absolutely necessary for us to remain impartial in our judgments until we can adequately discern the root cause(s) of the event.

Recently, Jeffrey Brown, a pastor who played a major role in the Boston Miracle, presented a TED Talk testifying to the power of this approach.

In the late 1980s, violence on the streets of Boston was increasing at an alarming rate, and by 1990 the city’s homicide count reached a peak of 152. But by 1999, that number had dropped to 31, thanks to key leaders within the Boston community. The Boston Miracle, simply stated, was this unprecedented 79 percent drop in the city’s homicides between 1990 and 1999.

In his TED Talk, Brown shared that after a multitude of tragic events, he realized it was not enough for him to build programs for at-risk youth. He began to seek out the youth actively involved in violence. He soon found himself walking the streets of Boston during the night hours, and by 1992 he and other area pastors had formed the Ten Point Coalition to combat youth violence in the streets of Boston.

Over time, these pastors developed relationships on the streets of Boston during the night hours. They discovered that the individuals whom many dismissed as cold and heartless were the exact opposite of their labels, and were simply trying to “make it on the streets.” By not rushing to judgment, the pastors were able to engage with the youth and partner with them to change the culture on the streets.

But this journey took time. Only when these youths viewed the Ten Point Coalition and law enforcement as legitimate, fair, and just did the culture on the streets begin to change. This meant the Ten Point Coalition and law enforcement had to consistently take the time to discern what justice meant for each person involved, determining who needed to be helped, who needed to be coached, and who needed to be punished. In turn, the Boston-area pastors were able to help the Boston police focus on the truly reckless and intentionally harmful behaviors.

This began the transformation of the street culture and cultivated an atmosphere ripe for justice. With cooperation at all levels, the Boston Miracle occurred, becoming a powerful testimony to the fruit of not rushing to judgment. Even now, others are inspired by Boston’s street transformation in the 1990s. In fact, a group of Baltimore pastors decided to devote the summer of 2015 to walking the streets of Baltimore at night in hopes of a similar cultural transformation.

Monica Lewinsky addresses the culture of shame

Culture is shaped by our behaviors, and both repetitive human errors and at-risk behaviors can be detrimental to the direction of an organization. For leaders, this is critical to note.

Why? It is simple. Leaders have the authority to shape the system and culture, ultimately determining the direction of an organization. What is allowed and voiced within the workplace either gives room for learning and growth, or squashes learning and growth.

In March 2015, Monica Lewinsky presented the TED Talk “The Price of Shame,” which focused on the effects of cyberbullying. Out of respect for Lewinsky, TED aimed to provide a safe space for her, as it was among her first public appearances in 10 years. However, comments deriding Lewinsky began almost immediately upon the posting of the talk (before the public would have had time to watch the full 20-minute presentation).

The very thing that Lewinsky was speaking up about was happening.

With that, three TED employees immediately took control of the situation by aggressively moderating the comments: they purged the negative comments and replied to the positive ones, bringing the good to the top of the feed. After much deliberate work, the TED employees saw a shift within the public forum: the voices that uplifted, empowered, and encouraged Lewinsky were prevailing, changing the forum’s content and culture altogether.

The public began to see what was and was not acceptable in the forum.

During her talk, Lewinsky encouraged listeners to become “upstanders,” defending those who are victims of the world’s pervasive culture of shame. Interestingly, the very call to action Lewinsky gave during her talk was manifesting within the forum: people were becoming “upstanders” for Lewinsky in the midst of a culture of shame.

One commenter wrote: “I am so inspired by her wisdom and courage. I cannot imagine the depths of despair she went through and wow look at the incredible message she is bringing to us now because she survived and is now thriving.”

Clearly, through this case, we see how the TED employees shaped, and ultimately shifted, the culture of the forum, helping to redefine its norms. The TED employees amplified the voices that silenced shame and silenced the voices that encouraged it.

For leaders within the workplace, the lesson is this: by empowering individuals who give voice to desired outcomes, we shape the culture of the workplace in a positive manner, creating an open atmosphere that encourages the desired outcome and discourages the undesired one.

The Lewinsky case makes evident the power of unity to shift a culture. Within the workplace, voices that uplift, encourage, and empower each other to learn and grow can eventually impact the culture of the organization as a whole.

Australian aviation agency implements Just Culture

It’s simple: mistakes occur, and our response (or lack of response) to mistakes can either propel an organization forward or fully divert its direction. If a mistake goes unaddressed, it can develop into at-risk, or even reckless, behavior. However, if a mistake is brought forward and properly investigated, there remains an opportunity for learning and growth.

In an effort to develop a culture where mistakes are genuinely recognized as opportunities to learn and improve, Australia’s Civil Aviation Safety Authority (CASA) recently announced its decision to implement a “just culture” approach to aviation regulation.

"The advantage of a Just Culture approach is that it encourages people to be open and accountable about their mistakes, so there is a better reporting of errors and the ability to learn from them is enhanced,” CASA Director of Aviation Safety, Mark Skidmore, said in an interview with Australian Flying.

Skidmore stated that through this initiative, CASA wants individuals and organizations to understand the root causes of mistakes and how to reduce the likelihood of the same mistake occurring in the future. Even more, through the implementation of Just Culture, CASA hopes to improve aviation systems altogether through a commitment to accountability and transparency.

He further emphasized in a later article from Australian Flying the need for cooperation and accountability throughout the aviation community. Without it, the implementation of Just Culture will not impact Australian aviation.

In the latter article, Skidmore also emphasized the need for a structured system such as Just Culture. He explained that such structure creates an atmosphere for open reporting, helping to establish a culture of accountability and workplace transparency.

CASA’s decision to encourage and implement Just Culture throughout Australian aviation reveals the need for industry-wide and organizational cooperation for effective implementation. It also makes clear how important it is for organizational leaders to commit to properly managing and investigating the root causes of events, in order to establish a culture of accountability and workplace transparency.

Learn how to implement Just Culture in your organization by visiting our Live Just Culture Course training page for upcoming Just Culture Certification Course dates and information, or search our Online training page for more resources.

Hospital duties hindered by lack of nursing staff safety

All too often nurses and other hospital staff are harmed at work while diligently performing their duty of providing care for patients. Last week, NPR News released the first of a series of four investigative pieces addressing this failure to protect employee safety within hospitals.

Every day, nurses face circumstances that hinder their ability to adequately meet their patients’ needs without harming themselves, specifically in their everyday duty of moving and lifting patients.

According to the NPR article, the extent to which hospitals emphasize a “culture of safety” for nurses falls far short of the way it is emphasized for patients, understandably so, since the patient is the one who is ill or injured.

However, with nurses placed in such precarious situations daily, the risks are high. In fact, the Bureau of Labor Statistics (BLS) reported more than 35,000 back and other injuries among nursing employees. In 2013, the BLS reported that orderlies and nursing assistants experience nearly triple the rate of musculoskeletal injuries causing missed work as police officers, correctional officers, and construction laborers.

The article also notes that hospitals are taking little to no aggressive action to address this and protect their staff from lifting injuries, potentially leading to employees missing work or attempting to work through the pain, which in turn hinders the quality of patient care. According to the American Nurses Association (ANA), 10 states within the country “require a comprehensive program in health care facilities” promoting nursing staff safety.

There seems to be a correlation between the lack of protection for nursing staff safety and the duty nursing staff have to produce an outcome. Is Just Culture the fix?

For more information on establishing Just Culture, see our Live Training Courses page or our Online Course Training page.

Why the ‘5 Whys’ are not enough for a good investigation


By: John Westphal


Event investigation is a tool within the reactive learning system that we use to extract learning from an event. One of the practices I have employed as a Six Sigma Black Belt and human factors investigator is the '5 Whys' technique. It is used in the Analyze phase of the Six Sigma DMAIC (Define, Measure, Analyze, Improve, and Control) methodology and seeks to identify the root cause of an event.

That said, as organizational leaders, I believe we have become overly captivated with identifying the root cause. Many organizations have fallen into the trap of believing that if they find the root cause, they have found the one piece to address to mitigate the event's risk. In my experience, this leads us down a path of fixing one event at a time rather than addressing common-cause failures, which are often further up the causal chain and give us the opportunity to address risk in a more holistic fashion.

The '5 Whys' is a simple methodology that gives us a basic understanding of the initiating event (the root cause). But as we seek to extract all the learning from an event for more effective risk mitigation, we must employ additional methodologies (rules of causation) to help us understand the cause-and-effect relationship, avoid negative descriptors that are subjective in nature, identify and explain the human errors and at-risk behaviors within the event, and, lastly, admit only causal factors tied to a preexisting duty to act.
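As a minimal sketch of the idea, a '5 Whys' chain can be modeled as a sequence of cause-and-effect links, with a rules-of-causation guard that rejects causes phrased as subjective judgments. The class name, the descriptor word list, and the hospital example below are all hypothetical illustrations, not part of any official Just Culture toolkit:

```python
from dataclasses import dataclass, field

# Subjective negative descriptors disallowed by the rules of causation;
# this word list is illustrative only.
NEGATIVE_DESCRIPTORS = {"careless", "lazy", "sloppy", "inattentive"}

@dataclass
class FiveWhysChain:
    event: str
    causes: list = field(default_factory=list)

    def ask_why(self, cause: str) -> None:
        """Append the next 'why' answer, rejecting causes phrased as
        subjective judgments rather than testable cause-and-effect claims."""
        if set(cause.lower().split()) & NEGATIVE_DESCRIPTORS:
            raise ValueError(f"subjective descriptor in cause: {cause!r}")
        self.causes.append(cause)

    def root_cause(self) -> str:
        """The last link in the chain is the nominal root cause."""
        return self.causes[-1] if self.causes else self.event

# Hypothetical healthcare example.
chain = FiveWhysChain("Wrong medication dispensed")
chain.ask_why("Barcode scan step was skipped")
chain.ask_why("The unit's scanner was out of service")
chain.ask_why("No spare scanners were stocked")
print(chain.root_cause())  # -> No spare scanners were stocked
```

The guard captures the article's caution: a chain that ends at "the nurse was careless" stops learning, while a chain that ends at a system condition points toward a fix that may address common-cause failures across many events.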

It is from this more sophisticated event analysis that we can filter the noise around the event and better understand the role of human error, at-risk behavior, mechanical failure, and the environmental and cultural conditions that increased the likelihood of the event. Armed with a more sophisticated approach to reactive learning, we then have the opportunity to classify the failure. In other words, was this a design failure, a component failure, or a unique failure? The classification dictates the response to the event, safeguarding the organization from overreacting to single events.

Now that we have the failure classified with a good understanding of the direct and probabilistic causal links within the event, we are in a better position to conduct the systemic analysis across multiple events, searching for the common cause failures.

It is at this point we have now converted our reactive learning system to a proactive learning system, breaking causal chains across multiple events, decreasing the risk throughout our operational environment.

We can see that although the '5 Whys' is a simple tool that gives us a very basic understanding of an event, it is not enough for a good investigation, one that makes it possible to convert the learning system from reactive to proactive and, eventually, predictive.

For more on event investigation, see our Live Courses information page or our Online Course information page.

Outcome Engenuity’s Just Culture Certification Courses for 2014

About This Course:



This course is designed for individuals interested in using the Just Culture Model to improve their leadership effectiveness.

If you are an organization interested in a systemic implementation of Just Culture, we recommend a team from each section of your organization capable of guiding your organization in the training and implementation of the Five Skills Model; in particular, leaders from operations, safety/risk/quality, and HR. This course is open to professionals from all industries.

Note: This is now a 3-day course only.


Just Culture Certification Course Overview:

This is the Just Culture flagship course. It is designed to improve leadership effectiveness through use of the Just Culture model. As a society, we struggle with the question of how to hold human beings accountable when they fail to live up to our expectations or when a significant event occurs. This is true for our justice systems, for our organizations and for us as individual leaders. This course provides the most comprehensive instruction available in the Five Skills Model for achieving better outcomes:

  • Identifying Values and Setting Expectations
  • Improving System Design
  • Managing Behavioral Choices
  • Building and Utilizing Robust Learning Systems
  • Ensuring Justice and Accountability – The Just Culture Algorithm

As a Certified Just Culture Champion you are versed in the history of Just Culture, able to integrate the Five Skills Model into your leadership practices, and highly proficient in the correct application of the Just Culture Algorithm. We discuss how a Just Culture helps in designing systems that anticipate human error, at-risk behavior, and reckless behavior. The Just Culture Algorithm is the premier tool used to achieve both justice and accountability across industries worldwide. During the course, participants will receive hands-on practice and examine how this tool integrates the law with key principles of socio-technical engineering.

Materials Provided In The Course:

  • The Just Culture Algorithm v3.2
  • Event Investigation Toolkit and Online Training
  • Coaching & Mentoring Action Guide and Online Training
  • Just Culture for Managers Workbook and Online Training
  • The Proposition
  • The Final Check Toolkit


You will prepare with online training, demonstrate your proficiency with an exam, and review your exam with a Just Culture Advisor to fine-tune any areas you may be struggling with and discuss any concerns you may have about implementing Just Culture in your organization. Upon certification, you will also gain membership in a community of more than 1,500 Champions worldwide who have demonstrated in-depth Just Culture proficiency and competency in the Just Culture Algorithm™. Access to the Just Culture Certified Champion Network provides you with ongoing resources and learning support tools to sustain your personal and professional development, and provides a forum for sharing organizational experiences and Just Culture best practices.



Free OnDemand Webinar
Just Culture Champions / Creating That Internal Resource - Presented by John Westphal


System Changes and Risky Choices


Most employees will not actively choose to reduce their productivity; most people have enough self-preservation instinct to avoid such an unwise decision. What can happen, however, and frequently does, is that an employee chooses to make small changes to their system. Usually their intentions are good; they’re trying to speed up the process by cutting out a secondary check or by rushing through a minor part of the process.

But those processes were specifically built over time to produce the outcome you desire with a minimal failure rate. Choosing to circumvent the process risks increasing that failure rate, and why would anyone do that? Perhaps they have produced the outcome so consistently for so long without failure that they have forgotten failure is an option. Or maybe they’re trying to improve their production rate. Or perhaps they simply have the confidence in their own skills to keep avoiding failure without following all of the guidelines.

It’s not malicious intent. It’s just that their perception of risk has drifted. They have grown accustomed to the risk; it doesn’t seem as, well, risky as it should. As long as we learn to recognize this drift and get ourselves back on track, we’ll be fine. We can use periodic training to realign our sense of risk. We can change the systems we use to allow for more checks or to add redundancy. Simplest of all, we can keep the lines of communication open, letting workers remind each other of what can happen when we fail.

Once we recognize what’s causing the drop in productivity, we can do something about it. But it can take a little extra effort to look behind the symptom to find the real cause. That’s why we encourage a culture of learning as a vital part of a Just Culture.

When a Human Error Catches Negative Media Attention

What if a human error attracts negative media attention and risks harming your organization’s reputation, or compromises your mission or values? To err is human: humans will make mistakes, and we will even make at-risk choices that lead to errors. Knowing this, should employees be punished when the general public is looking for retribution for a harmful outcome? What role does public opinion have in our management of employees involved in negative outcomes? Keep in mind that even public outrage is an outcome, and it should not be taken into consideration in the evaluation of an event.

Human errors are not good things, yet we must anticipate that they will occur. When your investigation confirms that an event that has garnered public attention is the result of human error, you must look to next steps.

Ideally, we can address negative attention in a Just Culture without compromising the commitments that we have made. Take a deep breath and be willing to stand behind what your organization has decided is the "right" way to practice and manage. Stand up and be open and honest: “Yes, a bad thing happened, but after a thorough investigation we have found that this tragedy is the result of a human error, one that any human could have made. Pursuant to that finding, and in light of the Just Culture principles we have supported since [date], we will continue to support and console the involved employees. We as an organization, however, have taken every step possible to ensure that the circumstances that led up to this mistake have been thoroughly investigated and we have enhanced our systems and processes to ensure that this tragedy cannot be repeated.”

Taking proactive steps to explain your organization's implementation of Just Culture to your local community may help you speak openly and honestly about the choices your organization will make when an event occurs. Punishing an employee based on public outcry can damage the trust that a Just Culture works to build between leadership and employees; your organization will have to decide whether that is a price worth paying.