Medical University of South Carolina Grads Certified in Just Culture


Some CHP grads go through pilot workforce accountability program

By: Heather Woolwine
May 24, 2017

CHARLESTON, SC – Health care administration professionals, who must carefully consider how they evaluate human errors and at-risk behaviors by those they manage, often struggle to find the best way to encourage transparency and accountability around those errors. To address this gap in the training of health administrators about to enter the workforce, the MUSC College of Health Professions (CHP) piloted a new curriculum approach for this year’s Master of Health Administration (MHA) graduating class that incorporates the Just Culture system of workplace accountability for high-consequence industries. It is the only program in the country to date to offer this additional training and research opportunity to its MHA students.

“Over the years, I have asked former graduate students what they felt unprepared for when they got into the real world of leading individuals within a health care entity,” said Tom Crawford, Ph.D., MHA program assistant professor.  “The answer was the delicate interaction with their employees when things did not go as planned.” To that end, Crawford and his colleagues constructed a curriculum using “Dave’s Subs: A Novel Story About Workplace Accountability” and partnered with the book’s author and Outcome Engenuity principal, David Marx, for a research and education project to develop and refine academic materials and testing related to Just Culture.

Recognized by industry leaders as the “father of the Just Culture movement,” Marx said that Just Culture is about differentiating human errors and at-risk behaviors from more culpable and reckless choices that providers may make in the course of caring for patients. “It works to move away from judging employees based upon an unfortunate outcome, putting more emphasis on the quality of their choices.  In doing this, we create a more accountable, open, learning culture within an organization – which in turn leads to better outcomes,” he said.

MHA students were given the opportunity to take the Just Culture Certification exam and provided feedback on the exam and curriculum throughout the educational partnership. Student Parker Rhoden, who recently passed the exam, said, “The Just Culture certification provided me with a framework to effectively handle difficult human resources decisions, and will be extremely valuable to my career in health administration. This really is an immediate benefit for those of us entering the workplace.”

Jami DelliFraine, Ph.D., CHP Department of Healthcare Leadership and Management chairwoman, echoed Rhoden’s comments. “We see this as an opportunity to bring the incredibly important message of Just Culture to tomorrow’s health care leaders, and through our continuing education program to leaders within the broader community,” she said.

A Call to Action for Every Healthcare Leader in America

By: Stephen G. Jones, MD

Strange. Our healthcare profession has among its ranks some of the brightest, most talented, highly-trained individuals to be found. We work in an environment utilizing cutting-edge science and technologies. Yet, we continue to fail at an alarming level in our pledged sacred mission to do no harm. Why?

As a profession, healthcare is often compared to the so-called high-reliability professions of aviation and nuclear power; the question then posed is: why aren’t we in the healthcare profession as good? Could it be that those industries were lucky enough to recruit all the exceptionally talented and gifted people, leaving healthcare with the B-list professionals? Of course not. Perhaps the individuals that work in aviation and nuclear power just care more about their work? Not a chance.

So why, 18 years after the release of the Institute of Medicine’s report To Err is Human, do we find ourselves as healthcare leaders asking the same questions?

Part of the answer may lie in the Joint Commission’s most recent Sentinel Event Alert, “The Essential Role of Leadership in Developing a Safety Culture” (Issue 57, March 1, 2017).

As a practicing physician with more years of experience than I care to admit, I have been witness to some remarkable changes in our field – many good, some not so good. The Joint Commission most certainly falls into the good category. Over the years, the Joint Commission has continued to evolve as an organization and build on its mission.

The Joint Commission has endeavored to work as more of a supportive partner to hospitals as opposed to regulatory policemen. More than ever, they seek best practices over citations, learning over blame, transparency over hiding. It can be argued that the Joint Commission now holds healthcare organizations to an even higher standard than it did in the past. The goal did not change, the approach did. Is there a lesson here for us?

The Joint Commission chose to build on their own success. In other words, the leadership of the Joint Commission did exactly what they are now fervently pleading with our own healthcare leadership to do:

• Develop and embed a culture of openness that is accountable and just

• Stop punishing well-intentioned employees who make an error. Instead, support them, and see the error not as an opportunity to blame but as an opportunity to learn (partner vs. police)

• Foster an open and safe reporting system where everyone is encouraged to speak up without fear of reprisal

• Create an environment of shared learning that focuses on good system design and helping employees make better behavioral choices in a challenging environment

To be sure, the vast majority of very capable and gifted hospital leaders in our country don’t need a lesson from anyone, and certainly not me, on what’s important when it comes to patient care. And to be fair, hospital leaders today, unlike any time in the past, are faced with seemingly insurmountable and never-ending challenges. They operate in a volatile, uncertain, litigious environment often driven by external forces outside their control: hospital boards that expect (among other things) a beautiful environment, near perfect outcomes, a positive operating margin, and wonderful patient and employee satisfaction scores.

Nevertheless, this pivotal release by the Joint Commission calls upon every healthcare leader to prioritize developing a culture of safety and to utilize resources that drive, support, and maintain such a culture.

Building a true culture of safety, a Just Culture, is not easy. In fact it’s downright hard. It doesn’t happen overnight and doesn’t happen without passion and commitment from the very top leadership of any organization. And it certainly doesn’t happen by simply declaring you have a Just Culture and writing a policy around it. A Just Culture needs to be built from the ground up, with a model of systematic learning, and in an environment that wraps itself in the right system of justice.

I urge every hospital leader in America to read, share, and discuss this important Sentinel Event Alert with their extended leadership and embrace its recommendations. I encourage you to reach out to the many resources available to help you on this important journey.

The Joint Commission has provided a clear direction, but more importantly, a compelling challenge for establishing a culture of safety. More than ever, our patients, their families, our staff, all of us, need our top healthcare leaders to embrace this challenge. This country needs our healthcare leaders to lead.

Stephen G. Jones, MD

Medical Director of Safety

Yale New Haven Health System

Pointing the Finger Is a Human Trait: We Must Learn to Do It Well

President Harry S. Truman is shown at his desk at the White House signing a proclamation declaring a national emergency. December 16, 1950. Acme. (USIA) NARA FILE #: 306-PS-50-16807 WAR & CONFLICT BOOK #: 1372

Harry Truman famously kept a sign on his desk in the Oval Office reading “The Buck Stops Here.”  It was an overt declaration that he took ultimate responsibility for every choice his administration made, for everything they did or failed to do.  He assumed the blame.  And that’s an admirable trait in a leader—it builds trust and wins the respect of team members to know their leader is willing to take the blame for mistakes the team makes.  It also goes back to justice: no one wants to be blamed for something for which they aren’t personally at fault.  Yet whenever something goes wrong, we feel there must be someone to blame, someone to be punished if necessary.  That’s a basic fact of human nature: people think in terms of cause and effect, so if something bad happened, someone or something must have screwed up to cause it.  But admirable as it is for a leader to assume responsibility for everything, blaming the person in charge isn’t enough.  Nor is blaming only the person at the point of failure, or picking a random scapegoat.  Simplistic ways of apportioning responsibility for mistakes, without in-depth analysis, allow us to gloss over the necessary response to failures: figuring out what actually went wrong, and how to fix it.  Pointing the finger is a human trait, but if leaders want their organizations to learn and improve, they must learn to do it well.

The key aspect of a learning culture is the ability to receive feedback that allows leaders to improve systems.  But the ability to improve systems is dependent entirely on the quality of the feedback leaders receive: they can only fix problems if they know what actually caused the problem in the first place.  Which means that they must have a system in place that allows them to gather accurate data and feedback.  Such a system, then, requires a delicate balance.  First, it must ensure that employees trust they will be treated fairly and justly, or they will not honestly report mistakes and areas for improvement.  If there is no sense of justice, there can be no learning culture, because information will not be reported for fear of being treated unjustly.  But the system must also be able to accurately identify the root causes of problems—to point the finger at the right person or people or systemic failure—and hold those responsible accountable.  A “no blame” culture is just as problematic as a strictly punitive culture in terms of learning and improvement.

Only when these two aspects are in place (accurate investigation and accountability combined with a sense of justice and fairness) can leaders learn from mistakes and improve the systems that bring them about.  Only if they can identify the person responsible for an error (pointing the finger accurately) can they then identify if it was indeed a simple error, or a risky choice due to an individual or systemic drift from procedural compliance, or reckless (or even intentionally harmful) action.  And only when that has been identified can they then determine if there were any systemic performance shaping factors that may have led to the error—factors that can be improved to reduce the likelihood of such an error in the future.  Or if the investigation reveals no such systemic factors were at play, they can decide the appropriate just response (consoling, coaching, retraining, punitive action, etc).  But this response—taking the appropriate reaction for the responsible individual(s) and possibly identifying and correcting systemic problems—can only occur if the leaders manage to get the first part right and point the finger well.  Letting the team leader take the blame may win him or her the respect of the team, but it does nothing for organizational learning and improvement.

Learning to point the finger accurately, to identify the root causes of problems and respond to them appropriately, is not only required for justice and employee trust and morale.  It is a sound business decision.  High-quality systems of investigation and accountability like the Just Culture (Workplace Accountability) Model are an investment in organizational learning and improvement.  When something goes wrong, it is natural to want to point the finger.  But leaders must learn to do it well if they want to make their organizations better.


Aaron Haskins, Outcome Engenuity Advisor.

Just culture can improve safety

Analysis published by International Air Transport Association (IATA) - Airlines International.

February 3 2016

"It is only natural that people and organizations would be less willing to report their errors and other safety issues if they are afraid of punishment or even prosecution"

Just Culture achieved prominent recognition in the European Union (EU) last month, and new provisions calling for the protection of safety-related information are anticipated to be adopted by the ICAO Council very soon. But it is what you do with the tremendous amount of data that a just culture enables to be captured, through various mandatory and voluntary reporting systems, that magnifies its positive effect.

“The new EU regulation is about encouraging aviation personnel to tell their employer when things aren’t working well. It isn’t always that someone has made a mistake, it’s that something hasn’t worked out as expected on this occasion. They need to feel that they are being supported by their employer and that this information is useful and will be used to improve things,” said U.K. Civil Aviation Authority’s Performance Based Regulation Safety Data Lead, Sean Parker.

Beyond Europe, global standards beckon for Just Culture [See box Explaining Just Culture]. In October, ICAO member states filed their responses to proposals that include the addition of Safety Culture to Annex 19 of the Chicago Convention. Safety Culture is a broader concept, of which Just Culture is a part: Just Culture enables a Safety Culture to exist. Following the anticipated final approval in March 2016, ICAO member states could be required to adopt Safety Culture through the amended Annex in November. Experts foresee a 2018-2020 timeframe for Safety Culture’s incorporation into the domestic regulation of the 190 ICAO member states.

Safety Culture and the need to protect safety data and safety information, collected for the purpose of maintaining or improving safety, was a notable theme at the second ICAO High Level Safety Conference, held earlier in 2015 in Montréal. It was agreed that quick progress in this regard is critical for the improvement of aviation safety.

“It is only natural that people and organizations would be less willing to report their errors and other safety issues if they are afraid of punishment or even prosecution,” noted Gilberto Lopez Meyer, IATA Senior Vice President, Safety and Flight Operations. “These protections are essential for the ongoing availability of safety data and safety information, and form the basis of a Just Culture.”

The adoption of Just Culture will not only widen the array of data sources that can feed into a company or industry-wide predictive tool, but also increase the quality of the data provided. It is the predictive data analysis that can deliver more than simply local improvement at an airport or maintenance hangar, which is all a conventional mandatory reporting system to immediate line managers may achieve.

Such predictive analysis can also be useful for accident prevention and investigation. A diversity of data to analyse is good because accidents are, “always a confluence of a variety of different factors, which nobody would ever have guessed would have come together at the same time,” IATA General Counsel Jeffrey Shane said.

The quantity of data produced from mandatory and voluntary reporting systems cannot be overstated. Kenneth Quinn, partner and head of the aviation practice at law firm Pillsbury Winthrop Shaw Pittman, said: “You’re getting 10,000 bits of data from the widest possible variety of sources from airlines, as well as voluntary occurrence reports, [and] you’re getting it from repair stations.” And all of that can go into powerful computers.

“A big benefit is you can benchmark against other people,” Quinn said. “If you have five engine shutdowns over the course of a year and airline B has none then you have a higher than normal average of in-flight shutdowns, how are you monitoring things?”
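Quinn’s benchmarking point can be sketched in a few lines. The airline names and shutdown counts below are hypothetical, invented purely for illustration: each carrier’s in-flight shutdown count is compared to the fleet-wide average, and anything above it is flagged for a closer look.

```python
# Hypothetical in-flight shutdown (IFSD) counts per airline over one year.
# Airline names and figures are invented for illustration, not real safety data.
shutdowns = {"Airline A": 5, "Airline B": 0, "Airline C": 1}

# The fleet-wide average gives each carrier something to benchmark against.
fleet_average = sum(shutdowns.values()) / len(shutdowns)

# Carriers above the fleet average are flagged for a closer look at how
# they are monitoring their engines.
outliers = {name: n for name, n in shutdowns.items() if n > fleet_average}

print(fleet_average)  # 2.0
print(outliers)       # {'Airline A': 5}
```

A real predictive tool would of course normalize by flight hours or cycles and draw on far richer occurrence data; the point of the sketch is simply that benchmarking only works when every carrier reports honestly.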

Quinn points to the work the United States’ Federal Aviation Administration has done. All of that occurrence data, Quinn said, “goes into very powerful computers…and you take that and implement mitigation strategies to correct that; it’s having demonstrable safety benefits.” Because of the FAA’s work, Quinn explained, other authorities are examining the potential for such predictive technology based on mass data reporting.

At the European Commission’s Aerodays 2015 conference in London in October, the European Aviation Safety Agency talked about its big data safety project that will spend about 31 million euros from 2015 to 2017. The project will seek to demonstrate an ability to predict an unsafe situation.

The adoption of Safety Culture, and by default Just Culture, by ICAO will, however, present a challenge to some member states, and access to all the diverse data that could make a difference could be hindered. “We recognize there are sovereign legal systems that regard an accident as, in the first instance, something that is a potential criminal act,” said Australia’s Civil Aviation Safety Authority’s Associate Director of Aviation Safety, Jonathan Aleck.

Australia is not a country that begins with a criminal investigation, Aleck highlighted. Its airlines have adopted Just Cultures and in its latest annual review for 2015, Qantas said: “We are proud of our strong, ‘just culture’ of reporting and our dedication to learning from our experiences. And we strive to maintain an environment that encourages trust and confidence in our people to report hazards and incidents and suggest safety improvements.”

Airlines in some nations do face a criminal investigation team first, whether they have Just Cultures or not, according to Aleck. He said: “In some jurisdictions there is a strict program [of aviation regulation]. The idea of just culture doesn’t fit well with those kinds of regimes, where the first people on the scene are often criminal investigators.”

Before the likely cause of the 31 October crash of Metrojet flight 9268 had been identified as a bomb, the Russian authorities’ initial announcement was that their investigation would begin as a criminal one. The concern is that people who know what led to an accident will say nothing for fear of prosecution, when their information could help stop potentially fatal incidents from occurring again.

Shane is positive about changes to national legislation where Just Culture is not already codified following the expected ICAO March decision. He said: “The benefits can be demonstrated so powerfully that I expect [legislatures to adopt it in the next few years].”

The expansion of Just Culture has gained momentum. Those that employ it find that it delivers new insights into how things go wrong. But questions remain as to how far nations whose instinct is to investigate possible criminal action first can succeed in gaining all the possible benefits from the additional information that becomes available.

Michael Comber, IATA Director, Member & External Relations, ICAO, sums it up simply. “It’s a tremendous advantage to have people come forward and speak because it’s the best way to prevent as well as figure out how [accidents] happened.” And that points us in the direction of improving safety.

Explaining Just Culture
The definition of Just Culture is an open way of working in which employees are not punished for decisions taken in good faith and commensurate with their experience and training. Employees can report mistakes, by themselves or others, knowing that the information will feed into the safety management system.

However, gross negligence, willful violations and destructive acts are not tolerated.

Just Culture has been required within the EU since November under regulation 376/2014, which also renews earlier mandatory reporting law. For Just Culture, the EU requires that organizations have rules for confidentiality, protection for the reporting staff member and for persons mentioned in the report, and protection from action by an employer.

Prior to the European law coming into effect, a European declaration in favor of Just Culture was published in October. The declaration is supported by Airports Council International, the European Regions Airline Association, the European Cockpit Association, Aircraft Engineers International, IATA, and other aviation organizations.

As well as European efforts to implement Just Culture, this non-punitive reporting system was included in the Australian Civil Aviation Safety Authority’s new regulatory philosophy published in 2015. Prior to Australia’s CASA action, the United States, New Zealand and the UK had their own rules in place.


Systemic Analysis – The Key to Effective Risk Mitigation

By: John Westphal, Senior Advisor

The Achilles’ heel of the high-consequence industries I have worked in over the years is the overreaction to a single event. In other words, we allow one event to drive systemic changes throughout the organization when that one event may not tell us enough about the risk that existed within that particular socio-technical system, or when the learning system lacked the sophistication to offer a robust enough view of the risk within the identified system.

The question then becomes: which one is it, the sophistication of the learning system or the insignificance of the event? Generally, the failure resides in the learning system’s capability to fully explore the richness of the single event, and in the failure to combine that learning with other event investigations to develop an accurate view of the inherent and system risk existing within the operation.

The described failure within the learning system generally occurs for two reasons. First, our single-event investigations struggle to define the appropriate cause-and-effect relationships that existed within the event. Often we insert non-causal data or non-duties into the event, causing such significant noise that we are unable to articulate the actual risk. Second, because of this failure in our single-event investigations, we become limited in our ability to do precise common cause failure analysis when looking across multiple events. Simply put, it becomes a “garbage in/garbage out” learning system.

In addition to the failures stated above, our learning systems often fail to correctly identify causes related to a class of events. In other words, the causal factor is not relevant to what occurred yesterday but is relevant to what may happen tomorrow or six months from now. A great example of this played out in the movie “Flight,” starring Denzel Washington. In the movie, Washington plays an airline captain with a significant drug and alcohol problem. In an aircraft emergency, Washington’s character takes extraordinary actions, saving hundreds of lives on board, even though he was intoxicated while flying the aircraft. The question becomes: was his intoxication causal with regard to the loss of the aircraft? No. It was a mechanical failure that caused the loss of the aircraft. However, does a pilot who is willing to fly under the influence of alcohol and drugs represent a risk to the system? Yes!

At the end of the day, our learning systems must exhibit three layers of examination. First, we must be able to identify the relevant cause-and-effect relationships within a single event. Second, once those cause-and-effect relationships are identified, we conduct the proper systemic analysis to identify risk mitigation strategies; a general rule of thumb is that 70% to 80% of our interventions should come from systemic analysis. Third, our learning system must also assess causal factors related to the class of event, working to move from a reactive realm to a proactive and eventually a predictive learning system.
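The first two layers described above can be sketched as a minimal pipeline. The event records and causal-factor labels here are assumptions for illustration, not drawn from any real investigation system: each single-event investigation captures its cause-and-effect relationships, and systemic analysis then counts which causal factors recur across events.

```python
from collections import Counter

# Illustrative single-event investigations, each carrying the causal factors
# identified for that event. Labels are hypothetical, not from a real system.
events = [
    {"id": 1, "causes": ["ambiguous procedure", "fatigue"]},
    {"id": 2, "causes": ["ambiguous procedure", "handoff gap"]},
    {"id": 3, "causes": ["fatigue", "ambiguous procedure"]},
]

# Layer 1: each record holds the cause-and-effect relationships found in a
# single event. Layer 2: systemic analysis counts how often each causal
# factor recurs across events (a simple common cause failure analysis).
common_causes = Counter(cause for event in events for cause in event["causes"])

# Factors appearing in more than one event are candidates for systemic
# intervention rather than event-by-event fixes.
systemic = [cause for cause, count in common_causes.items() if count > 1]

print(systemic)  # ['ambiguous procedure', 'fatigue']
```

The garbage in/garbage out problem is visible here too: if the single-event records contain non-causal noise, the cross-event counts become meaningless, which is why the quality of the first layer constrains everything built on top of it.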

Bringing Just Culture to the Streets


Making judgments without an understanding of the root cause(s) of a situation stifles both growth and a learning culture. It is absolutely necessary for us to remain impartial in our judgments until we can adequately discern the root cause(s) of the event.

Recently, Jeffrey Brown, a pastor who played a major role in the Boston Miracle, presented a TED Talk testifying to the power of this.

In the late 1980s, violence on the streets of Boston was increasing at alarming rates, and by 1990 the city’s homicides reached a peak of 152 for the year. But by 1999, that number had dropped to 31, thanks to key leaders within the Boston community. The Boston Miracle, simply stated, was this unprecedented 79 percent drop in the city’s homicides over the span of 10 years, from 1990 to 1999.

In his TED Talk, Brown shared that after a multitude of tragic events, he realized that it was not enough for him to build programs for at-risk youth. He began to seek out the youth actively involved in violence. He soon found himself walking the streets of Boston during the hours of the night, and by 1992 he and other area pastors had formed the Ten Point Coalition to combat youth violence in the streets of Boston.

Over time, these pastors developed relationships on the streets of Boston during the night hours. They discovered that the individuals whom many dismissed as cold and heartless were the exact opposite of their labels, and were simply trying to “make it on the streets.” By not rushing to judgment, the pastors were able to engage with these youths and partner with them to change the culture on the streets.

But this journey took time. It was only when these youths viewed the Ten Point Coalition and law enforcement as legitimate, fair, and just that the culture on the streets began to change. This meant the Ten Point Coalition and law enforcement had to consistently take the time to discern what justice meant for each person involved, determining who needed to be helped, who needed to be coached, and who needed to be punished. In turn, the Boston area pastors were able to help the Boston police focus on the truly reckless and intentionally harmful behaviors.

This began the transformation of the street culture and cultivated an atmosphere ripe for justice. With cooperation at all levels, the Boston Miracle occurred, becoming a powerful testimony to the fruit of not rushing to judgment over a situation. Even now, others are inspired by the result of Boston’s street transformation in the 1990s. In fact, a group of Baltimore pastors decided to devote the summer of 2015 to walking the streets of Baltimore at night in hopes of a similar cultural transformation.

Monica Lewinsky addresses the culture of shame

Culture is shaped by our behaviors, and both repetitive human errors and at-risk behaviors can be detrimental to the direction of an organization. For leaders, this is critical to note.

Why? It is simple. Leaders have the authority to shape the system and culture, ultimately determining the direction of an organization. What is allowed and voiced within the workplace either gives room for learning and growth, or squashes learning and growth.

In March 2015, Monica Lewinsky presented the TED Talk “The Price of Shame,” which focused on the effects of cyberbullying. Out of respect for Lewinsky, TED aimed to provide a safe place for her, because it was among her first public appearances in 10 years. However, comments deriding Lewinsky began almost immediately upon the posting of the talk (before the public would have had time to watch the full 20-minute TED Talk).

The very thing that Lewinsky was speaking up about was happening.

With that, three TED employees immediately took control of the situation by aggressively monitoring the comments being made: they purged the negative comments and replied to the positive comments, bringing the good to the top of the feed. After much deliberate work, the TED employees saw a shift within the public forum—the voices that uplifted, empowered, and encouraged Lewinsky were prevailing, changing the forum’s content and culture altogether.

The public began to see what was clearly accepted and what was not acceptable for the forum.

During Lewinsky’s message, she encouraged the listeners to become “upstanders,” defending those who are victims in the world’s steep culture of shame. Interestingly, the very call-to-action given by Lewinsky during her message was manifesting within the forum—people were becoming “upstanders” for Lewinsky in the midst of a culture of shame.

One commenter wrote: “I am so inspired by her wisdom and courage. I cannot imagine the depths of despair she went through and wow look at the incredible message she is bringing to us now because she survived and is now thriving.”

Clearly, through this case, we see the way the TED employees shaped—and ultimately shifted—the culture of the forum, helping to redefine its norms. The TED employees amplified the voices that silenced the shame, and silenced the voices that encouraged it.

For leaders within the workplace, it is important to understand that by empowering individuals who give voice to desired outcomes, we shape the culture of the workplace in a positive manner, creating an open atmosphere that encourages the desired outcomes and discourages the undesired ones.

The Lewinsky case makes evident the power of unity to shift a culture. Within the workplace, voices that uplift, encourage, and empower each other to learn and grow can eventually shape the culture of the organization as a whole.

The Just Culture Organizational Benchmark™ Survey

The Organizational Benchmark™ Survey is designed to measure critical behavioral markers that show an organization’s growth in culture around a particular organizational value, such as safety, privacy, compassion, or cost control. The markers are the same for each value because the basic elements of a learning and just culture are common across values.

The markers follow twelve areas of focus:

  1. Organizational Values
  2. System Design
  3. Management/Subordinate Coaching
  4. Peer/Peer Coaching
  5. Outcomes
  6. Open Reporting
  7. Search for Causes
  8. Internal Transparency
  9. Response to Human Error
  10. Response to Reckless Behavior
  11. Severity Bias
  12. Equity

An explanation of the 12 benchmarks:

1. Organizational Values

In this benchmark area, we ask employees if they believe their managers’ behaviors demonstrate that the particular value is supported by the organization. This provides a high-level view of how employees interpret their managers’ behaviors attached to a particular value.

2. System Design

In this benchmark area, we ask employees if they see systems being changed in response to adverse events and hazards identified by the employee group. This focus on system design is a key operational tool.

3. Management/Subordinate Coaching

In this benchmark area, we ask employees if they see their managers coaching staff members who make risky behavioral choices tied to the value being analyzed. Knowing that employees will drift into at-risk behaviors, this marker tells us whether managers are coaching employees toward better behavioral choices.

4. Peer/Peer Coaching

In this benchmark area, we ask if employees are willing to coach each other. This marker goes beyond merely offering help to another employee. We ask if employees are willing to challenge the behavioral choice of a peer that they see making risky choices.

5. Outcomes

In this benchmark area, we ask employees if they see outcomes tied to a particular value heading in the right direction (increasing or decreasing). This will assess employee perceptions of whether they believe organizational outcomes are improving. This is especially important where adverse events are hard to track in a quantitative manner (e.g., compassion).

6. Open Reporting

In this benchmark area, we ask employees if they are willing to report hazards or near misses that might detrimentally impact a particular organizational value. As opposed to reporting of adverse events, this behavioral marker looks at the near miss or hazard as the precursor to harm. Open reporting is essential to create a learning culture.

7. Search for Causes

In this benchmark area, we ask employees if they see managers investigating system precursors to potential harm. We focus on near misses that, if investigated and understood, would produce critical system learning.

8. Internal Transparency

In this benchmark area, we ask employees if they observe open dialogue concerning adverse events and lessons learned as related to the value under analysis.

9. Response to Human Error

In this benchmark area, we ask employees whether they see colleagues being disciplined for inadvertent human errors. This marker ties directly to the Just Culture model’s proper response to human error.

10. Response to Reckless Behavior

In this benchmark area, we ask employees if disciplinary action is taken when an employee willfully chooses to recklessly endanger the value under analysis. This also ties directly to the Just Culture model in the response to reckless behavior.

11. Severity Bias

In this benchmark area, we ask employees if they believe that the severity of an event’s outcome plays a significant role in whether the event will lead to positive change in systems or processes.

12. Equity

In this benchmark area, we ask employees if they believe that they are treated fairly across employee groups. Equity, the belief in the system being fairly applied across employees, is central to the notion of a Just Culture.
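As a rough illustration of how responses across the twelve benchmark areas might be aggregated, here is a minimal Python sketch. The 5-point Likert scale, the function name, and the sample data are illustrative assumptions, not Outcome Engenuity's actual survey instrument or scoring method.

```python
from statistics import mean

# The twelve benchmark areas measured by the survey.
BENCHMARKS = [
    "Organizational Values", "System Design", "Management/Subordinate Coaching",
    "Peer/Peer Coaching", "Outcomes", "Open Reporting", "Search for Causes",
    "Internal Transparency", "Response to Human Error",
    "Response to Reckless Behavior", "Severity Bias", "Equity",
]

def benchmark_scores(responses):
    """Average the 1-5 Likert answers collected for each benchmark area.

    `responses` maps a benchmark name to a list of individual answers.
    Returns a dict of benchmark -> mean score, skipping areas with no data.
    """
    return {area: round(mean(vals), 2)
            for area, vals in responses.items() if vals}

# Example: two areas scored by a handful of employees.
sample = {"Open Reporting": [4, 5, 3, 4], "Equity": [2, 3, 2, 3]}
print(benchmark_scores(sample))  # {'Open Reporting': 4.0, 'Equity': 2.5}
```

Tracking these per-area means across repeated survey rounds is what lets an organization see whether a given marker (say, Open Reporting) is actually improving over time.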

Why the ‘5 Whys’ Are Not Enough for a Good Investigation


By: John Westphal


Event investigation is a tool within the reactive learning system that we use to extract learning from an event. One of the practices I have employed as a Six Sigma Black Belt and human factors investigator is the '5 Whys' technique. It is used in the Analyze phase of the Six Sigma DMAIC (Define, Measure, Analyze, Improve, and Control) methodology, and it seeks to identify the root cause of an event.

That said, as organizational leaders, I believe we have become overly captivated with the identification of the root cause. Many organizations have fallen into the trap of believing if they find the root cause, they have in essence found the piece to address for further mitigation of the event risk. This, in my experience, leads us down a path of fixing one event at a time rather than addressing common cause failures, which are often further up the causal chain and allow us the opportunity to address risk in a more holistic fashion.

The '5 Whys' is a simple methodology that gives us a basic understanding of the initiating event (the root cause). But as we seek to extract all the learning from the event for more effective risk mitigation, we must employ additional methodologies (rules of causation) that help us understand cause-and-effect relationships, avoid negative descriptors that are subjective in nature, identify and explain the human errors and at-risk behaviors within the event, and, lastly, admit only causal factors that had a preexisting duty to act.

It is from this more sophisticated event analysis that we can filter the noise around the event and better understand the role of human error, at-risk behavior, mechanical failure, and environmental or cultural conditions that increased the likelihood of the event. Armed with a more sophisticated approach to reactive learning, we can then classify the failure: was this a design failure, a component failure, or a unique failure? Which of these we find dictates the response to the event, safeguarding the organization from overreacting to single events.

Now that we have the failure classified with a good understanding of the direct and probabilistic causal links within the event, we are in a better position to conduct the systemic analysis across multiple events, searching for the common cause failures.

It is at this point we have now converted our reactive learning system to a proactive learning system, breaking causal chains across multiple events, decreasing the risk throughout our operational environment.

We can see that, although the '5 Whys' is a simple tool that gives us a very basic understanding of an event, it is simply not enough for a good investigation, one that makes it possible to convert the learning system from reactive to proactive, and eventually predictive, in nature.
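The mechanics of walking a '5 Whys' chain and then classifying the failure can be sketched in a few lines. This is an illustrative toy only: the causal-chain data, the keyword-based classifier, and the category names are assumptions for demonstration, not a prescribed investigation taxonomy.

```python
# Each event's answer to "why did this happen?" points at its immediate cause.
# The chain below is a hypothetical healthcare example.
causal_chain = {
    "patient received wrong dose": "pump was misprogrammed",
    "pump was misprogrammed": "two drug names looked alike on screen",
    "two drug names looked alike on screen": "display truncates names at 10 chars",
    "display truncates names at 10 chars": "UI spec never reviewed by clinicians",
}

def five_whys(event, chain, depth=5):
    """Follow the chain of 'why?' answers, up to `depth` links."""
    path = [event]
    while len(path) <= depth and path[-1] in chain:
        path.append(chain[path[-1]])
    return path

def classify(cause, seen_before):
    """Crude illustrative classification: design, component, or unique failure."""
    if "spec" in cause or "UI" in cause:
        return "design failure"
    return "component failure" if seen_before else "unique failure"

path = five_whys("patient received wrong dose", causal_chain)
print(" -> ".join(path))
print(classify(path[-1], seen_before=False))  # design failure
```

Note that stopping at the single "root cause" at the end of the chain is exactly the trap the article describes; the value comes from comparing causes like these across many events to find the common-cause failures further up the chain.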

For more on event investigation, see our Live Courses information page or our Online Course information page.

How ‘Zero’ Language Can Hurt More Than It Helps


From time to time in the Just Culture community we encounter various uses of the term ‘Zero.’

They usually fall into two categories. The first concerns how many of a particular event an organization seeks or expects to have, as in ‘The goal is zero.’ The second, which can appear in conjunction with the first, is a notion of intolerance for any violation of a specific rule or policy, as in a ‘zero tolerance’ rule. To be clear: in Just Culture we generally try to avoid ‘zero language’ in organizational initiatives and policies. However, since many Just Culture organizations operate in a world of inherited language sets and competing ideologies, we felt it appropriate to address some of the common pitfalls of these ideas.

In thinking about ‘zero’ as a goal, the major consideration is that while zero can be an aspirational goal, it cannot be an expectation or standard, because ALL systems will fail at some rate. When we are intellectually honest, we must acknowledge that, given enough time, 'never ever' is a statistical impossibility. Even setting the inescapably fallible human component aside, metal will still rust, engines will still seize, and circuitry will corrode over time. Whether failure results from mechanical limitations or from the behavioral choices of the human components, all systems possess a statistical rate of failure. Really great systems, obviously, fail at very low rates. But perfection, unfortunately, is just not in the cards. That’s the bad news.

The good news, however, is that we can get better! We can improve our rates by designing systems, with built-in recoveries, redundancies, and barriers, that are error tolerant and robust enough to get us pretty close to zero in some cases. Take, for example, the commercial aviation and nuclear industries. They are widely regarded as among the highest in reliability, and yet the systems they have designed do not produce adverse outcomes at a rate of zero. Commercial aviation has designed a system to a standard of one catastrophic event in a billion, and in large part achieved that rate. In the world of system design that figure is remarkable when you consider that Six Sigma’s whole premise is to get to 3.4 defects per million opportunities.

Even though these catastrophic events are absolute tragedies when they occur, they make headlines in part because they are so rare. And though it will never be said this way: as a society, and as an industry, we generally accept that rate. We accept it because we value our transportation options and we acknowledge the inherent risks of doing business. We’ve looked at the one-in-a-billion rate (which averages to approximately one plane crash a year), we’ve weighed the alternatives, and ultimately we’ve collectively said that we can live with it. (What’s really astounding is that, given our roadway systems as they are currently designed, we also choose to accept a rate of around 30,000 automobile fatalities annually in the US.)
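The arithmetic behind these rates is easy to check. A quick sketch, using the one-in-a-billion aviation standard from the text and Six Sigma's commonly cited benchmark of 3.4 defects per million opportunities:

```python
# Comparing the failure rates discussed above.
aviation_rate = 1e-9       # one catastrophic event per billion opportunities
six_sigma_rate = 3.4e-6    # Six Sigma benchmark: 3.4 defects per million

# Even a "great" Six Sigma process fails thousands of times more often
# than the aviation standard:
ratio = six_sigma_rate / aviation_rate
print(f"{ratio:,.0f}x")  # 3,400x

# If roughly one catastrophic event per year is observed at a
# one-in-a-billion rate, that implies on the order of a billion
# exposures (opportunities for failure) per year:
implied_exposures = 1 / aviation_rate
print(f"{implied_exposures:,.0f}")  # 1,000,000,000
```

This is the sense in which "really great systems fail at very low rates" but never at a rate of zero: the rate is small, yet multiplied by enough exposures it still produces occasional events.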


But perhaps you don’t accept that rate. Perhaps you want a better rate of one in 10 billion or one in 20 billion (continuing the aviation example). OK, that’s fair, but realize that doing so comes at a cost. In a world of limited resources, improving our systems often means making tradeoffs with other values. Anyone who has ever been stuck at an airport terminal waiting on “maintenance” knows that when we stress safety, it impacts other values like timeliness, customer service, and, from the airline’s perspective, profit margins. So when designing systems as an organization, you have to find a rate you can live with, realize that it can’t be zero, and then figure out what you are willing to trade to get there.

The second use of zero language we see is the ‘zero tolerance’ policies or rules that organizations use when trying to emphasize a particular safety risk. Here is where I will issue some major cautions about how these rules relate to Just Culture. Oftentimes, in efforts to alter perceptions of risk around a particular class of event, we opt to draw lines in the sand and implement artificial danger to “help” our employees perceive the risk. Forgoing a time investment in coaching around the actual harm, our desperation for immediate improvements leads us to give the issue a very serious tone and attach punitive threats. Organizations can forget that in a Just Culture, altering perceptions of risk takes time and deliberate effort. It is true that in Just Culture we will, in fact, leverage artificial danger (mainly a progressive disciplinary path) from time to time, but we go awry when we implement rules that no longer require us to evaluate the employee’s behavioral choices.

Our challenge to organizations is to self-evaluate ‘zero rules’ by asking the following three questions:

  1. Could an employee violate this rule by human error?
  2. Could an employee violate this rule while genuinely not perceiving the risk (despite previous training)?
  3. Are we now saying something different about the way we intend to deal with these behaviors?

The reality is that in Just Culture, we can really only have a zero tolerance policy toward reckless and more culpable behaviors. Any other punitive actions, save for repetitive errors or at-risk behaviors, are simply not aligned with Just Culture. 'Zero language' is not inherently wrong, but it can be a slippery slope. Even when the term ‘zero tolerance’ is not associated with punitive action, it can at the least be confusing to staff: if one rule is a ‘zero’ rule and not to be broken, what does that imply about all the other rules?

At the end of the day, we must proactively protect our Just Culture commitment from the line of thinking that, just because we’ve made expectations clear in the past or trained around specific procedures and risks, our staff no longer can or will make human errors and at-risk choices. We must constantly defend against our ever-present severity bias and refuse to undermine our open learning and reporting culture under the guise of zeal. We strongly urge the Just Culture community to be cautious with zero language in our expectations and policies, and to realize that even Just Culture organizations are susceptible to drift over time.

Download “Whack-a-Mole” Digital Copy

David Marx says, "Give it away..." Whack-a-Mole: The Price We Pay For Expecting Perfection, by David Marx, is now offered as a digital copy.



Whack-a-Mole: The Price We Pay For Expecting Perfection explores the role of human error in society, from aviation and healthcare, to driving and parenting—and where accountability rests for those errors, especially when they take the life of another. David Marx argues that regulatory and human resource prohibitions, along with the criminal prosecution of human error, have been counter-productive to helping society deal with the risks and consequences of human fallibility. Marx advocates a different approach to addressing our shared fallibility.

Scroll down to get your copy (digital download) of Whack-a-Mole: The Price We Pay For Expecting Perfection by David Marx, JD, CEO of Outcome Engenuity, father of Just Culture, and engineer of the Just Culture Algorithm™.

-Learn More About Just Culture-  -Just Culture Training Events-  -Event Investigation/Root Cause Analysis-

What does our model of accountability look like?

A Just Culture, an Accountable Culture, is a result as much as it is the set of management skills and tools that make it possible. The Outcome Engenuity model has consistently transformed organizational environments from the inside out. Everyone gets on the same page. Everyone is made aware of the company’s values and how they are expected to make choices that protect them. Everyone is an active part of the plan and the solutions. Everyone is trained and held to the same standard of accountability.

Core Values Concept- Just Culture

Perhaps the best way to convey this is by explaining what an organization looks like that has integrated our accountability model into their way of doing business. This organization, like many others, has a mission. Its pursuits and reason for being are grounded in certain values it has determined are most important to it. Now imagine every employee protects those values by the choices they make and how they accomplish their duties.




Of course, what life tells us is that people not only can, but will, make mistakes. Sometimes these mistakes hurt others or cause harm to the company; aware of that, this company has designed a system that can catch those errors before they become critical. If they do become critical, they have designed recoveries to stop or reduce the bad outcome.



This organization uses the Outcome Engenuity Workplace Accountability model, historically known as Just Culture, which helps it constantly improve its systems at the core: employees feel safe to raise their hands when they see a mistake or have made an error or bad choice, so the system can be updated and improved to catch those events. This is a company that learns from its mistakes and near misses. It's efficient because its expectations are clear. It's stable because it's always learning. Its profitability is strengthened because it is more efficient and stable and its risks are proactively managed. Employee morale is high because every person is treated fairly and empowered to do their best within their position.

The foundation of our proven model is in managing three types of behavior, against three types of duties, within a framework of five skills upheld by the organization’s leadership and staff. Click each item below to find out more:


Our model of workplace accountability identifies 3 types of behavioral choices that every person makes and needs to manage.

1. Human Error
2. At Risk Behavior
3. Reckless Behavior



It holds its people accountable to one or more of those behavioral choices when they carry out three types of duties or expectations. These three duties are defined as:

1. The duty to avoid causing unjustifiable risk or harm.
2. The duty to follow a procedural rule.
3. The duty to produce an outcome.


These behaviors and duties are at the core of the 5 skills a company applies to make this transformation possible. 

1. Values and Expectations

2. System Design

3. Behavioral Choices

4. Learning Systems

5. Accountability and Justice



The Just Culture Algorithm™ is our primary tool for understanding and categorizing the choices of those in our organization. With it, we can evaluate an event against a set of duties inherent to the system to determine which of the three behaviors was most likely in play. This gives us the ability to address the event and the people involved in a constructive way rather than simply reacting to the outcome. It can also show us how multiple behaviors can be associated with a single event, so that we can evaluate each behavior separately and more effectively determine the root cause.
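The core principle, that the response follows the behavior rather than the outcome, can be sketched in a few lines. To be clear, this is not the Just Culture Algorithm™ itself, which is a detailed decision tool; it is only an illustration of the "console the error, coach the at-risk, punish the reckless" mapping described on this site, with assumed names throughout.

```python
# Outcome-independent responses to the three behavioral choices.
RESPONSES = {
    "human_error": "console",  # inadvertent slip or lapse
    "at_risk": "coach",        # risk not recognized, or believed justified
    "reckless": "punish",      # conscious disregard of a substantial risk
}

def respond(behavior: str, outcome_severity: str = "any") -> str:
    """Choose the response from the behavior alone.

    `outcome_severity` is accepted but deliberately ignored: the same
    behavior gets the same response whether the outcome was harmless
    or catastrophic (this is the guard against severity bias).
    """
    if behavior not in RESPONSES:
        raise ValueError(f"unknown behavior: {behavior!r}")
    return RESPONSES[behavior]

print(respond("human_error", outcome_severity="fatal"))  # console
print(respond("at_risk"))                                # coach
```

The unused `outcome_severity` parameter is the point of the sketch: a manager tempted to escalate because the outcome was severe is exhibiting exactly the severity bias the model warns against.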


Company-wide performance and organizational improvement, where equity, fairness, and accountability live at the forefront: that is a Just Culture. For the sake of our staff, those we serve, and ourselves, we all need it.



You can watch the 43-minute webinar video recording provided below, presented by one of our advisors, Ellen McDermott. She does a very good job of further introducing Just Culture in a simple and comprehensive way, while still diving into the questions you may have before moving further.


Watch the 20-minute What is Just Culture video by David Marx. Watch the Top 10 Questions FAQ videos answered by David Marx, JD and CEO of Outcome Engenuity. Some of the questions answered were submitted by others going through the various stages of rolling out a Just Culture in their organizations. Because every organization and every role is unique, the importance of certain Just Culture concepts will vary; keep that in mind as potential insight when watching those ten videos.


Contact one of our wonderful Client Relations Specialists. You can call or email us with any questions at all. We always enjoy sharing the good news of Just Culture, learning about the needs of others, and connecting you with solutions custom fit to your organizational development.


Check out the Just Culture Overview and share it with the other need-to-know leaders of your organization. We also have industry-specific versions here: Healthcare - EMS - Aviation.

Also consider reviewing "The Final Check" or visit our website. There you will see how we chronicled a hospital's use of Outcome Engenuity's Just Culture principles and tools to solve a recurring, serious risk to its patients.


Review some of our training topics and supportive tools. With great care and consideration, we have meticulously developed world-class live certification courses, online training classes, printed and electronic interactive versions of the exclusive Outcome Engenuity Just Culture Algorithm™, an Organizational Benchmark Survey, an analysis tool, and more. Looking through the product and training descriptions may help you define the solutions of most interest to you right now.


Roll the dice and continue business as usual. Well, we would call that at-risk behavior, and we don't recommend it. Doing nothing will potentially keep you from harvesting the organizational improvements that a Just Culture is proven to provide. The truth is that we are all fallible humans who will make mistakes. We will drift into behaviors that don't support our best values. We will hide important information about an error if we continue to be punished for not being perfect. We will continue to lose precious profit, and our resources will be tapped by bad outcomes and lack of performance. We hope you will decide to join the community of worldwide organizations that are living out Just Culture every day and supporting each other in ways that make the world a better place to live. A more just and fair culture for us all.


Frequently Asked Questions

Where do I start with Just Culture?

We suggest starting at the top; you’ll need the buy-in of the executive team in order to effectively make the changes needed for implementation. Explain to them the core concepts of Just Culture, the different, better way to do business. Show them the outcome bias and how destructive it can be. Then you can work your way down to local managers to show how a Just Culture works.

What does JC implementation look like?

Implementation usually begins with learning how the three behaviors work and how the quality of each choice can make a difference. Getting past the outcome bias is also a big early step. Then you’ll revise organizational policies and procedures, beginning the system design phase. Training your managers and staff on the concepts and on peer-to-peer coaching will begin to build a learning culture in your organization. And once you get buy-in across the organization, your people will begin to make the right choices and your processes will steadily improve.

How do we maintain momentum once we start the process?

Getting the Just Culture ball rolling will take leaders role modeling and interacting with subordinates and peers. You’ll also want to give regular reminders and use any and all examples that come up to show how it’s working. Periodic training and practicing with fictitious scenarios can also help an organization pursue a learning culture.

How important is getting leadership aligned?

Without the support of upper leadership, a Just Culture implementation will always be treading on thin ice, afraid that the effort could be suspended or cut entirely. In order to commit the organization to the change, top-level buy-in is essential. This is not to say that they must drive the implementation, though; leadership can come from any level.

Is on-site training available?

Absolutely; in fact, most of the training we do is on-site. Our advisors travel quite a bit to visit our clients to do training, and some clients have Certified Champions who do the same. We also have several options for online training available.

How does Just Cause work with the Just Culture Algorithm?

Just Culture and Just Cause fit together beautifully. The focus of Just Cause is the procedural aspect of justice; it spells out the rights and requirements of the law. Just Culture aligns with the substantive aspect; it spells out how to define what crime is and how to handle justice on both the personal and organizational levels. So the older Just Cause concepts and the newer Just Culture concepts complement each other quite well.

How many people from my organization would you recommend getting certified?

The answer to this question, which we get a lot, is very dependent on your organization. We recommend that you have representation from your operational leaders, your HR team, and your quality and safety personnel. How many employees and managers you have and how much culture change will be required will dictate how many Champions it will take to best serve your needs.

What are some methods for instilling daily use of the Algorithm by managers?

Console the error, coach the at-risk, punish the reckless irrespective of the outcome; when your managers are comfortable enough with this that it becomes the reaction to an event, you’re on the right track. Keep them practicing with fictional scenarios and encourage them to look at their normal everyday events through this lens. Once it becomes habit for them, you’re in a great position.

What are some metrics that will show that we’re making progress?

The obvious general answer here is better outcomes; you’ll start to see an improvement in how your processes work. To get there, though, you’ll see more depth in your investigative processes and your disciplinary records. You’ll see improvements in the design of your systems, and you’ll have more data to measure. And our team can help you establish more specific metrics to measure based on your organization and your industry.

What are some methods of tracking non-punitive coaching/consoling sessions in order to record repetitive behaviors?

What you choose to track will be based on your organization’s mission and values; you’ll set up team and individual guidelines that you’ll want to watch. In the coming months we’ll be reaching out more about a new tool for this very purpose. Keeping track of both sanctions and accolades will be important for use in performance reviews as well. That data will allow you to see trends in behavioral choices, and that will cue you to address them.


Objective Standard: Guard Against Personal Bias

When using the objective standard of the Just Culture Algorithm, a rule of thumb is to ask what a similar person, similarly situated, would do. For example, if you work in the airline industry and are trying to figure out how to judge a pilot's behavioral choice, of course you're going to ask that pilot what he was thinking and why he did what he did. This is the subjective standard. But then you will also get the perspective of the culture in which the pilot is operating: whether other pilots within the company would have appreciated the risk or would have made the same choice.

But here is the caution. If you try to work out in your head what other pilots would do, you risk letting your own personal bias carry over into your judgment of the situation. Every person carries at least a little bias into any situation they encounter; it is a natural part of our very human nature. Whether our opinions are shaped by past experiences, by personal observations, or by what we were taught, we are inherently prone to bias. Maybe, based on your experience, you tend to view most pilots as careful, cautious, and risk-averse. Or, at the opposite end of the spectrum, perhaps a personal bias causes you to view pilots as mostly top guns who crave an adrenaline rush.

This tendency toward bias can be seen in our own legal system. As courts seek to determine what a "reasonable" person would have done (the objective standard) in the various situations brought before them, the court, composed of fallible human beings and guided by a judge who is also fallible, has been seen to carry personal bias into its judgments. For one example, young boys at play have often been deemed less perceptive of risk than young girls at play, even when the children are the same age or placed in the same situation.

[C]ourts routinely exonerate playing boys for their dangerous behavior.  Their language is telling—they speak of boys yielding to the overwhelming temptation to play with dangerous things …The contrast with playing girls could hardly be more dramatic…in cases involving girls injured while at play, courts show little or no sympathy for the playing girl, routinely holding her to a higher standard than her male counterpart. . . .Because of the way that the reasonable person embodies intuitive and undifferentiated judgments about appropriateness, it should hardly be surprising that it often draws deeply on ideas about what is normal or natural.

Mayo Moran, Rethinking the Reasonable Person; An Egalitarian Reconstruction of the Objective Standard, 8-9 (Oxford University Press, 2003)

But as an imposer in the Just Culture model, there is one tactic you might use to guard against this personal bias: instead of relying on your personal ideas of what a pilot is likely to do, actually go and talk with other pilots. Create your own very real pool of similar persons and ask them how they would have handled the situation, or whether they have ever seen people do this before. Allow these individuals to form the basis of your objective standard, and you come one step closer to a more just judgment.

What is a strategy that you find helpful to guard against personal bias in your leadership decisions? 

From Outcome to Procedure: A Mitigation of Risk

Time and Attendance

Typically, time and attendance is described as an outcome-based duty: the employer tells the employee to show up at the start of their shift, and it is up to the employee to figure out how to get there on time. But that's not always the case, is it? There are times when even time and attendance may be further controlled by an employer, to where it begins to look more procedural. Think about asking an employee to be "on call." Being on call may be relatively simple, like requiring the employee to be on the premises within twenty minutes of being called. This is still an outcome-based duty. The employer doesn't tell the employee how to make sure they can be on-site within twenty minutes; there is no guidance on how far the employee may travel during this time frame, nor on what activities the employee may or may not engage in before being called in. It is up to the employee to figure out how to organize their life and personal schedule to meet this tasking.

But an employer may choose to put more restrictions on this. Someone's on-call tasking may have any number of restrictions and control measures placed upon it; it might even require someone to be on-site for the entire duration they are on call. So why might an employer shift the same tasking, time and attendance, from an outcome-based duty to a set of procedural rules? To understand this, you first have to remember the fundamental notion that human beings will make mistakes; it is not a question of "if" but "when." So when an employer builds a task around an outcome-based duty, where the employee is tasked with figuring out how to make things happen, it is not a question of whether the employee will fail to achieve the desired outcome, but when.

So when it comes to time and attendance, leaving it up to the employee is often just fine. Most of the time, if an employee is late for a given shift, the company will continue running with minimal hiccups. But there may be times when the possibility of someone coming in late is unacceptable; maybe the employer feels very strongly that one individual must be present, and the possibility that the employee shows up late is unacceptably risky to the organization. If the employer sees this unacceptable risk, ideally they will take some of the burden of responsibility onto themselves to mitigate it, for example by requiring the employee to stay on-site for the duration of their on-call shift, placing mileage restrictions on the individual, or imposing any number of other procedural rules that, packaged together, create the more restrictive on-call requirements.

Now, of course, there is the potential that the employee will struggle to see why the employer has elected to control the process, making it more restrictive and procedural rather than leaving it up to the employee to manage their own system. If this happens, the employer has an opportunity to talk with the employee and make sure the employee understands and appreciates the risk the employer is trying to mitigate.

What is Design for Happiness?

This is probably the most difficult of the three (Life, Liberty, and the Pursuit of Happiness). For the most part, as a company, we want to protect liberty and life so people can go out and pursue their happiness. There are social systems, for example the neighbor playing loud rock and roll music and the Homeowners Association saying, "Look, we have restrictions on what you can do because we want to sleep at night." There are things we do in the social construct to protect our collective happiness, and the social controls we use, the design elements we use, are the same ones we use for liberty and life. But for the most part, it boils down to three things: life, liberty, and the pursuit of happiness. Thomas Jefferson got it right; you have to be deliberate about designing around all three, and there are design trade-offs where you ask how much liberty we are going to give up for life, and how much liberty we are going to give up for happiness. But the Design for Happiness is really how, as individuals, we say, "I want to create a happy life for myself." Maybe that design is getting married, having kids, buying a house; you can pick what it is. We want to talk to the world about social controls and their importance in protecting each individual human being's ability to pursue their happiness in the way they see fit.

David Marx, CEO at Outcome Engenuity, explains what it means to 'Design for Happiness' in the video below.