Healthcare Associations Can Offer Deep Discounts To Their Members – Here’s How

Leaders around the world are supporting Just Culture initiatives

As a Just Culture Connector, you can support your members in their initiatives and lead them on their journey through any of these three options:

  • Host a course for your hospitals and get 5 free seats
  • Promote a course with an exclusive member discount code for registrants
  • Support an OE Just Culture statewide initiative and offer discounted Enterprise Licensing for all of your hospitals

How ‘Zero’ Language Can Hurt More Than It Helps

 

From time to time in the Just Culture community, we encounter various uses of the term ‘Zero.’

They usually fall into two categories. The first concerns how many of a particular event an organization seeks or expects to have, as in ‘The goal is zero.’ The second use we often see, which can appear in conjunction with the first, is a notion of intolerance for any violation of a specific rule or policy, as in a ‘Zero Tolerance’ rule. I will say up front that, in Just Culture, we generally try to avoid ‘zero language’ in organizational initiatives and policies. However, since many Just Culture organizations operate in a world of inherited language and competing ideologies, we felt it appropriate to address some of the common pitfalls of these ideas.

When thinking about ‘zero’ as a goal, the major consideration is that while zero can serve as an aspirational goal, it cannot be an expectation or a standard. The reason is that ALL systems will fail at some rate. When we are intellectually honest, we must acknowledge that, given enough time, 'never ever' is a statistical impossibility. Even when we set the inescapably fallible human component aside, metal will still rust, engines will still seize, and circuitry will corrode over time. Whether failure results from mechanical limitations or from the behavioral choices of the human components, all systems possess a statistical rate of failure. Really great systems, obviously, fail at very low rates. But perfection, unfortunately, is just not in the cards. That’s the bad news.

The good news, however, is that we can get better! We can improve our rates by designing systems, with built-in recoveries, redundancies, and barriers, that are error tolerant and robust enough to get us pretty close to zero in some cases. Take, for example, the commercial aviation or nuclear industries. They are widely regarded as among the highest in reliability, and yet the systems they have designed are not producing adverse outcomes at a rate of zero. In commercial aviation, the system has been designed to a standard of one catastrophic event in a billion, and that rate has in large part been achieved. In the world of system design that figure is remarkable when you consider that Six Sigma’s whole premise is to get to 3.4 defects per million opportunities.
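To make the scale of these rates concrete, here is a minimal back-of-the-envelope sketch in Python; the annual operation count is a hypothetical figure used only to show how a per-operation failure rate translates into expected events per year.

```python
# Back-of-the-envelope comparison: expected failures per year = failure rate x annual exposures.
# The operation count below is an illustrative assumption, not an industry statistic.

def expected_events_per_year(rate_per_operation: float, operations_per_year: float) -> float:
    """Expected number of failures in a year, assuming independent operations."""
    return rate_per_operation * operations_per_year

aviation_rate = 1e-9     # one catastrophic event per billion operations (the design standard cited above)
six_sigma_rate = 3.4e-6  # Six Sigma is commonly quoted as 3.4 defects per million opportunities

operations = 10_000_000  # hypothetical annual operation count, for illustration only

print(expected_events_per_year(aviation_rate, operations))   # 0.01 expected events per year
print(expected_events_per_year(six_sigma_rate, operations))  # 34.0 expected events per year
```

Even with identical exposure, the two standards differ by more than three orders of magnitude in expected events, and neither is zero.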

Even though these catastrophic events are absolute tragedies when they occur, they make headlines in part because they are so rare. And though it never will be said this way: as a society, and as an industry, we generally accept that rate. We accept it because we value our transportation options and we acknowledge the inherent risks of doing business. We’ve looked at the one-in-a-billion rate (which averages to approximately one plane crash a year), we’ve weighed the alternatives, and ultimately we’ve collectively said that we can live with that rate. (What’s really astounding is that with our roadway systems as they are currently designed, we also choose to accept a rate of around 30,000 automobile fatalities annually in the US.)

In order to improve our systems in a world of limited resources we often have to make tradeoffs with other values to do so.

But perhaps you don’t accept that rate. Perhaps you want a better rate of one in 10 billion or one in 20 billion (to continue the aviation example). OK, that’s fair, but realize that doing so comes at a cost. In order to improve our systems in a world of limited resources we often have to make tradeoffs with other values. Anyone who has ever been stuck at an airport terminal waiting on “maintenance” knows that when we stress safety, it impacts other values like timeliness, customer service, and, from the airline’s perspective, profit margins. So when designing systems as an organization, you have to find a rate you can live with, realize that it cannot be zero, and then figure out what you are willing to trade to get there.

The second use of zero language we see is the ‘Zero Tolerance’ policies or rules that organizations adopt when trying to emphasize a particular safety risk. Here is where I will issue some major cautions about how these rules relate to Just Culture. Oftentimes, in an effort to alter perceptions of risk around a particular class of event, we opt to draw lines in the sand and introduce artificial danger to “help” our employees perceive risk. Forgoing a time investment in coaching around the actual harm, our desperation for immediate improvement leads us to seek it by giving the issue a very serious tone and attaching punitive threats. Sometimes organizations forget that in a Just Culture, altering perceptions of risk takes time and deliberate effort. It is true that in Just Culture we will, in fact, leverage artificial danger (mainly a progressive disciplinary path) from time to time, but we go awry when we implement rules that no longer require us to evaluate the employee’s behavioral choices.

Our challenge to organizations is to self-evaluate ‘zero rules’ by asking the following three questions:

  1. Could an employee violate this rule by human error?
  2. Could an employee violate this rule while genuinely not perceiving the risk (despite previous training)?
  3. Are we now saying something different about the way we intend to deal with these behaviors?

The reality is that in Just Culture, we can really only have a zero tolerance policy toward reckless (and above) behaviors. Any other punitive actions, save for repetitive errors or at-risk behaviors, are simply not aligned with Just Culture. 'Zero language' is not inherently wrong, but it can be a slippery slope. Even when the term ‘Zero Tolerance’ is not associated with punitive action, it can at the least be confusing to staff: if one rule is a ‘zero’ rule and not to be broken, what does that imply about all the other rules?

At the end of the day, we must proactively protect our Just Culture commitment from the line of thinking that, just because we’ve made expectations clear in the past or trained around specific procedures and risks, our staff can no longer and will no longer make human errors and at-risk choices. We must constantly defend against our ever-present severity bias and refuse to undermine our open learning and reporting culture under the guise of zeal. We strongly urge the Just Culture Community to be cautious with zero language in our expectations and policies, and to realize that even Just Culture organizations are susceptible to drift over time.

Download “Whack-a-Mole” Digital Copy

David Marx says, "Give it away..." Whack-a-Mole: The Price We Pay For Expecting Perfection, by David Marx, is now available as a digital copy.


ABOUT THE BOOK

Whack-a-Mole: The Price We Pay For Expecting Perfection explores the role of human error in society, from aviation and healthcare, to driving and parenting—and where accountability rests for those errors, especially when they take the life of another. David Marx argues that regulatory and human resource prohibitions, along with the criminal prosecution of human error, have been counter-productive to helping society deal with the risks and consequences of human fallibility. Marx advocates a different approach to addressing our shared fallibility.

Scroll down to get your copy (digital download) of Whack-a-Mole: The Price We Pay For Expecting Perfection by David Marx, JD, CEO of Outcome Engenuity, father of Just Culture, and engineer of The Just Culture Algorithm™.


Evidence: The Ultimate Game Changer

Authored by Joan Ellis Beglinger, MSN, RN, MBA, FACHE, FAAN

My first serious exposure to applying David Marx’s Just Culture model (1) to a real-world event was unfortunately born of tragedy. As the Vice President for Patient Care / Chief Nursing Officer (CNO) of a large, tertiary medical center, I was called while on vacation, on a beautiful July day in 2006, to be informed that a young, laboring mother had died as a result of a medication error. The impact of this event was profound, lasting and life-changing for many people. When this event happened, our organization was in the early stages of integrating Just Culture and we had limited application experience. The utility of this approach in helping us to understand and learn from this tragic event was immediately evident. The logic of Just Culture made incredible sense: in order to prevent this tragedy from happening again, we needed to examine the system design of our hospital and learn from the choices of our employees within that system.

In the subsequent years, our organization took serious steps to integrate Just Culture. Every employee participated in educational sessions designed to provide the fundamentals of this new way of thinking about our work, and the Administrative Team taught these courses to underscore their importance. The Just Culture Algorithm™ was incorporated into our Root Cause Analysis (RCA) process, and we gained a new understanding of our duties and of the importance of understanding different behavioral choices in determining the appropriate organizational response when things go wrong. We came to truly believe that, despite our best efforts to be conscientious in the service of our patients, we would always make mistakes as an elemental function of our humanity. Further, we understood that to create an environment in which people would surface errors, admit mistakes and seek to continuously improve, it was vital that our approach be driven by principles of fairness and justice.

There are a relatively small number of things I have learned on my personal developmental journey, throughout nearly 30 years of administrative practice, that have impacted my thinking and my approach to the extent David Marx’s Just Culture has, and naturally I have begun considering how these principles can and should be applicable to management decisions.  The executive decision makers of the organization create the conditions, at the point of care, that enable or impede excellence in clinical practice.  It is the allocation of resources that serves as a foundation for all of the work that is done.  But despite the criticality of these decisions, we have not sufficiently addressed how to hold management accountable when they make risky choices.

...we must refine our decision-making process, especially now as “the way we’ve always done it” will no longer suffice in the face of evidence to the contrary

When we evaluate the behavioral choices of a clinical professional, we are essentially asking, "what would a competent and conscientious professional have done in this situation?"  We consider what the individual actually did and compare it to what other similar professionals, his peer group, would have done under similar circumstances.  The context for this evaluation includes the standards by which we practice.  These may include organizational policies, procedures and expectations, but also include appropriate clinical practice, which is driven by best available evidence.

Management practice is strikingly lacking in similar evidence to drive decision making.  We lag significantly behind our clinical colleagues in research to support our actions and managing "the way we've always done it" is common.  We make critical decisions based on our intuition or our experience (or both).  We tend to think in linear terms when our complex world demands a systems approach.  However, we must refine our decision-making process, especially now as “the way we’ve always done it” will no longer suffice in the face of evidence to the contrary.

The sweeping changes of the Affordable Care Act, specifically reimbursement pressure on hospitals, have the issue of executive decision making looming prominently on my mind.   In the absence of evidence to guide decision making, leaders of our industry have frequently engaged in the same detrimental pattern of response to declining reimbursement, despite the fact it has never produced sustainable results.  The cycle goes something like this: in the face of decreasing revenue, we perceive an urgent need to cut costs in order to maintain a margin.  We identify nursing as the largest cost.  We are certain the "new normal" will require us to learn to do more with less. We cut staffing at the point of care.  We congratulate ourselves for the significant expense reduction (on paper, in the short term).     The flurry of activity subsides and there is no systematic, longitudinal evaluation of the impact of the cuts. (By design or default?)  We have no substantive analysis of the impact on patient outcomes, patient satisfaction, nurse engagement, nurse turnover, physician engagement, as examples, though the anecdotes are endless. Over time, as a result of the unworkable environments created by this activity, resources are gradually added back.

In recent years, there has been significant research examining the relationship between professional nurse staffing and the professional practice environment and patient outcomes.  We have learned with certainty that key outcomes such as mortality, "failure to rescue", preventable complications and readmissions are directly linked to both professional nurse staffing and the professional practice environment.  We understand that a nurse can be stretched too thin  to be able to perceive subtle changes in patient condition or effectively coordinate post-discharge care.  We can now demonstrate, through substantial evidence, that organizations with the best staffing and positive practice environments produce the best results. (2-7).

We have crossed the critical threshold that requires a change in management practice. We must transition from individual preference to evidence-based decision making as it relates to nurse staffing, and it is time to apply the principles of Just Culture to executive decision making with the same rigor and enthusiasm that we apply them to clinical practice.

A useful analogy is the change in practice that occurs in the clinical setting with the emergence of evidence.  Consider a familiar example: care of the surgical patient.  In years past, the care of the surgical patient was largely driven by surgeon preference.  There were significant variations in practice and, in time, research demonstrated a corresponding variation in outcomes.   Over time, research demonstrated that there is a "best practice".  National standards evolved and the Surgical Care Improvement Project (SCIP) was born. (8)  Today, in surgical settings across the country, a standard approach to care of the surgical patient has replaced individual surgeon preference.  It is inconceivable that the hospital or medical staff leadership in any setting would tolerate a surgeon or nurse veering from the standards.  It is considered a matter of patient safety.

The evidence now available to us regarding professional nurse staffing requires us to abandon the arbitrary decision making of the past.  Determining appropriate staffing requires an analytical approach with a deep understanding of patient care, analysis of the population being served, evaluation of the capacity of the nursing organization, consideration of the geographic setting in which care is being provided, support systems available and meaningful voice from the direct care nurses.

David Marx (1) taught us that the most important duty we as human beings have is to avoid causing unjustifiable risk or harm towards one another.  When we continue to rely on cutting nurse staffing as a short-term financial solution, knowing this decision will significantly increase the risk of mortality, failure to rescue, and preventable complications, is that not a violation of our most important duty to avoid causing unjustifiable risk?  Is it not time to apply the same standard to our executive choices as to the behavioral choices of our clinical colleagues?   Zero tolerance for repeated risky decisions in accordance with the Just Culture model.  The evidence has crossed a critical threshold and evidence is the ultimate game changer.


 

Principal / Joan Ellis Beglinger, Designing Tomorrow - Madison, WI

Consultant / Tim Porter-O’Grady Associates, Inc. - Atlanta, GA

Contact Joan by email.

Notes

1. Marx D. Whack-a-Mole: The Price We Pay For Expecting Perfection. Plano, TX: By Your Side Studios; 2009.

2. Aiken LH, Cimiotti JP, Sloane DM, et al. Effects of nurse staffing and nurse education on patient deaths in hospitals with different nurse work environments. Med Care. 2011;49:1047-1053.

3. Aiken LH, Sloane DM, Cimiotti JP, et al. Implications of the California nurse staffing mandate for other states. Health Serv Res. 2010;45:904-921.

4. McHugh MD, Ma C. Hospital nursing and 30-day readmissions among Medicare patients with heart failure, acute myocardial infarction, and pneumonia. Med Care. 2013;51:52-59.

5. McHugh MD, Berez J, Small DS. Hospitals with higher nurse staffing had lower odds of readmissions penalties than hospitals with lower staffing. Health Aff. 2013;32:1740-1747.

6. Needleman J, Buerhaus PI, Stewart M, et al. Nurse staffing in hospitals: is there a business case for quality? Health Aff. 2006;25:204-211.

7. Needleman J, Buerhaus P, Pankratz VS, et al. Nurse staffing and inpatient hospital mortality. N Engl J Med. 2011;364:1037-1045.

8. The Surgical Care Improvement Project. The Joint Commission. Available at http://www.jointcommisision.org/surgical_care_improvement_project/. Accessed January 27, 2014.

 

 

Safe Choices Training For Staff Online


Once your entire managerial team is proficient in Just Culture principles and the use of the Algorithm™, it’s time to get your front-line employees on board. We recommend that you educate all front-line employees on their role in the organization’s learning culture, their ability to impact system design, and their ability to influence outcomes and organizational success through safe behavioral choices.

To better assist you, the one-hour online Safe Choices™ Training for staff is designed to provide your employees with a high-level overview of Just Culture concepts. The purpose of this course is to highlight the impact of an employee’s role in the Just Culture at your organization. Not only is this training about making choices that impact safety, it is also about aligning values, designing better systems, and how we communicate with each other.

The online training consists of:

  • A five-minute introduction video that outlines some Just Culture basics
  • A 23-minute movie that shows some everyday scenarios in which staff can observe and apply Just Culture principles
  • Six modules that reference back to the movie, each ending with a question or two that checks the employee’s learning progress

Upon completing the Safe Choices™ Training for staff, each employee will be on their way to making safer choices and having an impact on the Just Culture of your organization.

The price of $29.00 is per online user. If you are interested in bulk "Enterprise Licensing," please feel free to contact our friendly Client Relations Team.


 

Preventable Error in Skilled Nursing Facilities Costs $2.8 Billion

“An estimated 22 percent of Medicare beneficiaries experienced adverse events during their [Skilled Nursing Facility] stays. An additional 11 percent of Medicare beneficiaries experienced temporary harm events during their SNF stays…59 percent of these adverse events and temporary harm events were clearly or likely preventable...this equates to $2.8 billion spent on hospital treatment for harm caused in [Skilled Nursing Facilities] in FY 2011.”1

Startling numbers; imagine the good that $2.8 billion could do within healthcare if it were not needed to recover from preventable medical error. However, “preventable medical error” is a term used broadly to describe three very different behavioral choices. In order to achieve significant and lasting improvements in patient safety, we have to investigate further and look into the underlying behavioral choices that contributed to this “preventable medical error.”

For example, consider the issue of healthcare-associated infections, which account for 26 percent of these adverse events and temporary harm events within Skilled Nursing Facilities.2 Contributing to this percentage are infections caused by healthcare providers failing to follow proper hand hygiene. Should a patient acquire an infection because his or her healthcare provider did not follow hand hygiene protocols, this would be categorized as “preventable medical error.” Yet this “preventable medical error” could have resulted from three very different behavioral choices:

(1) The healthcare provider enters a patient’s room that is crowded with family. Distracted by the family, he forgets to wash his hands before handling the patient. This is human error. No conscious decision, merely a genuine mistake.

(2) The healthcare provider decides to not wash her hands. She has already washed her hands six times this morning and the patient only needs a quick routine check, so she decides washing her hands again is an unnecessary precaution. This is at-risk behavior, a choice that is risky but she does not see the risk.

(3) The healthcare provider is running late for his gym class. He knows how risky it is to not wash his hands before touching the patient but he decides to skip hand hygiene anyways in order to finish a few minutes faster. This is reckless behavior. He sees the risk and chooses the behavior anyways.

While resulting in the same “preventable medical error,” these behavioral choices are very different. Each deserves a very different leadership response, and each guides us as leaders toward how best to prevent the “preventable medical error” in the future. The challenge for Just Culture Champions is to never settle for the label of “preventable medical error.” Instead, investigate and learn from the underlying behavioral choices. It is through this process of understanding the Three Manageable Behaviors that we will achieve significant and lasting improvement to patient safety.
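As a rough illustration only (the function below is a simplified, hypothetical sketch, not the Just Culture Algorithm™), the triage of a “preventable medical error” into the three manageable behaviors can be expressed as a small decision function:

```python
# A simplified, hypothetical triage of the behavioral choice behind an error.
# This is an illustrative sketch, not the Just Culture Algorithm(TM).

def classify_behavior(made_conscious_choice: bool, perceived_substantial_risk: bool) -> str:
    if not made_conscious_choice:
        return "human error"       # a slip or lapse, e.g., forgetting hand hygiene when distracted
    if not perceived_substantial_risk:
        return "at-risk behavior"  # a choice made without seeing, or while discounting, the risk
    return "reckless behavior"     # a conscious disregard of a substantial and unjustifiable risk

# The three hand-hygiene scenarios above:
print(classify_behavior(made_conscious_choice=False, perceived_substantial_risk=False))  # human error
print(classify_behavior(made_conscious_choice=True,  perceived_substantial_risk=False))  # at-risk behavior
print(classify_behavior(made_conscious_choice=True,  perceived_substantial_risk=True))   # reckless behavior
```

The point of the sketch is simply that the same outcome maps to three different labels, and therefore to three different leadership responses.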

1 Department of Health and Human Services, “Adverse Events in Skilled Nursing Facilities: National Incidence Among Medicare Beneficiaries,” February 2014, p. 2; available at http://oig.hhs.gov/oei/reports/oei-06-11-00370.asp (emphasis added).

 

2 Ibid., p. 21.

 

System Changes and Risky Choices


Most employees will not actively choose to reduce their productivity; most people have enough self-preservation instinct to keep from making such an unwise decision. What can happen, however, and frequently does, is that an employee chooses to make small changes to their system. Usually their intentions are good; they’re trying to speed up the process by cutting out a secondary check or by rushing through a minor part of the process.

But those processes were specifically built over time in order to produce the outcome you desire with a minimal failure rate. Choosing to circumvent the process runs the risk of increasing that failure rate, and why would anyone do that? Perhaps they have created the outcome so consistently for so long without failure that they have forgotten that failure is an option. Or maybe they’re trying to improve their production rate. Or perhaps they just have the confidence in their own skills to continue to avoid failure without following all of the guidelines.

It’s not malicious intent. It’s just that their perception of risk has drifted. They have grown accustomed to the risk; it doesn’t seem as, well, risky as it should. As long as we learn to recognize this drift and get ourselves back on track, we’ll be fine. We can use periodic training to realign our sense of risk. We can change the systems we use to allow for more checks or to add redundancy. Easiest of all, we can keep the lines of communication open, letting workers remind one another of what can happen when we fail.

Once we recognize what’s causing the drop in productivity, we can do something about it. But it can take a little extra effort to look behind the symptom to find the real cause. That’s why we encourage a culture of learning as a vital part of a Just Culture.

When a Human Error Catches Negative Media Attention


What if a human error attracts negative media attention and risks harming your organization’s reputation, or compromises your mission or values? To err is human: we will make mistakes, and we will even make at-risk choices that lead to errors. Knowing this, should employees be punished if the general public is looking for retribution for a harmful outcome? What role does public opinion have in our management of employees involved in negative outcomes? Keep in mind that even public outrage is an outcome, and outcomes should not be taken into consideration in the evaluation of an event.

Human errors are not good things – yet we must anticipate that they will occur. When your investigation confirms that an event that has garnered public attention is the result of human error, you must look to next steps.

Ideally, we can settle negative attention in a Just Culture without compromising the commitments that we have made. Take a deep breath and be willing to stand behind what your organization has decided is the "right" way to practice and manage. Stand up and be open and honest: “Yes, a bad thing happened, but after a thorough investigation we have found that this tragedy is the result of a human error, one that any human could have made. Pursuant to that finding, and in light of the Just Culture principles we have supported since [date], we will continue to support and console the involved employees. We as an organization, however, have taken every step possible to ensure that the circumstances that led up to this mistake have been thoroughly investigated and we have enhanced our systems and processes to ensure that this tragedy cannot be repeated.”

Taking proactive steps to explain your organization's implementation of Just Culture to your local community may help you speak openly and honestly about the choices your organization will make when an event occurs. Punishing an employee based on public outcry, because the organization cannot withstand public pressure, can damage the trust that a Just Culture works to build between leadership and employees; your organization will have to decide if that’s a price worth paying.

Total Quality Management: Is there a trade-off involved?


Perhaps the greatest challenge for any organization is the management of competing values and the acceptance of design trades. When I speak of design trades, I am alluding to the fact that in any system design exercise, we often allow the infringement of one value in trade for the better fulfillment of a competing value. For example, in healthcare today we allow patients and residents the freedom to move around their rooms and the hospital while we provide care. This is, of course, done in fulfillment of patient satisfaction and other important values. That said, the design trade within this socio-technical system is an increased likelihood of a serious injurious fall. The value of safety is now compromised by the unassisted movement of a patient who has impaired mobility or other risk factors and who will engage in these activities for a multitude of reasons. With this in mind, we employ a number of different system design strategies to mitigate the risk of a serious injurious fall, such as bedside alarms, fall risk assessments and floor mats, to name a few. All of this speaks to a particular design law we must heed when designing socio-technical systems. Design law eleven of Outcome Engenuity’s sixteen design laws states, “All systems suffer from design trades – maximizing performance toward one value will ultimately hurt another value, or the mission itself. The closer we get to perfection toward any one value, the higher the costs to other values.”

The importance of this law becomes critical in the application of Total Quality Management philosophies such as Six Sigma and Lean Sigma. As a Six Sigma Black Belt and a Lean Sigma practitioner, I can appreciate the value that both of these approaches bring to the table. However, one of the challenges in applying these methodologies is ensuring that we have examined the system design through the lens of each value. In many instances we remove the variation within a process while examining that variation, or non-value-added step, through the scope of only one value, and we thus risk overly impinging upon another value that we did not consider in our design of experiments or workflow analysis. It is vitally important that as we employ TQM philosophies we keep in mind the design trades and the multitude of values we are attempting to uphold across our operations.
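Here is a minimal sketch of what examining a proposed change through the lens of each value might look like in practice; the values, weights, and impact scores below are hypothetical placeholders rather than a prescribed scoring method.

```python
# Hypothetical multi-value scoring of a proposed process change.
# Values, weights, and impact scores are placeholders for illustration only.

proposed_change = "remove secondary verification step to shorten cycle time"

impact_by_value = {         # estimated impact of the change on each value (-1.0 .. +1.0)
    "safety": -0.4,         # fewer barriers means more defects reach the patient
    "timeliness": +0.6,     # shorter process
    "cost": +0.3,           # less labor per cycle
    "patient satisfaction": +0.1,
}

weights = {"safety": 0.5, "timeliness": 0.2, "cost": 0.2, "patient satisfaction": 0.1}

net = sum(weights[v] * impact_by_value[v] for v in weights)
print(f"{proposed_change!r}: weighted net impact = {net:+.2f}")
for value, impact in impact_by_value.items():
    print(f"  {value:>20}: {impact:+.1f}")
```

A change that looks like a clear win through the lens of timeliness alone can come out roughly neutral, or negative, once the other values are weighed in.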

At-Risk Behavior

Just Culture defines three behaviors: human error, at-risk behavior, and reckless behavior. At-risk behavior is the tricky one to work with; it is the most prevalent and it takes many forms. Risky behaviors can be difficult to spot, and the employee may not even realize they are making a risky choice.

In most cases, the employee either doesn’t see the risk in the choice or doesn’t think the risk is significant. Maybe they’re a little too comfortable in what they’re doing. Maybe they think they’ve “got it under control.” Or maybe they’re just trying to improve their efficiency, skipping or glossing over steps they feel are not important. A risky choice can also be hidden behind a human error, making it more difficult to identify and correct. For example, a baker bakes a cake for the wrong length of time (human error) because she chose not to read over the recipe that she has used for many years (risky choice).

A Just Culture treats at-risk behavior with coaching. When an employee makes a risky choice, we talk to the employee about the risk, making sure they understand the how and the why of the situation. This is not considered punitive; it’s intended as part of the learning process. We need to ensure that the employee understands how the risk can threaten the values of the organization.

Risk is part of doing business; we will never escape it. But we can manage risk; we can be careful and smart about which risks to take. We can also learn from risks taken in order to avoid taking the same risks over and over. And we can encourage our people to let us know when they have taken a risk, whether it worked or not, and explore how it could have worked better.

Human Performance Management Process


People are a major part of our work systems. People are also prone to making mistakes, but that’s not always the whole story. Managing the human element means working with a diversity of variables, but let’s briefly touch on a few helpful strategies for controlling behavioral choices in our work systems.

Education is a fundamental strategy. For training purposes we make a distinction between knowledge and skill. We use the term knowledge when we refer to what people know, and skill relates to the ability to use and apply knowledge to perform a task or accomplish a goal. We can impart knowledge with training in a classroom setting, but skill comes with experience in using the knowledge to improve the system in some way.

In the analysis of a work process, we will note aspects that can affect the worker’s performance, in good ways and in bad. As we identify these, we can take them into account in our system, minimizing any negative effects that we find. For example, if workers are having difficulty with the controls of a certain operation, perhaps the controls can be re-ordered to make the operation flow more smoothly. Decades of experience have gone into the organization of an airliner’s controls with the intent of streamlining the process of flying the plane.

The ability to see and understand the risks in a work process is vital to the strategy of avoiding risk. Teaching your people why each step is on a checklist is an important part of their understanding of why they shouldn’t skip any of the steps. It’s also important that they understand how small problems can lead to bigger problems, especially in processes that can have catastrophic outcomes. Returning to the airplane example, we need the skills and training to spot failures before the plane takes off. If we catch the small risks and failures, it helps us avoid the really bad outcomes.
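One way to make the “why” behind each step visible is to carry the rationale along with the checklist itself. The sketch below is a hypothetical illustration; the steps and their wording are invented for the example.

```python
# A checklist in which each step carries its rationale, so skipping a step
# visibly means skipping a known risk control. Steps are hypothetical examples.

preflight_checklist = [
    ("Verify fuel quantity against the flight plan", "running out of fuel en route is unrecoverable"),
    ("Check that control surfaces move freely", "a jammed control surface is catastrophic after takeoff"),
    ("Confirm door and latch indicators", "small pre-departure faults can become large in-flight failures"),
]

for step, rationale in preflight_checklist:
    print(f"[ ] {step}")
    print(f"      why: {rationale}")
```

Pairing each step with its reason reinforces the risk perception the paragraph above describes.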

These strategies are all designed to improve communication with your people and help your team produce better outcomes. What are some examples of ways you help keep your team on track? Let’s discuss it in the comments below.

Qualitative versus Quantitative Risk Models

A socio-technical risk model can be both qualitative and quantitative; the qualitative model illustrates system design, while the quantitative model becomes a guide to actual practice. As a qualitative risk model, the model itself is a visual representation of basic events and how they combine to produce an undesired outcome. A qualitative model shows the inter-relation between equipment failures, human errors, and at-risk behaviors that can combine to produce the undesired result. With probability estimates included, the model provides a quantitative estimate of the relative risks, from one branch to another and from one failure combination to another. Quantitative risk models are the higher-value trees; however, they also face criticism associated with the uncertainty of their numbers.

These quantitative models better represent the risk factors we might expect to see in an operational environment, including the contribution of at-risk behaviors. At this point, there are two paths to complete the model – working back from the top-level event, or working up from the bottom-level events. If we know the top-level risk, we might adjust the numbers below to obtain an approximation of what is happening in the actual system.

For example, if we knew that an organization was experiencing five events per year, we would have to drastically change the error rates (beyond the point of reasonableness) or add in rates to the at-risk behaviors to produce a model that predicts what is actually happening. The model might now predict a rate that matches actual event occurrences, but are the underlying estimates reasonable, or accurate? While this model is quantitative, it is far from being considered the objective “truth.” This, however, is not the point. Does the model give us information that will guide further inquiry? Does the quantitative model provide us with more information than the merely qualitative model? The answer is “yes” – it provides a visual representation of risk that can guide decision-making around further inquiries into the sources of risks, along with giving the organization the ability to update probability estimates through specific data collection and event data to understand if norms or behaviors have changed.
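As a simplified sketch of the quantitative side, a small fault-tree fragment might look like the following; the gate structure and all probabilities are hypothetical illustrations, not figures from an actual ST-PRA.

```python
# A toy fault-tree fragment: the top-level event occurs when an initiating failure
# (a human error OR an at-risk behavior) combines with a failed downstream barrier (AND).
# All probabilities are hypothetical per-opportunity estimates, for illustration only.

p_human_error      = 1e-3   # slip or lapse on a given opportunity
p_at_risk_behavior = 5e-3   # e.g., skipping an independent double-check
p_barrier_failure  = 1e-2   # the downstream barrier fails to catch the problem

# OR gate (independent events): P(A or B) = 1 - (1 - A) * (1 - B)
p_initiating = 1 - (1 - p_human_error) * (1 - p_at_risk_behavior)

# AND gate (independent events): P(A and B) = A * B
p_top_event = p_initiating * p_barrier_failure

opportunities_per_year = 100_000   # hypothetical annual volume
print(f"P(top event per opportunity) = {p_top_event:.2e}")
print(f"Predicted events per year    = {p_top_event * opportunities_per_year:.1f}")
```

The numbers themselves are not the point; the value is that the model makes explicit which branch (the error, the at-risk behavior, or the barrier) dominates the predicted rate, and therefore where further inquiry should go.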

The Role of the Imposer


Let’s talk about imposers and their role in your organization. These are the people we allow, elect, or appoint to hold us to task, to keep us on track with our rules and our values.

The most obvious example of an imposer, or at least the one everyone is familiar with, is the government. In order to have a functioning society, we have to assign people and resources, including laws, to the maintenance of the society. The people can be represented by police officers, tax collectors, and military personnel, just to name a few.

Imposers are granted the authority to reward or to punish based on the set of rules or guidelines in place. The organization granting the authority places the imposer in a potentially awkward or difficult situation, so it’s important for the organization to fully support and cooperate with its team members. It needs to trust the people it empowers, and it needs to get and use feedback from them as well. The government has laws in place to support the police and tax officials, for example. It’s really easy to forget about the reward part of this process, also. Much emphasis is placed on the negative, but positive reinforcement is a great way to boost morale and encourage good choices.

In a corporate setting, managers and supervisors take on the imposer role. They are “in the trenches” alongside your employees, so they are the ones regularly tasked with praise and sanction. To make that job easier for them, you’ll want to have your expectations as clearly established as possible for the sake of consistency. Supporting the imposers as well as the employees will give your team the strong foundation it needs to succeed.

How do you empower your imposers? What do your imposers enforce? How do they punish? How do they reward? Continue the discussion in the field below.

Define Justice


Justice is a word we in the Just Culture Community throw around quite a bit. It’s also a word with some very specific connotations, so let’s look into what it means and why we use it so much.

It’s no wonder people get confused when thinking about justice; the dictionary has quite a variety of definitions listed, most of which depend on the context. We see terms like morally right, factual, and legally accurate in there, and it doesn’t do much to delineate these widely varied ideas. One that comes up quite a lot, though, is fairness. And fairness is a more complicated idea than most people realize.

In a Just Culture justice is primarily addressed in terms of accountability; we have to hold people accountable for the choices they make. In order to effectively do this, we need to be able to distinguish what is a choice and what is out of the control of the individual, specifically that which is controlled by the system. Documenting the work process allows us to gauge whether the failure occurs in the process or in the choice. We can then address the process flaw or hold the employee accountable for choosing to vary from the process.

The intent is to strike a balance between the overly punitive and the blame-free cultures. When we punish every slight, people will hide everything they do. When we hold no one accountable, people will lose the desire to do their best. The middle ground is to set a standard and hold people to it consistently.

We need to be consistently and repeatably fair and just, both to our employees and to the company. The ability to do that is key to building a Just Culture.

How to Create System Designs that Activate Your Internal Risk Monitor

Hello from above – I write this article while cruising at 30,000 feet, returning home after spending 10 days conducting and building a Socio-Technical Probabilistic Risk Assessment (ST-PRA). I wanted to share some of the general insights we had over the past couple of weeks while conducting the modeling exercise. First, there is a popular cliché floating around that states, “Culture eats systems for lunch.” I wanted to take a moment to examine this statement with more deliberate intent.

As with many things, we humans tend to think in black and white, forgetting to examine the middle ground. In the case of culture eating systems for lunch, this only tends to be true in socio-technical systems that are heavily dependent upon behaviors, such as medication ordering or aircraft maintenance processes. These are the types of systems that require human participation with a relatively high degree of thought; they often have a substantial risk of human error (often due to operator drift). In other words, culture can decimate behavioral systems. A structural system, by contrast, is less human-centered. Consider the barcoding process used on aircraft boarding passes: a scanner sounds an alarm when the bar code does not match the flight, rather than trusting a human to verify the information on every ticket. Such systems are much less vulnerable to culture and behavioral choices. These are systems with strong cues, such as a shadow board or a bar code, which say to the operator, “a component is missing or something is wrong,” giving a warning of impending error regardless of whether the choice to review properly was made. Thus, when we think about good system design, we must design systems with the necessary structure or layers of redundancy to be fault (culture) tolerant.
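A minimal sketch of that kind of structural barrier follows; the flight numbers and the check itself are hypothetical, and the point is only that the cue fires regardless of whether the operator chose to verify.

```python
# A structural barrier: the check fires whether or not the operator chose to verify.
# Flight numbers and alarm behavior are hypothetical illustrations.

def scan_boarding_pass(ticket_flight: str, gate_flight: str) -> bool:
    """Return True if the passenger is at the right gate; sound an alarm (printed here) otherwise."""
    if ticket_flight != gate_flight:
        print("ALARM: wrong flight, do not board")  # cue fires independent of human vigilance
        return False
    return True

scan_boarding_pass("OE123", "OE123")  # passes silently
scan_boarding_pass("OE456", "OE123")  # alarm fires even if the agent never looked at the ticket
```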

Secondly, I wanted to explore the concepts of System 1 and System 2 thinking within system design, giving due credit to Daniel Kahneman and his book Thinking, Fast and Slow, along with my colleague David Marx and his examination of the internal risk monitor. These concepts involve the activation of our risk monitor and how, when it is activated, it shuts down our System 1 thinking – that cognitive state of unconscious reasoning and parallel processing that is automatic, error-prone, and subject to strong confirmation bias. Our daily commute home is often a place where we operate on System 1; we are following a set of instructions or processes that we perform regularly with little variation. With System 1 shut down, we engage our System 2 thinking – conscious, slow, serial, effortful reasoning that is less error-prone than the System 1 state. The daily commute home in an ice storm may be an instance where System 2 is activated; in this case we have added significant risk to the usual process, forcing ourselves to pay more attention to the road and its additional hazards. It is from this place (System 2) that we humans can become much more reliable in an environment that is inherently risky.

That is to say, our system design should create interventions, like the warning systems on your car that indicate the road is icy, that activate the internal risk monitor and engage our System 2 thinking when behavioral choices can affect system reliability. At the same time, it should ensure we have the appropriate structure or layers of redundancy to manage the influence of culture and behavioral choices on our system. Combining the concepts of cultural tolerance and System 1 and System 2 thinking within our socio-technical system design gives us the opportunity to manage inherent risk with a more holistic approach.

Post by: John Westphal, Advisor.

John Westphal is an advisor who has been with Outcome Engenuity since its inception. John builds tools and helps our clients learn the five skills. John can be reached by email at jwestphal@outcome-eng.com.

The Label “Reckless” and the Learning Culture


Have you ever tried having a conversation with someone when the other party has already decided that you were in the wrong? Not the easiest feat, is it? In fact, I wouldn’t be surprised if you felt like you didn’t even care to have that conversation at all, seeing as the other person had already made up their mind about you. Perhaps you felt that to even speak would only add more fuel to their accusations.

Your employee is likely to have the same response if, while you are investigating the event, he or she feels that you have already decided that he or she was reckless. Once a label of “reckless” has been applied, the investigation and the conversation—and the opportunities to learn and prevent future risk or harm—will shut down. Even the perception that judgment has already been passed can cause an employee to withdraw from the conversation, making it difficult to learn more about the incident. This is one reason why we recommend getting as much information as possible about an incident before your discussion turns to what should have happened or what procedure requires. If an individual believes judgment has already been rendered, and that it is unfavorable, they are likely to be on the defensive and not inclined to help you better understand what happened.

This is not to say that you should be hesitant to identify a behavior as a reckless choice if it genuinely is reckless. That may be the most accurate assessment of the person’s choice, and accountability is called for. But before coming to that determination, some caution is advised. Ask yourself, “Is there anything more I can or should learn from this situation?” If the answer is “yes,” this indicates that perhaps this is a time to ask more questions, to air out the situation for further review and analysis. For once a behavior has been deemed “reckless,” those learning opportunities are likely to be closed.