Budget For Culture: How Investing In Your Team Drives Results – Forbes article

Forbes.com recently posted an article addressing the benefits an organization receives when it invests in its culture and how that investment communicates its values. For your convenience, we have posted it here so you may continue reading below.

As a leader, every decision you make shapes your organizational culture, and when it comes to budgeting your limited resources, these decisions send powerful messages to your people about what’s most important. After all, money doesn’t just talk — it shouts your priorities through a bullhorn. You have to make budgeting decisions that drive your business’s strategy and goals. But too often, the technical aspects of your strategy are prioritized over the most important facet of your organization’s long-term performance: the people.

The best plan in the world won’t survive if your people aren’t on board. But if you support your employees and nurture their enthusiasm, they’ll take care of your business. In fact, investing in your people can reap rewards that ripple across your entire organization and beyond. According to Gallup, organizations with above-average levels of employee engagement see 147 percent higher earnings per share. Furthermore, when they engage both customers and employees, organizations experience a 240 percent jump in performance-related business outcomes. Clearly, you need to start investing in culture.

The concept may still seem abstract, so here are six concrete aspects of organizational culture to focus on:

1. Recruitment, orientation, and the employee experience: A new team member’s impression of how you treat employees is set from the beginning. Even during the recruiting process, the way candidates are treated sends a clear message about your company culture. These messages about expectations and a person’s value are reinforced during the onboarding process. With this in mind, you need to be thoughtful about your employee experience throughout their tenure with you and make it as seamless and supportive as possible. This kind of investment will pay dividends down the road.

2. Professional and leadership development: It’s not uncommon for business leaders to create strategies that require a significant shift in employee behavior to succeed. However, if you’re asking employees to do things differently, you need to anticipate their apprehension.

By setting aside resources to train your employees in the knowledge, skills, and abilities they’ll need to implement your plan, they’ll see that you’re serious about your changes and are willing to support them through the transition.

3. Compensation and incentives: Compensation is a massive and complex topic in business — one whose importance can’t be overstated. As a professional services firm, the lion’s share of my company’s budget goes to compensation. Our team members are expected to dedicate a lot of time and energy to the success of our clients, and they’re paid as well as possible because we value and trust their abilities.

People’s total compensation (not just their base salary) will drive all sorts of behaviors, but your plan must be designed thoughtfully. If it’s not, you may find yourself in a no-win situation with employees behaving in ways that maximize their personal gain but don’t move your organization forward.

4. Rewards and recognition: Like compensation, rewards and recognition require resources, but they also send clear messages to your people about what behaviors are acceptable and encouraged and which are not.

Finding creative ways to recognize people who are creating value in your business is worth its weight in gold. Rewards and recognition aren’t one-size-fits-all strategies, though. Different people value different things, so you must take the time to get to know your team members and develop an understanding of what incentives will be the most appreciated.

5. The physical environment: The space in which people work can promote desired behaviors, but it can also be used to reinforce what’s most important to you in less direct ways. Put careful thought into the design of your office space. If your strategy dictates significant changes in how people do their jobs, you may need to make extra room in the budget to align their workspaces with your expectations.

6. Tools and equipment: When you’re budgeting to drive your strategy, a final key consideration is whether your people have the proper tools and equipment to fulfill your expectations. Outfitting your team with the wrong equipment is a recipe for disaster. You can’t ask your team to get to the moon with a roll of duct tape and a spatula; inadequate equipment will only hold them back from accomplishing your overarching goals.

If you fail to think more holistically about the “what” and the “how,” your perfect business strategy will be left on the launch pad, unable to take off. Don’t let all that planning go to waste by ignoring the needs of the people who make your strategy effective. Investing in ways that communicate how much you value team members will drive the behaviors you need to reach your goals this year.


Military hospitals looking to Just Culture for the answer?


Mass. hospitals show how to fix military medical care




Courtesy Boston Globe / Associated Press
Army Surgeon General Lieutenant General Patricia Horoho speaks about military health care at the Pentagon in October.
Source: http://www.bostonglobe.com/opinion/editorials/2015/01/05/mass-hospitals-show-how-fix-military-medical-care/2IWh1zeNGC2goyYXho0KeP/story.html

Military hospitals charged with one of the country’s most important missions — serving active duty personnel — are roiled by dysfunction. As reported by The New York Times over the last several months, military hospitals suffer from chronic lapses in patient care and safety. Outgoing Secretary of Defense Chuck Hagel addressed the problem in October, when he ordered the military health system to reassess and revamp its procedures. But it might take nothing less than an act of Congress to change practices and procedures that are ingrained in military culture.

The command and control system that works well on the battlefield puts the military health care system out of touch with most modern medical institutions, where questioning of the system is a crucial component of everyday practice. The latest Times report described a system in which physicians and nurses who point out lapses in care are transferred or passed over for promotion, compromising patient safety and quality of care.

The Times report found that two areas of treatment in the military health system were particularly vulnerable — maternity care and surgery. A Pentagon review of the military’s hospitals found a systemwide problem: a reluctance by medical workers to identify problems, for fear of reprisal.

The reluctance to report errors is understandable. But in a medical setting, decision-making can literally be a matter of life or death — which is why civilian hospitals and medical centers have been working hard over the past 20 years to encourage “blame-free” reporting.

At three of Boston’s biggest hospitals, various high-tech systems for reporting errors are in place. Such a system is sometimes called a culture of safety or, after one model that was developed in the late 1990s, “just culture.” Massachusetts General Hospital, Brigham & Women’s, and Beth Israel Deaconess Medical Center all follow some version of the “just culture” model for reporting errors. Anyone from a janitor to a nurse to a surgeon is encouraged to report errors in a non-punitive environment, and there are active campaigns to encourage reporting. The principles of “just culture” shift blame from the individual to the system as a whole.

To gather these reports, hospitals establish websites available to all employees. The reports are vetted and analyzed, with protocols for follow-up. In some cases, individuals are held accountable for a decision that’s seen as reckless. But for the most part, “just culture,” says Karen Fiumara, director of patient safety at Brigham & Women’s, describes “a culture of trust and shared accountability.”

Such a reporting system sounds like common sense. But “just culture” is antithetical to the military hospital system for a very basic reason: chain of command. As hospital administrators point out, the “just culture model” won’t work unless leadership insists on it. The assistant secretary of defense for health affairs, Dr. John Woodson, an Obama appointee, has made strong statements about reforming the system, but his power is restricted to making policy recommendations. He cannot give orders to military commanders, and they’re the ones charged with running military hospitals.

One person who does have responsibility for change is the Army Surgeon General, Lieutenant General Patricia Horoho. Horoho has issued a statement demanding transparency regarding patient safety, and she has won praise from at least one member of a civilian agency in charge of inspecting and accrediting hospitals. “I applaud the way she’s handled the situation,” Dr. Ronald M. Wyatt said in an interview, adding that hers are the kind of actions “that resonate throughout the system.”

But the system, as it’s structured now, is working against Horoho, a decorated Army nurse. For one, commanders rotate out of assignments approximately every three years. And there’s no guarantee that Horoho herself, who has been Army surgeon general since 2011, will remain in her job much longer. “Imagine if the CEO at a civilian hospital changed every three or four years,” said Wyatt.

The problems in leadership stability are also compounded by the fact that the military hospital system is divided into three units, one for each branch of the armed services. What’s more, the system — whose primary mission is to train medical personnel for combat — is under strain after 12 years of war.

Clearly a system overhaul is required, one that at the very least involves the implementation of a stable leadership program in which just culture protocols are implemented. At best, the system would be streamlined, unifying all the branches of the military into one hospital system. Military service men and women put their lives at risk regularly overseas. They and their families shouldn’t be put in harm’s way when they seek medical help at home.

Healthcare Associations Can Offer Deep Discounts To Its Members – Here’s How

Leaders around the world are supporting Just Culture initiatives

By becoming a Just Culture Connector, you can support your members in their initiatives and lead them in their journey through any of these three options:

  • Host a course for your hospitals and get 5 free seats
  • Promote a course with an exclusive member discount code for registrants
  • Support an OE Just Culture statewide initiative and offer discount Enterprise Licensing for all of your hospitals

How ‘Zero’ Language Can Hurt More Than It Helps


From time to time in the Just Culture community we encounter various uses of the term “zero.”

They usually fall into two categories. The first concerns how many of a particular event an organization seeks or expects to have, as in “The goal is zero.” The second use we often see, which can appear in conjunction with the first, is a notion of intolerance for any violation of a specific rule or policy, as in a “zero tolerance” rule. I will disclaim that, generally, in Just Culture we try to avoid the use of “zero language” in organizational initiatives or policies. However, since many Just Culture organizations operate in a world of inherited language sets and competing ideologies, we felt it appropriate to address some of the common pitfalls of these ideas.

In thinking about “zero” as a goal, the major consideration is that while zero can serve as an aspirational goal, it cannot be an expectation or standard, because ALL systems will fail at some rate. When we are intellectually honest, we must acknowledge that, given enough time, “never ever” is a statistical impossibility. Even setting the inescapably fallible human component aside, metal will still rust, engines will still seize, and circuitry will corrode over time. Whether failure is a result of mechanical limitations or of the behavioral choices of the human components, all systems possess a statistical rate of failure. Really great systems, obviously, fail at very low rates. But perfection, unfortunately, is just not in the cards. That’s the bad news.

The good news, however, is that we can get better! We can improve our rates by designing systems with built-in recoveries, redundancies, and barriers that are error tolerant and robust enough to get us pretty close to zero in some cases. Take, for example, the commercial aviation and nuclear industries. They are widely regarded as among the highest in reliability, and yet the systems they have designed do not produce adverse outcomes at a rate of zero. Commercial aviation has designed a system to a standard of one catastrophic event in a billion, and has in large part achieved that rate. In the world of system design that figure is remarkable when you consider that Six Sigma’s whole premise is to get to 3.4 defects per million opportunities.

Even though these catastrophic events are absolute tragedies when they occur, they make headlines in part because they are so rare. And though it will never be said this way: as a society, and as an industry, we generally accept that rate. We accept it because we value our transportation options and acknowledge the inherent risks of doing business. We’ve looked at the one-in-a-billion rate (which averages out to approximately one plane crash a year), we’ve weighed the alternatives, and ultimately we’ve collectively said that we can live with it. (What’s really astounding is that, with our roadway system as it is currently designed, we also choose to accept a rate of around 30,000 automobile fatalities annually in the US.)
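The arithmetic behind these rates is easy to make concrete. A minimal sketch follows; the yearly exposure counts are illustrative assumptions of my own, not figures from the article:

```python
# Back-of-the-envelope arithmetic for the failure rates discussed above.
# The exposure counts are illustrative assumptions, not figures from the text.

def expected_events(per_exposure_rate, exposures_per_year):
    """Expected number of failure events per year at a given rate."""
    return per_exposure_rate * exposures_per_year

# Aviation design standard: ~1 catastrophic event per billion exposures.
# Assuming roughly a billion exposures per year (a hypothetical volume),
# this yields the "approximately one crash a year" average mentioned above.
aviation_events = expected_events(1e-9, 1e9)

# Six Sigma target: 3.4 defects per million opportunities.
six_sigma_defects = expected_events(3.4e-6, 1_000_000)

print(round(aviation_events, 3), "catastrophic events/year")
print(round(six_sigma_defects, 3), "defects per million opportunities")
```

The point of the comparison: even a process that meets the Six Sigma target fails millions of times more often, per exposure, than the aviation design standard.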

In order to improve our systems in a world of limited resources we often have to make tradeoffs with other values to do so.

But perhaps you don’t accept that rate. Perhaps you want a better rate of one in 10 billion or one in 20 billion (to continue the aviation example). OK, that’s fair, but realize that doing so comes at a cost. In order to improve our systems in a world of limited resources, we often have to make tradeoffs with other values. Anyone who has ever been stuck at an airport terminal waiting on “maintenance” knows that when we stress safety, it impacts other values like timeliness, customer service, and, from the airline’s perspective, profit margins. So when designing systems as an organization, you have to find a rate you can live with, realize that it can’t be zero, and then figure out what you are willing to trade to get there.

The second use of zero language we see is the “zero tolerance” policies or rules that organizations use when trying to emphasize a particular safety risk. Here is where I will issue some major cautions about how these rules relate to Just Culture. Oftentimes, in efforts to alter perceptions of risk around a particular class of event, we opt to draw lines in the sand and implement artificial danger to help our employees perceive the risk. Forgoing a time investment in coaching around the actual harm, our desperation for immediate improvement leads us to give the issue a very serious tone and attach punitive threats. Organizations can forget that in a Just Culture, altering perceptions of risk takes time and deliberate effort. It is true that in Just Culture we will, in fact, leverage artificial danger (mainly a progressive disciplinary path) from time to time, but we go awry when we implement rules that no longer require us to evaluate the employee’s behavioral choices.

Our challenge to organizations is to self-evaluate ‘zero rules’ by asking the following three questions:

  1. Could an employee violate this rule by human error?
  2. Could an employee violate this rule while genuinely not perceiving the risk (despite previous training)?
  3. Are we now saying something different about the way we intend to deal with these behaviors?

The reality is that in Just Culture, we can really only have a zero-tolerance policy toward reckless and above behaviors. Any other punitive actions, save for repetitive errors or at-risk behaviors, are simply not aligned with Just Culture. “Zero language” is not inherently wrong, but it can be a slippery slope. Even when the term “zero tolerance” is not associated with punitive action, it can at the least be confusing to staff: if one rule is a “zero” rule and not to be broken, what does that imply about all the other rules?

At the end of the day, we must proactively protect our Just Culture commitment from the line of thinking that, just because we’ve made expectations clear in the past, or because we’ve trained around specific procedures and risks, our staff can or will no longer make human errors and at-risk choices. We must constantly defend against our ever-present severity bias and refuse to undermine our open learning and reporting culture under the guise of zeal. We strongly urge the Just Culture community to be cautious with zero language in our expectations and policies, and to realize that even Just Culture organizations are susceptible to drift over time.

Download “Whack-a-Mole” Digital Copy

David Marx says, "Give it away...". Whack-a-Mole: The Price We Pay For Expecting Perfection, by David Marx, is now offered as a digital copy.



Whack-a-Mole: The Price We Pay For Expecting Perfection explores the role of human error in society, from aviation and healthcare, to driving and parenting—and where accountability rests for those errors, especially when they take the life of another. David Marx argues that regulatory and human resource prohibitions, along with the criminal prosecution of human error, have been counter-productive to helping society deal with the risks and consequences of human fallibility. Marx advocates a different approach to addressing our shared fallibility.

Scroll down to get your digital download of Whack-a-Mole: The Price We Pay For Expecting Perfection by David Marx, JD, CEO of Outcome Engenuity, father of Just Culture, and engineer of The Just Culture Algorithm™.

-Learn More About Just Culture-  -Just Culture Training Events-  -Event Investigation/Root Cause Analysis-

Evidence: The Ultimate Game Changer

Authored by Joan Ellis Beglinger, MSN, RN, MBA, FACHE, FAAN

My first serious exposure to applying David Marx’s Just Culture model (1) to a real-world event was unfortunately born of tragedy. As the Vice President for Patient Care / Chief Nursing Officer (CNO) of a large, tertiary medical center, I was called while on vacation, on a beautiful July day in 2006, to be informed that a young, laboring mother had died as a result of a medication error. The impact of this event was profound, lasting, and life-changing for many people. When it happened, our organization was in the early stages of integrating Just Culture and had limited application experience. The utility of this approach in helping us understand and learn from this tragic event was immediately evident. The logic of Just Culture made incredible sense: in order to prevent this tragedy from happening again, we needed to look at the system design of our hospital and learn from the choices of our employees within that system.

In the subsequent years, our organization took serious steps to integrate Just Culture. Every employee participated in educational sessions designed to provide the fundamentals of this new way of thinking about our work, and the Administrative Team taught these courses to underscore their importance. The Just Culture Algorithm™ was incorporated into our Root Cause Analysis (RCA) process, and we gained a new understanding of our duties and of the importance of understanding different behavioral choices when determining the appropriate organizational response when things go wrong. We came to truly believe that, despite our best efforts to be conscientious in the service of our patients, we would always make mistakes as an elemental function of our humanity. Further, we understood that to create an environment in which people would surface errors, admit mistakes, and seek to continuously improve, it was vital that our approach be driven by principles of fairness and justice.

Few things I have learned on my personal developmental journey, throughout nearly 30 years of administrative practice, have impacted my thinking and my approach to the extent David Marx’s Just Culture has, and naturally I have begun considering how these principles can and should apply to management decisions. The executive decision makers of an organization create the conditions, at the point of care, that enable or impede excellence in clinical practice. The allocation of resources serves as the foundation for all of the work that is done. But despite the criticality of these decisions, we have not sufficiently addressed how to hold management accountable when they make risky choices.

...we must refine our decision-making process, especially now as “the way we’ve always done it” will no longer suffice in the face of evidence to the contrary

When we evaluate the behavioral choices of a clinical professional, we are essentially asking, “What would a competent and conscientious professional have done in this situation?” We consider what the individual actually did and compare it to what similar professionals, his or her peer group, would have done under similar circumstances. The context for this evaluation includes the standards by which we practice. These may include organizational policies, procedures, and expectations, but also appropriate clinical practice, which is driven by the best available evidence.

Management practice is strikingly lacking in similar evidence to drive decision making. We lag significantly behind our clinical colleagues in research to support our actions, and managing “the way we’ve always done it” is common. We make critical decisions based on our intuition or our experience (or both). We tend to think in linear terms when our complex world demands a systems approach. But we must refine our decision-making process, especially now, as “the way we’ve always done it” will no longer suffice in the face of evidence to the contrary.

The sweeping changes of the Affordable Care Act, specifically reimbursement pressure on hospitals, have the issue of executive decision making looming prominently on my mind. In the absence of evidence to guide decision making, leaders of our industry have frequently engaged in the same detrimental pattern of response to declining reimbursement, despite the fact that it has never produced sustainable results. The cycle goes something like this: in the face of decreasing revenue, we perceive an urgent need to cut costs in order to maintain a margin. We identify nursing as the largest cost. We are certain the “new normal” will require us to learn to do more with less. We cut staffing at the point of care. We congratulate ourselves for the significant expense reduction (on paper, in the short term). The flurry of activity subsides, and there is no systematic, longitudinal evaluation of the impact of the cuts. (By design or by default?) We have no substantive analysis of the impact on patient outcomes, patient satisfaction, nurse engagement, nurse turnover, or physician engagement, for example, though the anecdotes are endless. Over time, as a result of the unworkable environments created by this activity, resources are gradually added back.

In recent years, there has been significant research examining the relationship between professional nurse staffing, the professional practice environment, and patient outcomes. We have learned with certainty that key outcomes such as mortality, “failure to rescue,” preventable complications, and readmissions are directly linked to both professional nurse staffing and the professional practice environment. We understand that a nurse can be stretched too thin to perceive subtle changes in patient condition or to effectively coordinate post-discharge care. We can now demonstrate, through substantial evidence, that organizations with the best staffing and positive practice environments produce the best results (2-7).

We have crossed the critical threshold that requires a change in management practice. We must transition from individual preference to evidence-based decision making as it relates to nurse staffing, and it is time to apply the principles of Just Culture to executive decision making with the same rigor and enthusiasm with which we apply them to clinical practice.

A useful analogy is the change in practice that occurs in the clinical setting with the emergence of evidence. Consider a familiar example: care of the surgical patient. In years past, the care of the surgical patient was largely driven by surgeon preference. There were significant variations in practice and, in time, research demonstrated a corresponding variation in outcomes. Over time, research demonstrated that there is a “best practice.” National standards evolved, and the Surgical Care Improvement Project (SCIP) was born (8). Today, in surgical settings across the country, a standard approach to care of the surgical patient has replaced individual surgeon preference. It is inconceivable that the hospital or medical staff leadership in any setting would tolerate a surgeon or nurse veering from the standards. It is considered a matter of patient safety.

The evidence now available to us regarding professional nurse staffing requires us to abandon the arbitrary decision making of the past. Determining appropriate staffing requires an analytical approach with a deep understanding of patient care, analysis of the population being served, evaluation of the capacity of the nursing organization, consideration of the geographic setting in which care is provided, the support systems available, and meaningful voice from the direct care nurses.

David Marx (1) taught us that the most important duty we as human beings have is to avoid causing unjustifiable risk or harm to one another. When we continue to rely on cutting nurse staffing as a short-term financial solution, knowing this decision will significantly increase the risk of mortality, failure to rescue, and preventable complications, is that not a violation of our most important duty? Is it not time to apply the same standard to our executive choices as to the behavioral choices of our clinical colleagues? The answer is zero tolerance for repeated risky decisions, in accordance with the Just Culture model. The evidence has crossed a critical threshold, and evidence is the ultimate game changer.



Principal / Joan Ellis Beglinger, Designing Tomorrow - Madison, WI

Consultant / Tim Porter-O’Grady Associates, Inc. - Atlanta, GA

Contact Joan by email.


1. Marx D. Whack-a-Mole: The Price We Pay For Expecting Perfection. Plano, TX: By Your Side Studios; 2009.

2. Aiken LH, Cimiotti JP, Sloane DM, et al. Effects of nurse staffing and nurse education on patient deaths in hospitals with different nurse work environments. Med Care. 2011;49:1047-1053.

3. Aiken LH, Sloane DM, Cimiotti JP, et al. Implications of the California nurse staffing mandate for other states. Health Serv Res. 2010;45:904-921.

4. McHugh MD, Ma C. Hospital nursing and 30-day readmissions among Medicare patients with heart failure, acute myocardial infarction, and pneumonia. Med Care. 2013;51:52-59.

5. McHugh MD, Berez J, Small DS. Hospitals with higher nurse staffing had lower odds of readmissions penalties than hospitals with lower staffing. Health Aff. 2013;32:1740-1747.

6. Needleman J, Buerhaus PI, Stewart M, et al. Nurse staffing in hospitals: is there a business case for quality? Health Aff. 2006;25:204-211.

7. Needleman J, Buerhaus P, Pankratz VS, et al. Nurse staffing and inpatient hospital mortality. N Engl J Med. 2011;364:1037-1045.

8. The Surgical Care Improvement Project. Joint Commission; available at http://www.jointcommisision.org/surgical_care_improvement_project/. Accessed January 27, 2014.



Safe Choices Training For Staff Online


Once your entire managerial team is proficient in Just Culture principles and the use of the Algorithm™, it’s time to get your front-line employees on board. We recommend that you educate all front-line employees on their role in the organization’s learning culture, their ability to impact system design, and their ability to influence outcomes and organizational success through safe behavioral choices.

To better assist you, the one-hour online Safe Choices™ Training for staff is designed to provide your employees with a high level overview of the Just Culture concepts. The purpose of this course is to highlight the impact of an employee’s role in the Just Culture at your organization. Not only is this training about making choices that impact safety, but it is also about aligning values, designing better systems, and about how we communicate with each other.

The online training consists of:

  • A five-minute introduction video that outlines some Just Culture basics
  • A 23-minute movie that shows everyday scenarios where staff can observe and apply Just Culture principles
  • Six modules that reference back to the movie, each ending with a question or two that checks the employee’s learning progress

Upon completing the Safe Choices™ Training for staff, each employee will be on their way to making safer choices and having an impact in the Just Culture of your organization.

The price of $29.00 is per online user. If you are interested in bulk "Enterprise Licensing," please feel free to contact our friendly Client Relations Team.

Provided below are a few screenshots of the online training materials.






Preventable Error in Skilled Nursing Facilities Costs $2.8 Billion

“An estimated 22 percent of Medicare beneficiaries experienced adverse events during their [Skilled Nursing Facility] stays. An additional 11 percent of Medicare beneficiaries experienced temporary harm events during their SNF stays…59 percent of these adverse events and temporary harm events were clearly or likely preventable...this equates to $2.8 billion spent on hospital treatment for harm caused in [Skilled Nursing Facilities] in FY 2011.”1

Startling numbers; imagine the good that $2.8 billion could do within healthcare if it were not needed to recover from preventable medical error. However, “preventable medical error” is a term used broadly to describe three very different behavioral choices. In order to achieve significant and lasting improvements in patient safety, we have to investigate further and look into the underlying behavioral choices that contributed to this “preventable medical error.”

For example, consider the issue of healthcare-associated infections, which account for 26 percent of these adverse events and temporary harm within Skilled Nursing Facilities.2 Contributing to this percentage are infections caused by healthcare providers failing to follow proper hand hygiene. Should a patient acquire an infection because his or her healthcare provider did not follow hand hygiene protocols, this would be categorized as “preventable medical error.” Yet this “preventable medical error” could have happened due to three very different behavioral choices:

(1) The healthcare provider enters a patient’s room that is crowded with family. Distracted by the family, he forgets to wash his hands before handling the patient. This is human error. No conscious decision, merely a genuine mistake.

(2) The healthcare provider decides to not wash her hands. She has already washed her hands six times this morning and the patient only needs a quick routine check, so she decides washing her hands again is an unnecessary precaution. This is at-risk behavior, a choice that is risky but she does not see the risk.

(3) The healthcare provider is running late for his gym class. He knows how risky it is to not wash his hands before touching the patient, but he decides to skip hand hygiene anyway in order to finish a few minutes faster. This is reckless behavior. He sees the risk and chooses the behavior anyway.

While resulting in the same “preventable medical error,” these behavioral choices are very different. Each deserves a very different leadership response, and each guides us as leaders in how best to prevent the “preventable medical error” in the future. The challenge for Just Culture Champions is to never settle for the label of “preventable medical error.” Instead, investigate and learn from the underlying behavioral choices. It is through this process of understanding the Three Manageable Behaviors that we will achieve significant and lasting improvement in patient safety.
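The distinction among the three scenarios above can be summarized in a small decision sketch. This is a hypothetical illustration of the classification logic, not an official Just Culture algorithm; the function name and the two yes/no questions are our own simplification:

```python
def classify_behavior(conscious_choice: bool, risk_recognized: bool) -> str:
    """Classify a contributing behavior per the three categories above.

    conscious_choice: did the person deliberately deviate from protocol?
    risk_recognized:  did the person see the deviation as a substantial risk?
    """
    if not conscious_choice:
        return "human error"       # a slip or lapse, e.g. the distracted provider
    if not risk_recognized:
        return "at-risk behavior"  # a choice whose risk was not seen or was underestimated
    return "reckless behavior"     # conscious disregard of a known, substantial risk

# The three hand-hygiene scenarios above:
print(classify_behavior(conscious_choice=False, risk_recognized=False))  # human error
print(classify_behavior(conscious_choice=True,  risk_recognized=False))  # at-risk behavior
print(classify_behavior(conscious_choice=True,  risk_recognized=True))   # reckless behavior
```

The point of reducing it to two questions is that the leadership response follows from the category, not from the outcome: console the error, coach the at-risk choice, sanction the reckless one.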

1 Department of Health and Human Services, “Adverse Events in Skilled Nursing Facilities: National Incidence Among Medicare Beneficiaries,” February 2014, p. 2; available at http://oig.hhs.gov/oei/reports/oei-06-11-00370.asp (emphasis added).


2 Ibid., p. 21.


System Changes and Risky Choices


Most employees will not actively choose to reduce their productivity; most people have enough self-preservation instinct to keep from making such an unwise decision. What can happen, however, and frequently does, is that an employee chooses to make small changes to their system. Usually their intentions are good; they’re trying to speed up the process by cutting out a secondary check or by rushing through a minor part of the process.

But those processes were specifically built over time in order to produce the outcome you desire with a minimal failure rate. Choosing to circumvent the process runs the risk of increasing that failure rate, and why would anyone do that? Perhaps they have created the outcome so consistently for so long without failure that they have forgotten that failure is an option. Or maybe they’re trying to improve their production rate. Or perhaps they just have the confidence in their own skills to continue to avoid failure without following all of the guidelines.

It’s not malicious intent. It’s just that their perception of risk has drifted. They have grown accustomed to the risk; it doesn’t seem as, well, risky as it should. As long as we learn to recognize that drift and get ourselves back on track, we’ll be fine. We can use periodic training to realign our sense of risk. We can change the systems we use to allow for more checks or to add redundancy. Easiest of all, we can keep the lines of communication open, letting workers remind each other of what can happen when we fail.

Once we recognize what’s causing the drop in productivity, we can do something about it. But it can take a little extra effort to look behind the symptom to find the real cause. That’s why we encourage a culture of learning as a vital part of a Just Culture.

When a Human Error Catches Negative Media Attention

What if a human error attracts negative media attention and risks harming your organization’s reputation, or compromises your mission or values? To err is human: humans will make mistakes, and we will even make at-risk choices that lead to errors. Knowing this, should employees be punished when the general public is looking for retribution for a harmful outcome? What role does public opinion have in our management of employees involved in negative outcomes? Keep in mind that even public outrage is an outcome, and outcomes should not be taken into consideration in the evaluation of an event.

Human errors are not good things, yet we must anticipate that they will occur. When your investigation confirms that an event that has garnered public attention is the result of human error, you must look to next steps.

Ideally, we can address negative attention in a Just Culture without compromising the commitments that we have made. Take a deep breath and be willing to stand behind what your organization has decided is the "right" way to practice and manage. Stand up and be open and honest: “Yes, a bad thing happened, but after a thorough investigation we have found that this tragedy is the result of a human error, one that any human could have made. Pursuant to that finding, and in light of the Just Culture principles we have supported since [date], we will continue to support and console the involved employees. We as an organization, however, have taken every step possible to ensure that the circumstances that led up to this mistake have been thoroughly investigated, and we have enhanced our systems and processes to ensure that this tragedy cannot be repeated.”

Proactive steps to explain your organization's implementation of Just Culture to your local community can help you speak openly and honestly about the choices your organization will make when an event occurs. Punishing an employee based on public outcry can damage the trust that a Just Culture works to build between leadership and employees; your organization will have to decide whether that’s a price worth paying.

Total Quality Management: Is There a Trade-Off Involved?



Perhaps the greatest challenge for any organization is the management of competing values and the acceptance of design trades. When I speak of design trades, I am alluding to the fact that in any system design exercise, we often allow the infringement of one value in trade for the better fulfillment of a competing value. For example, in healthcare today we allow patients/residents the freedom to move around their rooms and the hospital while we provide care. This is done in fulfillment of patient satisfaction and other important values. That said, the design trade within this socio-technical system is the increased likelihood of a serious injurious fall. The value of safety is now compromised by the unassisted movement of a patient who has impaired mobility or other risk factors and who will engage in these activities for a multitude of reasons. With this in mind, we employ a number of different system design strategies to mitigate the risk of a serious injurious fall: bedside alarms, fall risk assessments, and floor mats, to name a few. All of this speaks to a particular design law we must heed when designing socio-technical systems. Design law eleven of Outcome Engenuity’s sixteen design laws states, “All systems suffer from design trades – maximizing performance toward one value will ultimately hurt another value, or the mission itself. The closer we get to perfection toward any one value, the higher the costs to other values.”

The importance of this law becomes critical in the application of Total Quality Management philosophies such as Six Sigma and Lean Sigma. As a Six Sigma black belt and a Lean Sigma practitioner, I can appreciate the value that both of these methodologies bring to the table. However, one of the challenges in applying them is ensuring that we have examined the system design through the lens of each value. In many instances we remove the variation within a process while examining that variation, or non-value-added step, through the scope of only one value, thus potentially over-impinging upon another value that we did not consider in our design of experiments or workflow analysis. It is vitally important that as we employ TQM philosophies we keep in mind the design trades and the multitude of values we are attempting to uphold across our operations.

At-Risk Behavior

Just Culture defines three behaviors: human error, at-risk behavior, and reckless behavior. At-risk behavior is the trickiest to work with; it is the most prevalent and it takes many forms. Risky behaviors can be difficult to spot, and the employee may not even realize they’re making a risky choice.

In most cases, the employee either doesn’t see the risk in the choice or doesn’t think the risk is significant. Maybe they’re a little too comfortable in what they’re doing. Maybe they think they’ve “got it under control.” Or maybe they’re just trying to improve their efficiency, skipping or glossing over steps they feel are not important. A risky choice can also be hidden behind a human error, making it more difficult to identify and correct. For example, a baker bakes a cake for the wrong length of time (human error) because she chose not to read over the recipe that she has used for many years (risky choice).

A Just Culture treats at-risk behavior with coaching. When an employee makes a risky choice, we talk to the employee about the risk, making sure they understand the how and the why of the situation. This is not considered punitive; it’s intended as part of the learning process. We need to ensure that the employee understands how the risk can threaten the values of the organization.

Risk is part of doing business; we will never escape it. But we can manage risk; we can be careful and smart about which risks to take. We can also learn from risks taken in order to avoid taking the same risks over and over. And we can encourage our people to let us know when they have taken a risk, whether it worked or not, and explore how it could have worked better.

Human Performance Management Process


People are a major part of our work systems. People are also prone to making mistakes, but that’s not always the whole story. Managing the human element means working with a diversity of variables, but let’s briefly touch on a few helpful strategies for controlling behavioral choices in our work systems.

Education is a fundamental strategy. For training purposes we make a distinction between knowledge and skill. We use the term knowledge when we refer to what people know, and skill relates to the ability to use and apply knowledge to perform a task or accomplish a goal. We can impart knowledge with training in a classroom setting, but skill comes with experience in using the knowledge to improve the system in some way.

In the analysis of a work process, aspects will be noted that can affect the worker’s performance, in good ways and in bad ways. As we identify these, we can take them into account in our system, minimizing any negative effects that we find. For example, if workers are having difficulty with the controls of a certain operation, perhaps the controls can be re-ordered to make the operation flow more smoothly. Decades of experience have gone into the organization of an airliner’s controls with the intent of streamlining the process of flying the plane.

The ability to see and understand the risks in a work process is vital to the strategy of avoiding risk. Teaching your people why each step is on a checklist is an important part in their understanding of why they shouldn’t skip any of the steps. It’s also important that they understand how small problems can lead to bigger problems, especially in processes that can have catastrophic outcomes. Back to the plane example, we need to have the skills and training to spot failures before the plane takes off. If we catch the small risks and failures, it helps to avoid the really bad outcomes.

These strategies are all designed to improve communication with your people and help your team produce better outcomes. What are some examples of ways you help keep your team on track? Let’s discuss it in the comments below.

Qualitative versus Quantitative Risk Models

A socio-technical risk model can be both qualitative and quantitative; the qualitative model illustrates system design, with the quantitative model becoming a guide to actual practice. As a qualitative risk model, the model itself becomes a visual representation of basic events and how they combine to produce an undesired outcome. A qualitative model shows the inter-relation between equipment failures, human errors, and at-risk behaviors that can combine to produce the undesired result. With probability estimates included, the model provides a quantitative estimate of the relative risks, from one branch to another and from one failure combination to another. Quantitative risk models are the higher-value trees; however, they also face criticism associated with the uncertainty of their numbers.

These quantitative models better represent the risk factors we might expect to see in an operational environment, including the contribution of at-risk behaviors. At this point, there are two paths to complete the model – working back from the top-level event, or working up from the bottom-level events. If we know the top-level risk, we might adjust the numbers below to obtain an approximation of what is happening in the actual system.

For example, if we knew that an organization was experiencing five events per year, we would have to drastically change the error rates (beyond the point of reasonableness) or add in rates to the at-risk behaviors to produce a model that predicts what is actually happening. The model might now predict a rate that matches actual event occurrences, but are the underlying estimates reasonable, or accurate? While this model is quantitative, it is far from being considered the objective “truth.” This, however, is not the point. Does the model give us information that will guide further inquiry? Does the quantitative model provide us with more information than the merely qualitative model? The answer is “yes” – it provides a visual representation of risk that can guide decision-making around further inquiries into the sources of risks, along with giving the organization the ability to update probability estimates through specific data collection and event data to understand if norms or behaviors have changed.
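The combination of basic events into a top-level estimate can be sketched numerically. The gate functions below follow standard independent-event probability rules; every rate in the example is invented for illustration, and the event names are our own, not drawn from any real facility's data:

```python
# Minimal quantitative fault-tree sketch with invented probabilities.
# Top event: an adverse outcome reaches the patient. It occurs when a
# hazard arises AND the final check fails (independence assumed throughout).

def or_gate(*probs):
    """P(at least one of several independent basic events occurs)."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def and_gate(*probs):
    """P(all of several independent events occur)."""
    p_all = 1.0
    for p in probs:
        p_all *= p
    return p_all

# Invented per-opportunity probabilities:
human_error     = 0.001    # inadvertent slip
at_risk_choice  = 0.02     # protocol shortcut whose perceived risk has drifted
equipment_fault = 0.0005
check_fails     = 0.1      # final barrier misses the hazard

hazard = or_gate(human_error, at_risk_choice, equipment_fault)
top_event = and_gate(hazard, check_fails)
print(f"P(top event per opportunity) = {top_event:.5f}")  # prints 0.00215

# Working back from an observed top-level rate, we would adjust the
# basic-event estimates (most plausibly the at-risk rate, which dominates
# this tree) until the model reproduces what the system actually experiences.
```

Note how the at-risk choice, at 0.02, dwarfs the other basic events; this mirrors the point above that an observed event rate often cannot be explained by human error rates alone without stretching them past reasonableness.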

The Role of the Imposer


Let’s talk about imposers and their role in your organization. These are the people we allow or elect or appoint to hold us to task, to keep us on track with our rules and our values.

The most obvious example of an imposer, or at least the one everyone is familiar with, is the government. In order to have a functioning society, we have to assign people and resources, including laws, to the maintenance of the society. The people can be represented by police officers, tax collectors, and military personnel, just to name a few.

Imposers are granted the authority to reward or to punish based on the set of rules or guidelines in place. The organization granting the authority places the imposer in a potentially awkward or difficult situation, so it’s important for the organization to fully support and cooperate with its team members. It needs to trust the people it empowers, and it needs to get and use feedback from them as well. The government has laws in place to support the police and tax officials, for example. It’s easy to forget the reward part of this process, too. Much emphasis is placed on the negative, but positive reinforcement is a great way to boost morale and encourage good choices.

In a corporate setting, managers and supervisors take on the imposer role. They are “in the trenches” alongside your employees, so they are the ones regularly tasked with praise and sanction. To make that job easier for them, you’ll want to have your expectations as clearly established as possible for the sake of consistency. Supporting the imposers as well as the employees will give your team the strong foundation it needs to succeed.

How do you empower your imposers? What do your imposers enforce? How do they punish? How do they reward? Continue the discussion in the comments below.

Define Justice

Justice is a word we in the Just Culture Community throw around quite a bit. It’s also a word with some very specific connotations, so let’s look into what it means and why we use it so much.

It’s no wonder people get confused when thinking about justice; the dictionary has quite a variety of definitions listed, most of which depend on the context. We see terms like morally right, factual, and legally accurate in there, and it doesn’t do much to delineate these widely varied ideas. One that comes up quite a lot, though, is fairness. And fairness is a more complicated idea than most people realize.

In a Just Culture, justice is primarily addressed in terms of accountability; we have to hold people accountable for the choices they make. To do this effectively, we need to be able to distinguish what is a choice and what is out of the individual’s control, specifically that which is controlled by the system. Documenting the work process allows us to gauge whether the failure occurs in the process or in the choice. We can then address the process flaw or hold the employee accountable for choosing to vary from the process.

The intent is to strike a balance between the overly punitive and the blame-free cultures. When we punish every slight, people will hide everything they do. When we hold no one accountable, people will lose the desire to do their best. The middle ground is to set a standard and hold people to it consistently.

We need to be consistently and repeatably fair and just, both to our employees and to the company. The ability to do that is key to building a Just Culture.