


Basic Concepts of a Just Culture


Originally published in the Winter 2009 Federation Forum Magazine.

Just culture is a concept for managing human fallibility through the system designs and behavioral choices we make within our organizations.

Just culture has been successful at a number of different organizations. One airline cut its maintenance errors by 50%, while another airline reduced its ground damage by 50%. A hospital's hand hygiene compliance rate went from 65% to 95%, a gain attributed directly to the work done around just culture. One of the results of just culture implementation at the Medical Malpractice Insurance Company in Minnesota was that a member of the staff realized he or she could admit to a mistake and the entire staff could benefit from it.

In the medical industry today, prevention consists of punishing people for making mistakes. Therefore, when someone does make a mistake, we don’t have the ability to learn from it. The ability to learn from our mistakes is our starting point, and it is one that varies from hospital to hospital and from healthcare system to healthcare system.

The Federal Aviation Regulations – the rules that guide all pilots and mechanics – dictate that no person may operate an aircraft in a careless or reckless manner so as to endanger the life or property of another. We have a sense of what reckless means, but what is careless? The National Transportation Safety Board (NTSB), which adjudicates aviation enforcement cases, defines it as the most basic form of simple human error or omission. It is the simple human mistake. If we say that you cannot make a mistake, and that we are going to hold you accountable for your mistakes and punish you, will you raise your hand when you make a mistake?

Healthcare is following the same track as aviation in the sense that we too often hold people accountable by punishing them for human error. That is not going to advance a culture of learning. As regulatory bodies, we have to be careful about what we want to regulate and how we are determining accountability. That doesn't mean we can't hold people accountable for human error. Just culture is about finding the most effective way to hold someone accountable. Punishing someone for a mistake may not be the most effective way to help them learn from that mistake. Reviewing everything that occurred to cause the error may do more to further the reliability and advancement of the operation.

Noncompliance with hand hygiene is pretty important. But what if we fired or punished every person for noncompliance? Human resources would be busy and there would certainly be a shortage of nurses and doctors! It's not necessarily the wrong way to go, but it is at one end of the spectrum - blame the people involved. Find the person who made the error, punish them, and you've solved the problem. Others may say that the problem is seldom the individual but rather the fault of the system: change the people without changing the system and the problems will continue. However, sometimes people do make bad choices. Just culture attempts to find the most effective way to hold both the people and the system accountable.

Sometimes our system puts the employee, the staff member or whoever it might be between a rock and a hard place. It’s going to turn out badly either way. Most pick the lesser of two evils. People do not come to work wanting to do a bad job or wanting to have a bad event happen. They come to work wanting to do things right but they drift.

As managers, executives and regulators, there are things we can control. We have to decide whether we need to make changes to have a greater impact on our human errors and our adverse events. We must balance our input with our output, and decide how we can be proactive. Near-miss events are a precursor to bad events. It is very important to look at those near misses and examine the system design and behavioral choices that occurred. Risk exists. To err is human. To drift is human. We will all make mistakes.

We are taught to drive with our hands at 10 and 2. We drift to 9 and 3. I am an 8 and coffee kind of guy. My wife is 8 and makeup. Others are 8 and cell phone. That’s drifting. I have as much control of my car at 8 and coffee as I do at 10 and 2. But, in a blizzard, I am not at 8 and coffee. I’m at 10 and 2 with the radio shut off and no distractions, because I perceive the risks. My drifting has stopped.

Risk is everywhere. It can be a perception, it can be an absolute and it is not essentially bad. Physical therapy involves risk. Is the risk worth it? I think probably so. Surgery is risky. Is it worth it? Absolutely. There are many things that we do in healthcare that are extremely risky but they are worth it when we manage and support our values. We must also examine the severity of risk versus likelihood of a good outcome. It’s a gamble that we face many times in healthcare. How we regulate and manage that risk is another tough question. Safety is just one of our values. Integrity, collaboration and innovation are all values that must be supported and in some way put on an equal plane.

As a CEO of a healthcare system, your resources are not infinite, so you have to make decisions. Is a patient’s only concern safety? No, there are other values, such as privacy and comfort. There are a lot of competing values, and we are accountable across all departments, across all positions and across all behaviors. We are all accountable for human error, at-risk behavior and recklessness. That’s what makes the just culture effective. Everyone sees that we are all working to support these values, from the CEO on down.


Managing system design

One of the things that we can control is system design. We can manage system reliability by looking at the factors that influence our rate of reliability or rate of error. Human factors design is one of them. For instance, the use of red and green lights on a car dashboard is the result of an old military study on humans. Those colors are easier for night vision, so they reduce the human risk factor.

We can also institute barriers that won't even allow the error to occur. In Minneapolis, five registered nurses delivered an adult dose of a medication to premature infants. Three of the infants died. Why did they make the mistake? What was the at-risk behavior? The medication was in the same spot it had always been in, and it had the same orange labels it had always had. But the nurses fell into at-risk behavior and didn't read the label: if it's in this spot and has an orange label, it must be what it has always been. What they didn't know was that the manufacturer had changed the color of the adult-dose and infant-dose prepackaging, and the packages now looked the same. The pharmacy tech mixed them up and stocked the wrong one, and the hospital had its bad event. The hospital responded by no longer using a prepackaged dose; doses are now calculated, and two nurses must verify that calculation before the medication is administered.

We must look at the factors that influence our system reliability and work to manage them. Managing human reliability involves examining the rate of human error within the operation, skill and knowledge, perception of risk, and qualifications. You also must understand the person's strengths and weaknesses. Are they book smart, or can they also apply what they have learned?


Managing behavior

There are three classes of behavior: human error, at-risk behavior and reckless behavior.

  • Human error is a mistake that was not intended.
  • At-risk behavior is when a person makes a choice without recognizing the risk or without correctly assessing it.
  • Reckless behavior is a conscious disregard of a substantial and unjustifiable risk.

We can address these behaviors by keeping employees informed, coaching them on at-risk behaviors, and holding people accountable, including disciplinary action, for reckless behavior.


Event investigation

An event investigation is simply a tool that helps us learn about what happened. Normally, that includes a procedure to determine why it happened: what led to the event, what the causes were, and what system design was in play at the time.

As you ask these questions, your investigation gains value. You can determine the cause-and-effect relationship, explain every human error, and explain the at-risk behavior and the procedural deviation. Learn from the event. Form your own risk model. Don't just be a firefighter who runs around all day putting out fires without learning why the fires are occurring.

Just culture is a journey. The point is not to go from A to Z; it is a journey on which, for every three steps forward, we might take one step back. You measure your outcomes, you measure your system, you measure your choices, you measure your bad outcomes, and you try to make improvements. That continuous process of measuring is the core tool in just culture. Just culture is about executive commitment to values, system design and behavioral choices. It's a partnership with the regulator, and it's about doing the right thing.

John Westphal has worked with NASA, multiple hospitals and healthcare systems, and the airline industry, developing prevention strategy at a systematic level. John's recent work at NASA includes human engineering design requirements and risk modeling for the NASA Constellation program. In healthcare specifically, his work has centered on cultural implementation for healthcare systems and risk assessment.