Human Error and Just Culture

Sidney Dekker’s Just Culture made me thankful I don’t work in an occupation with a high risk of impacting public safety (those described in the book include aviation, healthcare, and policing). In our society we believe that practitioners should be accountable for their actions, that without legal consequences after a tragedy there would be no justice. The dilemma is that tragic outcomes are more likely to be the result of systemic issues than of bad actors, and the legal system is fundamentally unsuited to dealing with issues of systemic safety. Worse, the risk of legal consequences stifles learning, and so our search for justice makes tragic outcomes more likely, rather than less.

Reading Just Culture after Charles Perrow’s Normal Accidents was a serendipitous pairing. Normal Accidents illustrates very convincingly that safety is an issue that largely transcends our traditional idea of human error. It makes the case that some accidents are normal and expected given the properties of the system, and that the easy finger-pointing at the practitioners misses the real story. As we should already know from Deming and manufacturing, quality is a property of the system, not of the people in the system.

Picking up from there, Just Culture shows how the concept of an accident doesn’t exist in law. There is always someone who was negligent, whether willfully or not, and that someone shall be held responsible. The law isn’t interested in the learning of the system. It isn’t interested in the truth as most of us would understand it. It is about blame and punishment.

How does your organization respond to a system outage? Are blame and finger-pointing the order of the day? We may not be subject to the criminalization of error described in Just Culture, but the organizational reflex can all too easily be to blame the developers, the testers, the system administrators, or others, when the focus should be on organizational learning, on fixing the system.

The idea of Blameless PostMortems is not new to TIM Group. We’ve done our best to use our RCAs (root cause analyses) as a tool for improving the system for several years now. Just Culture served as a reminder that we are fighting a cultural bias, and that we need vigilance to keep outdated ideas of human error from creeping back into our organization. The pressure toward blame is both pervasive and subtle. It would be easy to detect and fight if it were a case of managers asking “who screwed up?” It is harder when it looks like a virtue, when it is an engineer who is quick to assume responsibility for a mistake. A willingness to be self-critical is a valuable trait in each individual. The challenge is being able to look beyond the individual to the contribution of the larger system.

This is the balance we are trying to strike: between individuals who feel safe enough to acknowledge their own contribution to the problem, and a system that doesn’t accept “human error” as a reason to avoid learning. We believe this is the path to a high-performing, and just, culture.