So given my vocal advocacy of prevention, you might be surprised to learn that I believe prevention has, in many cases, gone overboard, and that many companies would be better served by doing LESS prevention and more contingency planning.
Heresy?
Consider the organization that spends tens of thousands of dollars each year preventing accidents that have little or no realistic chance of ever happening. These companies have 20-person safety committees that meet once a week to argue about why an over-burdened maintenance department hasn’t fixed a low-priority hazardous condition.
Prevention costs money and resources that may well be better spent elsewhere in the organization, and not necessarily on safety. Equally damning: organizations that continue funding convoluted safety bureaucracies that unnecessarily add heads, complexity, and cost in the name of preventing injuries.
I’m not suggesting that we return to reactive safety practices; far from it. What I am saying is that there is a time and a place for prevention, but it is not a panacea. Simply put, you can’t prevent every accident, and in some cases you should be looking for ways to protect workers when your best efforts to prevent an accident fail INSTEAD of wasting time on prevention.
Variation in Human Behavior
As organizations, we’d all like to think that we hire smart, capable people, and for the most part we do. We spend days (and thousands of dollars) screening candidates: we ask them probing questions to find out how they reason, how they solve problems, and how they think. We do background checks and ask professional references whether the candidate is worth offering a position. We screen the candidate for illicit drug use, criminal misdeeds, and anything else that indicates whether or not the candidate has sound judgment. In the end we confidently hire the candidate and invest time and money training the new hire so that he or she can meaningfully contribute.
And then it happens.
The person that we spent so much time screening and training gets hurt and we think to ourselves, “if only that idiot would have…” Huh? Now, because the employee got hurt, he or she is suddenly an idiot? You may read this and think that you are immune to such thoughts, but the majority of the people I hear describing injured workers as idiots are safety professionals.
They Call Them Accidents For A Reason
As much as we would like to assign accountability for injuries, the fact remains that in almost all cases whatever happened to injure the person was unintentional, or at the very least, the person who committed the unsafe act didn’t fully comprehend the potential consequences of his or her actions. The accident was an unintended outcome; in short, the injury was an accident.
Accepting that things will go wrong, that people make mistakes, is a bitter pill to swallow.
We are taught to believe that making mistakes is bad, subject to punishment, and indicative of poor judgment or out-and-out stupidity. But everyone makes mistakes; we learn by trial and error, and without mistakes there can be no learning, at least not organic learning that lasts.
Everyone Makes Mistakes, But No One Should Have To Die Because Of A Mistake
I’ve read (I can’t remember where) that the average person makes 5 mistakes an hour. Multiply that by the 2,080 hours in the average work year and you have a boatload of mistakes: 5 × 2,080 = 10,400 mistakes per worker, per year.
Some theorize that because, biologically speaking, change is risky and dangerous (nature tends to take an “if it ain’t broke, don’t fix it” approach to survival), a species that is thriving resists change. In fact, change is so dangerous that our bodies are hardwired to resist it: when we are confronted with change, it triggers our fight-or-flight response and causes us stress.
Conversely, species that are unable to change are unable to adapt to changes in their environments and are driven to extinction.
So it would appear that we are damned if we do and damned if we don’t. But if the finding that the human brain makes 5 mistakes an hour is correct, what possible advantage could there be in these mistakes? Making tiny, subconscious, non-cognitive mistakes could be our brain’s way of testing the environment by disrupting our routines in small ways. If a mistake leads us to a better way of living we make serendipitous discoveries and innovations, but if it leads to an undesirable outcome we see it as an error. In both cases, though, our brains learn about the safety of deviating from routine, and we become better able to adapt safely.
Variation Leads To Errors
Experts in quality, particularly in manufacturing, cannot emphasize the danger of process variation strongly enough; when the process varies, things go sour very quickly. Manufacturing and process engineers have made huge strides in reducing mechanical variation, but the variation endemic to human behavior is so pervasive that it’s all but impossible to eliminate, or even substantially reduce. Outside of the military (and quasi-military: police, security, etc.) it is very difficult to control human behavior. Even variation in cognitive behavior is difficult to control; how many companies have problems with poor attendance? Certainly at least some of the causes of absenteeism are cognitive decisions, where the offending employee simply chose not to come to work.
Focus On Contingency, Not Prevention
Okay, relax. I know that I preach prevention above all things, but when it comes to variation in human behavior, you just can’t prevent most of it. If we could, there would be no crime, no traffic accidents, and no medical malpractice. And to make things even more complicated, human behavior can be very tricky to predict, and even more difficult to prevent.
We have to stop pretending that all our problems can be solved through preventive measures; sometimes, despite our best efforts, things go sideways, and when they do we ought to have some contingency in place to keep a mishap from becoming a disaster or a tragedy.
When it comes to contingency versus prevention, it doesn’t have to be an either-or decision. I used to teach problem solving, and we used a very simple tool for determining whether to use a preventive countermeasure or a contingency countermeasure. We would rate both the probability and the severity of an error as high, medium, or low. If the probability of a particular failure mode (engineering-speak for a screw-up) is high, in other words it is almost certain to happen under the given circumstances, then one should definitely find a preventive action.
If the probability is low (fairly remote, but possible), one would need to temper the response after considering the time and money a countermeasure would require to implement.
Similarly, if the failure mode’s severity is high (if it DID happen the consequences would be severe), then one would have a contingency in place to protect workers, property, and inventory. Of course, if the severity is expected to be low, one would again determine whether the protection offered would be worth the cost of the required resources.
Because one rates the severity separately from the probability, one ends up with two scores that must be considered together. Certainly if the probability is high AND the severity is high, one would implement both preventive and contingency controls. On the other end of the spectrum, if both the probability and the severity are low, one would likely take action only if the countermeasures were cheap and easy to implement.
But scores that are in between (medium probability and low severity, etc.) are subject to a lot more judgment-based decision making. This may seem like a serious weakness to some, but on the contrary, this subjectivity allows an organization to customize its countermeasures to its unique environment and situation.
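For readers who like to see the logic spelled out, here is a minimal sketch of that probability/severity screen, written as a small Python function. The level names, the recommend function, and the exact rules are my own illustrative assumptions about one way to encode the tool described above, not a standard method:

    # Illustrative encoding of the probability/severity screen described above.
    LEVELS = {"low": 0, "medium": 1, "high": 2}

    def recommend(probability, severity):
        """Suggest countermeasure types for a single failure mode."""
        p, s = LEVELS[probability], LEVELS[severity]
        actions = []
        if p == 2:             # almost certain to happen: find a preventive action
            actions.append("preventive countermeasure")
        if s == 2:             # consequences would be severe: have a contingency ready
            actions.append("contingency countermeasure")
        if actions:
            return actions
        if p == 0 and s == 0:  # remote and minor: act only if cheap and easy
            return ["act only if countermeasures are cheap and easy"]
        return ["judgment call: weigh cost against benefit"]  # in-between scores

    print(recommend("high", "high"))   # ['preventive countermeasure', 'contingency countermeasure']
    print(recommend("medium", "low"))  # ['judgment call: weigh cost against benefit']

Note that the in-between scores deliberately fall through to a judgment call rather than a hard rule; that mirrors the point above that the subjectivity is a feature, not a flaw.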
It would be great if we could accurately predict and prevent injuries, but the reality is we can’t. We have to be pragmatic and take important steps to ensure that when someone does have an accident, protections are in place to keep the injury from becoming life-altering or fatal.
Originally posted at http://philladuke.wordpress.com/2013/01/