Posted February 2, 2013 on Phil LaDuke’s blog:
“He’s as blind as he can be, just sees what he wants to see”—John Lennon, Nowhere Man
Hazards come in many shapes and sizes, from the physical to the behavioral and all points in between. The efficacy with which hazards are identified shapes, to a large extent, the overall effectiveness of your safety management system. So what happens when your personal or organizational biases prevent you from seeing things accurately and honestly?
In broad strokes, you tend to find the things you are looking for and precious little else. If, for example, your organization gathers most of its information about hazards by watching workers perform their jobs, it is likely to find a host of unsafe behaviors at the expense of other hazards that are equally (or potentially more) dangerous.
Think you are immune to letting your prejudices get in the way of your observations and decision-making? Experts would disagree.
“When You Sell Hammers, All The World Is A Nail” —Source unknown
Bias 1: Most Injuries Are Caused by Unsafe Behavior.
Entire methodologies have grown up around the belief that you can reduce injuries by reducing unsafe behaviors. Irrespective of your personal opinions about behavior-based safety (BBS), when you believe that worker behavior is overwhelmingly the most frequent causative factor, what sense is there in looking at things like poorly maintained machinery, facility issues, or ineffectual training?
Furthermore, many injuries that ARE the result of unsafe behaviors are, in fact, the product of basic human error and may not be preceded by overtly observable unsafe acts. So the bias toward behavior, even when behavior is INDEED a risk factor, may blind you to other threats.
Bias 2: Severity Bias. Author David Marx identifies several biases that he believes can directly undermine worker safety (and public safety). In his book Whack-a-Mole: The Price We Pay for Expecting Perfection, Marx introduces the concept of severity bias. According to Marx, severity bias is the practice of imposing greater consequences for those events that produce a more severe outcome.
Marx argues that the outcome of at-risk behavior is immaterial; the true risk lies in the flawed decision making and recklessness. In other words, it doesn’t matter that an employee’s actions have never killed or injured anyone; the fact that the behavior’s potential for harm is so out of proportion with its rewards is enough to judge it inappropriate. If we buy into this bias, we tend to excuse inappropriate risk taking, and even recklessness, provided that the behaviors don’t result in an incident.
Bias 3: Professional Bias. Marx also identifies a tendency to treat behaviors more harshly the closer one gets to the front line of operations. Research has shown that people tend to let higher-ranking professionals off the hook not out of fear of retaliation, but simply because the higher the rank of a professional, the more likely people are to assume that the executive knows what he or she is doing and is therefore less deserving of coaching or discipline. When you exhibit professional bias you create a multi-tiered system of accountability. Simply stated, you have a double (or triple) standard.
Bias 4: Some Hazards Are Just Common Sense. Another great thinker on the topic of bias as it pertains to safety is Dr. Robert Long. Long explores the relationship between risk and human judgment in his book Risk Makes Sense.
Long contends that there is no such thing as common sense. According to Long, intelligent people make sense of a situation based on their personal experiences and on the things they have been taught by their parents, teachers, and peers.
To expect that a worker will intuitively assess the risk of a hazard the way others in the population would is unreasonable. Yet we often take it for granted that people will understand the intrinsic dangers of a situation, and we fail to manage the hazard because mentioning it would seem too trivial, condescending, or even insulting.
Bias 5: The All’s Well Expectation. In the fantastic book Why We Make Mistakes: How We Look Without Seeing, Forget Things in Seconds, and Are All Pretty Sure We Are Way Above Average, Joseph Hallinan takes a critical look at the factors that cause us to…well, screw up.
Hallinan notes that people tend to see the world through rose-colored glasses (particularly when they are examining themselves). This tendency to see things that aren’t there can cause us to miss hazards rooted in the absence of an element (a missing machine guard, for example). Remember the “what’s wrong with this picture?” puzzles? The same phenomenon is at play in our assessments of the safety of the work environment.
Assumptions
Sometimes it isn’t a bias, per se, that gets us into trouble. Sometimes we miss hazards because we make assumptions. One of the deadliest assumptions is that something is true when it is not. Dangerous assumptions pervade our assessments of the work, like the assumption that one worker does the job exactly the same way as another. Another such assumption is that the work is done the same way across shifts. Because we make these assumptions, our hazard assessment is intrinsically flawed.
What’s The Answer?
Putting aside our biases isn’t easy—for one, just because we have a predisposition toward a certain belief doesn’t mean we are always wrong—but being mindful of our prejudices is a great place to start.
If we can find ways to look at the workplace differently (for example, by listing all the individual actions a job involves, like walking, carrying, etc.), we have a better chance of getting a good view of our workplace.
Another useful method of overcoming our biases is to invite someone who knows little or nothing about the process to help assess the risk. A fresh set of eyes is likely to yield surprising results.
A similar, yet no less effective, method of hazard analysis is to “swap” areas with another inspector. Like the person with no experience with the process, the other inspector is likely to find hazards that you have walked by a dozen times without noticing.