How does your organization respond to bad news?
When something goes wrong, do you get pushback and denial: "Why change? Our safety record is better than average." Or is there a rush of meetings, anxiety, guilt and a scramble for a quick fix?
How about employee safety suggestions or reports of risks - are they respected or bound for a black hole?
Let’s examine how workplace cultures, and especially executives, try to make sense of safety issues. We draw on the work of Dr. Karl E. Weick, Rensis Likert Professor of Organizational Behavior and Psychology at the University of Michigan.
Dr. Weick's research interests include collective sensemaking under pressure, medical errors, and high-reliability performance. He's authored two books that might interest you - Making Sense of the Organization (Blackwell Publishing, Ltd. 2001) and Managing the Unexpected (with Kathleen M. Sutcliffe, John Wiley & Sons, Inc. 2001).
Taking in inputs

You make your living in safety presenting all sorts of information to your organization - injury stats, comp rates, compliance audits, accident reports, perception surveys, climate surveys, and on and on. So it "makes sense" (pun intended) to probe how organizations respond to these inputs and make decisions. It's a process called "sensemaking."
For example, here's how three organizations chose to interpret safety data:
- The National Tank Truck Carriers studied OSHA recordkeeping data covering 70 facilities in 2002 and concluded that about 90 percent of tank cleaning workplace injuries are the result of employee behavior. The message for managers: develop safety programs that focus on employee behavior, said an industry official.
- Lobbying the California legislature for "rider responsibility" laws in 1999, the Walt Disney Company blamed rider misconduct for eight out of ten amusement ride injuries.
- Duane Amato, former VP for environmental, health and safety compliance for Baxter International, Inc., said in a company web posting that "85 to 90 percent of our accidents occur due to unsafe behavior on the job." Baxter decided to eliminate unsafe behavior through root cause analysis - identifying the specific organizational deficiencies that might have contributed to at-risk behaviors. The company then rolled out the occupational health and safety assessment series OHSAS 18001 to gauge how well managers and employees fixed those deficiencies.
High stakes

How your organization "makes sense" of - or rationalizes - safety reports is critical to your success as a safety pro.
How's that?
It's a chain reaction. How your execs choose to act on (or ignore) safety data defines your organization's beliefs about safety. Those beliefs become cultural norms (the way we "do" safety and think about safety around here). And they shape the employee attitudes and actions you deal with daily.
What gets attention?

Here's one way to check how your organization makes sense of safety issues:
When your leadership looks at accident reports (assuming they do), what do they focus on? We'll simplify their options to two choices: the "sharp end" or the "blunt end" of the chain of errors that led to the incident.
"Sharp end" and "blunt end" are terms used by medical professionals assessing errors or adverse events in patient care.
Errors at the sharp end, also called active errors, are noticed first. They are usually obvious (pushing the wrong button, ignoring a warning sign) and almost always involve someone at the frontline. Studying accident reports in the 1930s, insurance investigator H.W. Heinrich focused on sharp-end errors, concluding that 88 percent of all workplace accidents and injuries resulted from the unsafe acts of workers.
Blunt-end errors, also termed latent conditions, are trickier to make sense of. They are triggered by system breakdowns in your organization - not frontline behaviors. These "systems" often defy easy definition. Systems can be formal and documented, or simply the way things get done. One way or another, you have safety-related systems for allocating resources, assessing and prioritizing risks, training employees, and reporting and responding to workplace hazards.
Complicating blunt-end assessments: other organizational systems also influence safety performance - staffing levels, performance objectives, incentive and discipline policies, how employees are screened and selected, and how shift and production schedules are decided.
Paul O'Neill, former Treasury Secretary and former chairman and CEO of Alcoa, Inc., is a big-picture systems guy. Testifying before Congress in 2001 on the startling number of people dying each year due to medical mistakes (45,000 to 100,000), O'Neill processed those numbers and concluded:
"The level of mistake-making… is not because doctors and nurses are being sloppy or careless, but because they are not working within systems designed for quality care and patient safety."
The road less traveled

The path of least resistance when organizations try to make sense of safety problems is to go after the obvious: those frontline errors. Workers need more motivation. Or discipline. They need to work on their "self-talk"; they need to "self-trigger" a sense of awareness.
This reaction is only human nature doing its thing. Incentive contests or training to make employees more mindful take less time and money, less organizational will, and less leadership than confronting the complexity of systems and beliefs.
Weick writes: "People are cognitively lazy." When they find an answer to a question (about a safety issue, for instance), people stop searching, he claims. No alternatives are evaluated. The trouble, of course, is that people might not know the half of what's causing the problem.
Responding to safety issues by assessing blunt-end errors - failed systems and faulty beliefs - requires authentic leadership. It's hard, slow work subject to relapses, says Weick. He says it requires "mindful management." What's that? "Create awareness of vulnerability. Get comfortable asking people, 'What's risky around here?'" he explains.
It's the road less traveled in safety, and the one Baxter International took. Baxter set performance objectives for every manager and supervisor. Managers were required to commit to safety as a personal responsibility. The strategy: use systems (such as OHSAS 18001) to correct organizational deficiencies. Of course, systems are only as good as their execution.
When Marv Broman, director of corporate safety for Valmont Industries, wrote a "call to action" memo to his company's safety leaders earlier this year, he chose the same path. His message: Unless a site has sound, measurable safety systems (lockout/tagout, confined spaces, PPE, auditing, accountability, training, etc.) operating at high levels, there's no way to know whether a site's injury record is a matter of good performance or good fortune.
Sensemaking: A case study

We'll close with an example of organizational sensemaking that led to a very public change in beliefs for one of safety's largest consulting firms.
In the late 1990s, Behavioral Science Technology, Inc., studied results of 73 client projects aimed at producing safety improvements. Writing in his forthcoming book, Leading with Safety (to be published this fall by John Wiley & Sons), BST Chairman Dr. Tom Krause reports, "What really caught our attention was something we had not even intended to study. We were surprised by the huge variability" in results. Some clients got results quickly, others took longer to improve, some never improved, and a few got worse.
"We became fascinated with understanding the factors that accounted" for this variation, writes Krause.
BST proceeded to study the best- and worst-performing organizations, and concluded that the quality of leadership - and the organizational culture it produced - was the most important factor in predicting the success of safety improvement initiatives.
Now BST talks less about behaviors and more about systems. Krause writes that focusing on the belief that 80 to 90 percent of incidents result from behavioral causes (a statement he concedes BST made in a 1990 book) leads to blame. "And blaming is always a mistake." The most useful way to make sense of safety mishaps, according to Krause, is not to ask, "Who was at fault?" but rather, "How can this injury and others like it be prevented in the future?"
What questions does your organization ask after an accident, or when confronted with unexpected safety findings?