Don’t Blame People for Human Error

Written by Mark J. Steinhofer, CHST, CSP, CUSP.

The first lineman scaled the pole and tried to perform the task on the conductor. After a minute or so, the supervisor yelled, “You’re doing that wrong!” He told the lineman he was incompetent and sent a second lineman up the pole in his place. The second lineman started the task only to hear, “That’s not how it’s done!” before returning to the ground. A third lineman took a deep breath before he climbed. He looked over the job and started to work. Soon the supervisor bellowed, “What’s wrong with you? That won’t work!”

This scenario illustrates the way the utility construction industry traditionally has dealt with human error: by blaming people instead of flawed processes. The supervisor assumed the linemen were making mistakes instead of reasoning that there must have been a fundamental flaw in the task or their training.

What Is Human Error?
We define human error as undesirable human decisions or behaviors that reduce or may reduce safety and effectiveness. Errors typically fall into one of four categories:

  1. Mistakes result from ignorance of the correct task or the correct way to perform it.
  2. Mismatches occur when tasks are beyond the physical or mental ability of the person asked to perform them.
  3. Noncompliance or violations happen because someone decided not to carry out a task or did not carry it out in the way instructed or expected.
  4. Slips and lapses result from forgetfulness, habit, fatigue or similar causes.

Blaming individuals is the easy way out, and it doesn’t prevent errors. For one thing, sometimes the best people make the worst mistakes. For another, mishaps are anything but random; they tend to fall into recurring patterns.

The System Approach
Instead of focusing on individual shortcomings, the system approach assumes that humans are fallible and errors are to be expected. It categorizes errors as consequences rather than causes, originating not in human behavior, but in the conditions under which individuals work. Since we can’t change the human condition, we need to change the conditions under which humans work by building defenses or trying to mitigate the effects of errors.

The system approach assumes that you could put multiple people in the same position under the same conditions and each of them would fail, because the system is inherently flawed. Utility construction is full of complexities and unclear design aspects, and construction workers are accustomed to compensating for them. But doing so can create errors, because humans can misperceive situations and then take flawed courses of action. By analyzing processes and leaving less up to chance, the system approach reduces the potential for incorrect actions.

The Swiss Cheese Model
A number of years ago, James Reason proposed the Swiss cheese model, which can be applied to your company’s safety program. Think of the elements of the program as slices of Swiss cheese. One slice represents training, another is documentation, a third is engineering controls, another may be personal protective equipment and so forth. Each slice has some random holes you can’t prevent, and some slices have more holes than others. For example, if your safety training is excellent, that slice may have only a few small holes, but if PPE enforcement is lax, the slice may be full of larger holes.

Each cheese slice is part of your defense against errors and incidents, but those holes allow errors to sneak through. When you line up the slices, a hole in one is probably covered up by another. Your crews might not have all the correct PPE, for example, but your safety training is so good that it compensates for that. Still, even with all the slices stacked together, the holes occasionally line up just enough for something to slip through every slice. So, your efforts should focus on identifying and eliminating the holes in your defenses. As each layer, or slice, becomes stronger, the overall potential for incidents declines.
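To make the stacking effect concrete, here is a minimal sketch in Python. It treats each slice as having a hypothetical chance of letting an error through and assumes, purely for illustration, that the layers fail independently; the layer names and figures are invented examples, not measured values.

    # Illustrative Swiss cheese sketch: each defensive layer gets a
    # hypothetical probability that an error passes through its holes.
    # Assuming the layers fail independently, the chance an error slips
    # through every slice is the product of those probabilities.
    layers = {
        "training": 0.05,             # strong training: a few small holes
        "documentation": 0.10,
        "engineering_controls": 0.08,
        "ppe_enforcement": 0.30,      # lax enforcement: larger holes
    }

    def passthrough_probability(hole_probabilities):
        """Probability that an error finds a hole in every layer."""
        p = 1.0
        for prob in hole_probabilities:
            p *= prob
        return p

    print(f"Error defeats every layer: {passthrough_probability(layers.values()):.4%}")

    # Shrinking the largest holes (here, tightening PPE enforcement)
    # strengthens the whole stack, not just that one slice.
    layers["ppe_enforcement"] = 0.10
    print(f"After tightening PPE enforcement: {passthrough_probability(layers.values()):.4%}")

Even with invented numbers, the arithmetic shows why strengthening any single slice lowers the overall potential for incidents, and why no one slice has to be perfect.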

Where do the holes in your defenses come from? There are two sources. One is unsafe acts that people commit, including slips, lapses, fumbles, mistakes and procedural violations. The other is latent conditions, which arise from decisions that have been made by designers, builders, procedure writers and top-level managers. Latent conditions can lead to error-provoking conditions in local workplaces, such as understaffing, unrealistic deadlines, inadequate equipment, inexperienced workers and fatigue. They also can create long-lasting weaknesses, such as unworkable procedures, and design and construction deficiencies.

Managing Human Error
Minimizing human error involves two goals. First, limit the incidence of dangerous errors. Second, create systems that are better able to tolerate the inevitable occurrence of errors and contain their damaging effects. The better your error management system, the better you’ll be able to stop a threat early in the chain of events.

Taking the time to review objectives and responsibilities for each task may seem to slow things down, but in truth it prevents problems and delays. When workers make assumptions about what is going to be done instead of having clear knowledge, mistakes are more likely.

A crucial element of effective error management is establishing a reporting culture in which workers are unafraid to report not only mishaps and incidents, but also near-misses. Without those reports, there’s no good way to uncover and identify recurring errors that may grow into incidents. You cannot achieve an effective reporting culture without having a strong safety culture and management systems that support it.

By combining the system approach with a solid reporting culture, you can expect a significant reduction of incidents, along with an increase in worker satisfaction and productivity, because everyone will clearly know what to do – and what to avoid.

About the Author: As manager of HSE services for Safety Management Group, Mark Steinhofer, CHST, CSP, CUSP, has managed safety advisers and worked with diverse clients in the utility, pharmaceutical, construction, chemical and general industries. He also has consulted as an expert witness in several court cases and currently is an adjunct professor at Indiana University-Purdue University Indianapolis, teaching an undergraduate course in construction safety and OSHA standards.