Overcoming Safety’s Blind Spot

To expand our collective intelligence and better protect the workforce, we must treat all employee concerns as predictions of unwanted outcomes.

“If you’re not prepared to be wrong, you’ll never come up with anything original.” – Sir Ken Robinson

Innately curious and hardwired to seek order, humans often latch onto the latest ideas and inventions that help us satisfy our need to understand the world around us. Don’t believe me? Consider geocentrism, the now-obsolete belief that the Earth is the center of the universe.

In earlier millennia, we followed herds of animals as they migrated, all of us sleeping under the stars. Our ancestors knew the night sky better than many of us do today. As they watched the heavens appear to revolve around them, they assumed that the universe must have been created in service to them – right up until the works of Copernicus, Galileo, Kepler and Newton confirmed heliocentrism (i.e., that the sun, not the Earth, is the center of our solar system).

Naturally, our ancestors were slow to accept this new information because changing deep-seated beliefs is no easy feat. They experienced discomfort and outright fear, preferring the warm and fuzzy feeling derived from trusting that the universe was solely focused on their needs.

Much like our ancestors, humans today often feel certain that we know all there is to know. When we dip a teaspoon into the proverbial ocean of available knowledge and find no whales, it is easy to assume whales do not exist. There weren’t any in our saltwater sample, right?

I noted earlier that it is not uncommon for us to acquire a small amount of information and run with it. Sometimes we will even protect and defend that information when someone challenges us or proves us wrong. Why do we respond in that manner? Well, our beliefs feel like a cozy security blanket, and our discomfort is provoked when someone snatches that blanket away. Though modern human society has undoubtedly evolved, the reality is that we are not vastly different from our ancestors. One key difference is that in 2026, there is no valid reason why we shouldn’t trade in our teaspoons for much larger buckets of knowledge.

Predictive Processing Errors
We should not be surprised that safety has evolved in much the same way, given that it is a product of human systems and behaviors. Still, recognizing that can be difficult without the proper perspective. Let’s start by acknowledging that so long as we are breathing, there exists an infinite number of circumstances, decisions and other possibilities that could impact us. These possibilities intersect – again, in infinite ways – sometimes combining to produce worksite incidents. Yet too many industry professionals believe that we can identify and prevent the limitless number of potential event scenarios, using incident analyses to determine and address root causes.

Incidents stem from predictive processing errors – just a fancy term for certain mistakes made by the human brain. Here is the critical part that readers must understand: We cannot and will not predict an incident that we have never previously experienced or imagined, or that we do not believe could occur. Recall the teaspoon-and-whales fallacy referenced earlier.

Time constraints and other job pressures are not uncommon in our industry. When we feel as though we must make a snap decision, the brain looks for cues and patterns based on our previous experiences and education. It will not make any decisions based on information it does not possess. That is why greater collective intelligence and safer jobsites are much more likely to emerge when we share information with each other, preferably in healthy group settings that support quality interactions.

But here’s the catch. First, the human brain wants to conserve as much of the body’s energy as possible. Second, many of us are painfully aware that some individuals and working environments require more of our energy than others do. With that said, we can try to avoid interacting with coworkers who deplete us, but we also must realize that nothing results from a conversation that never takes place. No one will learn anything new that could enhance safety for all.

Optimism Bias and Experiential Blindness
Optimism bias – a person’s inclination to underestimate the likelihood that something could go wrong – is a predictive processing error often made when the brain is in a state of experiential blindness. Our ability to accurately predict future events is severely hampered if we have never experienced those scenarios or imagined they could happen. Discovering our mistakes can be troubling as well, often triggering an emotional response that inhibits the brain’s frontal lobe – the part that governs our critical thinking and executive functions.

Keep the previous paragraph in mind as we work through this next part. Let’s say someone raises a concern during a job briefing that is almost immediately dismissed by the rest of the crew. A concern shot down that quickly will likely never elicit any feedback or suggested actions from the group. But what if we reframe the way we think about concerns, treating them as predictions of unwanted outcomes? The more concerns that we raise, the more possibilities we can imagine, which increases our likelihood of identifying mitigation strategies that protect employees, customers and the public.

Safety is a product of our interactions on the job. Humans have invented the social hierarchy that exists in many organizations, but here is the reality: Every single worker is a piece of our puzzle. Remove one of them and outcomes change. The same is true in non-utility environments. For instance, a patient’s care could be undermined if the sheets on their hospital bed are soiled or their meals haven’t been carefully prepared. The next time you decide to bake your favorite loaf of bread, omit a couple of ingredients and let us know if it looks and tastes the same way you remember.

Tracking Events That Don’t Happen
Stop me if you’ve heard this one before: “If we can’t measure it, we can’t manage it.” That may be true for many things, yet it is not wholly accurate for safety. Yes, we collect and analyze data after incidents have occurred. But in an organization that fosters high-level, quality interactions among workers – interactions that enable us to carefully and intentionally move forward together through uncertainty – how do we measure and manage those incidents that never happened because of our safety efforts? The simple answer: We don’t. Instead, we focus on learning what we can from every outcome – wanted, unwanted, expected and unexpected.

Upstream Signals
There is one last thing I want to mention here. Since we do not typically notice what we are not looking for, we can miss weak upstream signals that point toward unwanted and unexpected outcomes. That means we must make a concerted effort to improve our observational skills, with success greatly dependent on team dynamics. Is there synchrony among our team? If not, who or what is out of alignment?

Our next safety meeting could provide some clues. While there, observe who is sitting in the back of the room. Have those individuals mostly assumed the same body language and posture, such as crossing their arms? Isn’t that interesting when we consider the tidy sum many of us are willing to pay for good seats to an event we are eager to attend? In fact, I do not believe our workers who sit in the back row, arms crossed, are consciously or voluntarily making that choice. I believe their behavior is an indicator of an unsettling, undefined “something” occurring upstream in the organizational system.

Conclusion
Safety should be about playing to win – not merely playing not to lose. To continue the utility industry’s safety evolution, it is imperative that we foster work environments in which concerns are treated as predictions of unwanted outcomes and our interactions with one another are considered opportunities to expand our collective intelligence.

About the Author: William N. Martin, CUSP, NRP, RN, DIMM, is president and CEO of Think Tank Project LLC (www.thinkprojectllc.com) and SAFR LLC. A third-generation electric utility worker and medical professional with extensive experience in high-risk operations and emergency medicine, he served nearly 20 years in lineman, line supervisor and safety director roles. Additionally, Martin spent 23 years as a critical care flight paramedic and registered nurse with cardiology and orthopedic experience. He earned a Diploma in Mountain Medicine and was an instructor/trainer for the National Ski Patrol. Currently, Martin writes and speaks nationally about safety and human performance, with a special focus on unleashing human potential.