“To err is human.”
We all make mistakes. Each error in “judgment under uncertainty” (risk decision-making) is a potential lesson if we’re open to learning. In safety and risk, we learn better from what goes wrong than from what goes well. (Hopefully it’s not a critical error that causes harm – safe mistakes are the preferred lessons.)
Making mistakes and learning from them is part of our experiential risk system. So, what can we do with these lessons and data? We should encode them, both as individuals and in our groups. That means learning from them, contextualizing them, being able to recall and apply them, and improving across the enterprise.
What are near misses?
Close calls and near misses are events where something went awry or not as planned yet didn’t do major damage or harm. They are a warning shot across the bow: a “Look out!” before bad things come your way.
The term “accident” is often thought of as something bad happening. But in safety and risk, we have moved away from using “accident” despite the general public’s familiarity with it. “Accident” conjures up visions of an unpredictable event with a particularly negative outcome. This is not usually the case. Instead, we often use “incident” to denote an event that didn’t go as planned. It may or may not have had a negative outcome, but it didn’t go well.
Addressing myths around learning from close calls
Management guru W. Edwards Deming and many others have argued that the overarching cause of incidents is more likely management systems and decisions, not the behaviors of individual workers.1 So, while we may observe fewer serious injuries and fatalities than instances of minor harm, we can’t rely on the quoted numbers alone. We should also assess our systems and decisions at the highest levels for causative factors, using the available data. However, some believe that documenting near misses raises what are known as threat to value concerns.
Threat to value concerns
People typically express or react to the things that bother them most. These are often termed threat to value concerns as they apply to risk decision-making.2 Here are three common concerns about using close calls:
Organizing them is a pain and not worth it
Some believe that near misses are too challenging to maintain and sort. Most problems are multi-factorial, and so are near misses; many of their aspects can be catalogued as sorting criteria. Here’s an example:
Someone gets a splatter of some droplets of a chemical on their face shield. They escape unharmed due to their personal protective equipment (PPE). Some potential factors include PPE for eyes and face, experiment, process at that time, proximity to the chemical, and fume hood sash location. Each factor can be a field in a database or spreadsheet. The factors can then be noted, rated, and ranked.
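To make that concrete, here is a minimal sketch in Python of how such a record could be captured and its factors ranked; the field names, the 1-to-5 rating scale, and the ratings themselves are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class NearMiss:
    """One near miss record; each contributing factor is a sortable field."""
    description: str
    # Illustrative scale (assumption): 1 = minor contributor, 5 = major contributor.
    factors: dict = field(default_factory=dict)

    def ranked_factors(self):
        """Return factors sorted from most to least significant."""
        return sorted(self.factors.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical entry for the face shield example above.
splash = NearMiss(
    description="Chemical droplets splattered a worker's face shield; no injury.",
    factors={
        "PPE for eyes and face": 5,
        "experiment": 2,
        "process at that time": 3,
        "proximity to the chemical": 4,
        "fume hood sash location": 4,
    },
)

for factor, rating in splash.ranked_factors():
    print(f"{factor}: {rating}")
```

In practice the same fields could simply be columns in a spreadsheet or database table; the point is that each factor becomes something you can filter, rate, and rank across many near misses.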
Liability concerns prevent us from using them
“The legal liability is too great.” The liability of what? Being sued?
Near miss reports are typically de-identified and obfuscated such that only those directly involved would recognize the description. Plus, they can and should be written in a non-judgmental way.
Personal privacy pre-empts posting
Personal privacy is important. It’s natural not to want to share your own mistakes. Yet don’t we all want to help prevent others from making those same mistakes? If I trip on an extension cord but I’m not injured, I should report it regardless. Who wants to hear later that someone else tripped over that same cord and was injured? An objectively written de-identified near miss account shouldn’t present any significant privacy concerns. But many EHS professionals worry that those involved in a near miss will still be known locally despite obfuscation. In the majority of cases, however, this is an unnecessary worry. By de-identifying the near miss, most won’t know who was involved—and chances are, they’re too busy with their own work to spend much time trying to deduce the identity anyway.
Near misses contain a wealth of data
Data is good for three things: bragging (“10 months without an injury!”), compliance (“two OSHA recordables last year”), and decision-making (“install modesty curtains for our emergency showers”). With limited data, we have a limited decision-making range. Near misses provide a wealth of data.
Many folks collect numeric data only: quantitative metrics like how many spills didn’t result in an exposure thanks to gloves and a lab coat. But they don’t assess why those spills didn’t have a more negative outcome. Consider the value of harvesting descriptive data, also known as qualitative metrics. You might learn that one part of an experimental process has an awkward manual technique that causes spills, or that petite staff must wear oversized gloves because there aren’t enough small-sized gloves available. The “why” behind these situations matters if you want to solve the problem. Both qualitative and quantitative data are used in research, so why not in safety?
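As a rough sketch of how the quantitative tally and the qualitative “why” can live side by side, consider the following Python example; the log entries and field names are hypothetical.

```python
from collections import Counter

# Hypothetical spill log (assumption): each entry pairs the quantitative outcome
# with a short qualitative note on why the outcome turned out that way.
spill_log = [
    {"exposure": False, "why": "awkward manual transfer step splashed the beaker"},
    {"exposure": False, "why": "oversized gloves slipped while pouring"},
    {"exposure": False, "why": "awkward manual transfer step splashed the beaker"},
]

# Quantitative metric: how many spills avoided an exposure.
print(sum(1 for e in spill_log if not e["exposure"]), "spills without exposure")

# Qualitative metric: which "why" themes recur and point to a process fix.
for why, count in Counter(e["why"] for e in spill_log).most_common():
    print(f"{count}x  {why}")
```

Counting tells you how often you got lucky; the recurring “why” text tells you what to fix first.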
Telling stories to teach lessons
Consider the power of stories for sharing near miss data in a relatable way. Stories pack more data and context into a retelling of an incident than any amount of technical information. By including the perspectives of different characters, the data, factors, and causes of the near miss become clear.
Learning organizations can benefit the most
Individuals tend to be good self-directed adult learners. But at the group level, people struggle with learning. Here are a few ways organizations can use close calls as effective learning opportunities:
1. Application
Adult learners need the material to apply to them. They like to problem-solve. And people learn more from what goes wrong than from what goes well. Learning from mishaps is part of our experiential risk system, which we encode and reuse in the future. Near misses align with all of these principles.
2. Dissemination
How best can one display and disseminate near misses? Consider using differing formats and collecting feedback on which resonates and helps the most. Try creating infographics, case studies, stories, databases, photo albums, posters, interviews, etc. Each style is a chance to help more lab denizens learn what happened, why it happened, and how to prevent it from happening again.
3. Discussion
Use a human factors perspective to ensure you focus on not only what happened but also why it happened. Examine local rationality, risk perceptions, our fast-thinking brains, heuristics, and our many cognitive biases. Facilitate larger discussions, smaller conversations, or individual chats as appropriate and when the opportunity arises. Look at process improvements and revamp any existing systems or create new ones as needed.
Systematizing qualitative and quantitative data is vital
“You’ve got the wrong data.” I was talking with a client about the many pie charts, histograms, and tables they set out for me. They had everything: time of day, worker, part of the body, task, training, etc.—everything except human behaviors. Be sure to capture what people tell you. You can learn a lot with a simple prompt: “Tell me about safety here.” Then evaluate their comments for trends. You can assign each comment a score for three factors (frequency, strength, and valence) to aid in ranking them.
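One way to make that scoring concrete is sketched below in Python; the 1-to-5 scales, the simple multiplicative score, and the sample comments are illustrative assumptions rather than a fixed method.

```python
# Illustrative scales (assumptions): frequency = how often the theme comes up (1-5),
# strength = how emphatically it is expressed (1-5),
# valence = -1 for a negative comment, +1 for a positive one.
comments = [
    {"text": "The fume hoods are always crowded.", "frequency": 4, "strength": 3, "valence": -1},
    {"text": "Training covered exactly what I do.", "frequency": 2, "strength": 4, "valence": +1},
    {"text": "Small gloves are never in stock.", "frequency": 5, "strength": 4, "valence": -1},
]

def score(comment):
    # Simple composite: bigger negative numbers flag the most pressing concerns.
    return comment["frequency"] * comment["strength"] * comment["valence"]

# Rank the most negative (most urgent) comments first.
for c in sorted(comments, key=score):
    print(f"{score(c):+d}  {c['text']}")
```

However you weight the three factors, the goal is a defensible ranking you can bring to a decision-maker, not a precise measurement.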
When trying to determine cause, be careful: there is no one right solution. Try using discourse analysis (i.e., studying language in use). For instance, by examining online comments on an article about a lab incident that injured someone, I learned a ton about how three groups saw it differently. Taking diverse perspectives into account can help identify the best possible solution. Lastly, assess your lab operations and processes. Decide which ones to stop, continue, or start.
Key takeaways
We’re human; mistakes happen. We experience many close calls and near misses throughout our lives. Learning our lessons is vital to the organization and to all of us as individuals. It’s how we have survived over millennia. There are many approaches to making these lessons relevant, encoding them, and scaling them to the larger team or unit.
Embrace judgment under uncertainty—especially when it goes awry.
References:
1. Johnson, Ashley. “Examining the foundation.” Safety & Health, National Safety Council, Oct. 1, 2011. https://www.safetyandhealthmagazine.com/articles/6368-examining-the-foundation
2. Maynard, Andrew. “Thinking innovatively about the risks of tech innovation.” The Conversation, July 12, 2016. https://theconversation.com/thinking-innovatively-about-the-risks-of-tech-innovation-52934