Nothing is “a little bit” safe. When we judge a situation as safe, we mean that the chance of a negative outcome is perceived to be low enough that we can tolerate the situation. If the scenario is deemed unsafe, we feel that the chance of a significantly distressing negative outcome occurring is too great to be tolerated. This probability of a negative outcome is defined as risk.
To have productive conversations about acceptable risks and safety in the lab environment, Jonathan Klane, EHS and risk management expert, and senior safety editor for Lab Manager, suggests discussing the factors that increase or decrease risk and how much risk is acceptable to us as individuals. Many daily activities carry the potential for some kind of negative result, which means we are already well versed in weighing risk and deciding which risks are worth taking. We choose what food to eat, where to take a walk, and whether to take the bus or drive to work. In the lab, we can assess risk using both formal and informal methods to determine which risks are acceptable.
Three-factor analysis
Risk, as the probability of a negative outcome, is a human construct. It is a concept rather than a tangible object. In contrast, a hazard is a physical object that can cause harm. One tool we can use to assess risk is a three-factor analysis. To use this technique of risk assessment, we consider that there are three factors that influence risk:
Severity: How bad will the negative outcome be?
Exposure: What is my proximity to the negative outcome, or how close am I to the hazard?
Probability: How likely is the negative outcome to happen?
Risk is the product of severity, exposure, and probability. We can make comparisons between risks by considering these three factors. We can also influence the degree of risk by changing any of these factors. For example, we can reduce the severity of the hazard by using a less toxic chemical rather than a highly toxic one; we can reduce exposure to a hazard by wearing protective equipment; or we can reduce the probability of contact with the hazard by performing a hazardous procedure less frequently or with greater care.
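The multiplicative relationship described above can be sketched in a short snippet. The 1-5 rating scale, the function name, and the example values below are illustrative assumptions for the sake of the sketch, not something prescribed by the three-factor technique itself:

```python
# Toy three-factor risk score: risk = severity x exposure x probability.
# The 1-5 scale and the scenario values are assumptions for illustration.

def risk_score(severity, exposure, probability):
    """Return the product of the three risk factors, each rated 1-5."""
    for name, value in (("severity", severity),
                        ("exposure", exposure),
                        ("probability", probability)):
        if not 1 <= value <= 5:
            raise ValueError(f"{name} must be between 1 and 5, got {value}")
    return severity * exposure * probability

# Comparing two scenarios: substituting a highly toxic chemical
# (severity 5) with a less toxic one (severity 2) lowers the score
# proportionally, while exposure and probability stay unchanged.
baseline = risk_score(severity=5, exposure=3, probability=2)     # 30
substituted = risk_score(severity=2, exposure=3, probability=2)  # 12
print(baseline, substituted)
```

Because the score is a simple product, halving any one factor halves the overall risk, which is why controls aimed at any single factor (substitution, protective equipment, or procedural changes) are each effective on their own.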
Overview of risk assessment tools and techniques
In addition to a three-factor analysis, there are other useful tools that can help us communicate effectively about risk in the lab.
One common risk management tool used in labs is the acronym RAMP, which reminds us how to manage hazards in the lab. We must recognize the hazard, assess it, minimize it, and prepare for negative outcomes. For more information about using RAMP, see this article: https://www.labmanager.com/beyond-compliance-the-ramp-framework-for-risk-assessment-26566
Another way to look at the probability and severity factors of risk is to use a pre-mortem technique. We can ask an “if” question to analyze risk before an experiment or event begins. For example, if we undertake this experiment, what could destroy it; what would a bad outcome be? Answering this question helps us identify the most severe and likely risks and defuse them before we begin the experiment. If the most probable and severe risk would be a fire, for example, then we can implement fire prevention strategies.
Psychological factors that influence risk perception
Perhaps the most fundamental consideration that influences our understanding of risk and the way we talk about it is the human factor. In the fields of psychology and neuroscience, there are two key systems that have been identified and studied that describe individuals’ perceptions of different risks: the analytic system and the experiential system.
The analytic system of comprehending risk is the logical, step-by-step, data-driven method of assessing a situation or choice. This system takes considerable effort and conscious thought, so it is relatively slow to produce a decision or action.
The second system is the affective, intuitive, experiential system, which is reflexive and therefore quick to drive action. This system registers risk as a feeling that a situation or choice is “good or bad” for us, and our response is automatic and generally beyond conscious awareness.
While we might believe we make completely rational decisions, both systems are usually at work during decision-making, and both are important to appreciate. In some situations, we need to apply reason to offset strong emotions. In others, where experience offers more insight than the available data or where immediate action is required, emotion rightly informs the decision.
To demonstrate the experiential system, Klane provides an example of slamming on the brakes when driving to avoid hitting an object we do not have time to fully interpret. This system is not always completely accurate, and the action elicited is not always necessary or correct, but it allows us to act quickly when we sense danger.
Another important consideration of the psychology of risk is that we are all, as Klane says, “subjective beasts” who much prefer our own perceptions over others’. We trust our own views about risk based on what has worked for us thus far. Klane points out that our perceptions of what is risky and our tolerances for the amount and kinds of risks we are willing to take differ “widely, wildly, and weirdly.” Appreciating that our perceptions are subjective, as are everyone else’s, can help us be open-minded about other people’s opinions on navigating risk and persuade us to take a harder look at our own perceptions.
Klane adds that being aware of our cognitive biases can help us communicate better about risk. While many cognitive biases have been identified, Klane points out a couple of common ones to recognize. Human nature makes us susceptible to confirmation bias, which leads us to preferentially seek out and pay attention to only the data that agrees with our existing mental model or beliefs. Another common bias to consider is the attribution fallacy, wherein we overly attribute a perceived negative behavior by another person to their personality rather than to the situation or environment. Awareness of common cognitive biases can encourage empathy and lead to better conversations.
Productive conversations about lab safety start with talking about risk. Remember that good conversations are not one-way. Klane reminds us, “Listening with an open heart and mind” allows us to understand each other’s thoughts and feelings. We do well to admit that we are all subjective in our perceptions and all prone to cognitive bias. If mistakes are made, we need to collectively look for conditions of the environment that can be improved rather than placing blame. Reframe your perspective to see someone else’s point of view.
Lastly, Klane encourages us to share stories because good stories have risk and perspective baked in to encourage the empathy and understanding that is linked to safety culture.
To learn more about strategies and tips to facilitate productive conversations surrounding acceptable risks, tune in to this free, on-demand webinar.