Lab Manager | Run Your Lab Like a Business

Managing Crisis

What the Deepwater Horizon disaster can teach lab managers about decision making

John K. Borchardt

Dr. Borchardt is a consultant and technical writer. The author of the book “Career Management for Scientists and Engineers,” he writes often on career-related subjects.


Making good decisions is perhaps the most important lab management skill. It is essential to managing crises, both big and small. While some may believe decision making is completely innate or only gained through long years of management experience, it is also a skill that can be learned and perfected.

After reviewing some decision-making processes, we’ll take a look at the recent Gulf of Mexico oil well blowout and the lessons it offers laboratory managers. One of the worst industrial accidents in U.S. history, it provides useful examples of the same problems that can occur in laboratory environments.

Decision making is an interdisciplinary process that involves applying social psychology, group dynamics and management theory. Making decisions is a complex process with psychological, social and emotional components. By understanding and controlling these components, you can make better decisions. An important part of making a good decision is accurately defining the problem; only then can you solve it. Hidden difficulties, not obvious ones, present the greatest challenges in making effective decisions.

Considering case studies is a good way to understand and improve your decision-making processes.

Cut your losses

Early in my first industrial research job, I was fortunate enough to learn (in hindsight) an important decision-making lesson from observing the mistakes of others. A common decision-making error is failing to cut one’s losses soon enough. A research project had continued for several years and progressed to the point where a pilot plant was built, producing 50,000 pounds of material per year. Two problems became apparent when operating the pilot plant. The first was that the properties of the polymer produced in the plant were inferior to those of the polymer produced in the lab. The second was that the pilot plant product was too expensive to achieve targeted sales volumes, particularly if the properties could not be improved. The program was continued for approximately three years in an effort to solve these problems.

It gradually became apparent that the company was throwing good money after bad. Millions of dollars were involved. The laboratory manager could not be persuaded to give up on the project and direct resources elsewhere. Finally, this manager lost his job and was replaced. Having no emotional attachment to the project, the new lab manager quickly killed it and shut down the pilot plant. Some staff members lost their jobs.

Individual decision making 

The Deepwater Horizon drilling rig burning shortly before it sank last April
Photo courtesy of the U.S. Coast Guard

In making decisions, many individuals do not examine every possible alternative but instead rely on experience and rules of thumb. This can lead to cognitive biases—systematic mistakes when choosing between options. In the example above, it may have been a systematic bias toward optimism—based on previous successes—that resulted in the research program being funded year after year without the critical problems being solved.

Another nonquantitative, nonanalytical tool used in decision making is intuition. Intuition is more than just gut instinct; it is the result of pattern recognition capability. Well-honed intuition can be a useful decision-making tool and is often involved in making breakthrough decisions resulting in the development of revolutionary new products and processes.

Group decision making

Decisions are often made by teams or groups. Are teams and groups smarter and more capable of making better decisions than individuals? The answer can be yes if an important pitfall is avoided: groupthink. Groupthink can occur when the group discussing decision options is pressured into conforming to the view of a powerful individual. This can occur through subtle influence; the powerful individual does not have to actively pressure other members of the team.

Another problem occurs if there is little synergy between team members. This results in each team member making a decision independently rather than reaching a consensus. One sign of this occurring is when a group tries to come to a decision by voting on options with little discussion.

Decisions at the organizational level

Decisions made at the organizational level often cannot be attributed to a single leader. The structure and culture of an organization can shape its decision-making process and the decisions made. A small firm owned by an individual usually can reach decisions much more quickly than a large company can. An entrepreneurial organization often reaches different decisions than a large, conservative, well-established company would.

Often decisions made at lower levels of a large organization must be passed “upstairs” for ratification before they can be implemented. This can lead to delays. For the lab manager it means delays in initiating new projects, in moving projects to the field trial stage or in commercializing new products resulting from R&D projects.

In difficult economic times, some managers become afraid of making decisions for fear of making mistakes. This can slow decision making, resulting in “paralysis by analysis,” in which potential courses of action are subjected to overly exhaustive study. As a result, new revenues from new products and processes are delayed when needed most.

Sales and marketing personnel can become involved in disputes with lab managers who delay introducing new products and processes. If the new product is ultimately successful, this can have severe repercussions for the lab manager. I was involved in a situation like this in the early 1980s, just after the oil boom turned into an oil industry recession. A product I developed was not released to the field despite my optimistic reports to company field engineers; my department manager refused to permit its release. When a vice president attended a division engineers meeting, he received complaints that led him to overrule my department manager and order that 30 drums of product be made immediately for field testing. Excellent test results came back, just when new products were badly needed. The research department manager was transferred to a staff position without supervisory responsibilities—a dead-end job. He was laid off in a subsequent staff reduction.

Scenarios can aid decision making

Wise decision making can be encouraged by use of scenarios developed long before decisions must be made. Shell Oil made the use of scenario planning famous, as recounted in Fortune. Company planners had developed various scenarios of future crude oil prices and plans for how the company should respond to each scenario. Shell already had a plan in place before the 1973 Arab oil embargo effectively doubled crude oil prices for its oil refineries, and it calmly moved to execute that plan. In a sense, no decisions had to be made. By contrast, frenetic activity characterized the headquarters of some other major oil companies as rushed decisions had to be made.

Today many firms in many industries use scenario planning to help guide their decision making.

Avoiding bad decisions

Poorly thought-out decision-making processes usually result in poor decisions. Managers need to think about how to make a decision, try to remove personal biases, collect needed information in advance and determine the diverse perspectives of others. By doing all this, they can greatly improve the success of their decision making.

In a rapidly changing environment such as the recent recession, decision-making methods that served well in better business conditions may no longer be the best processes to follow.

Risk factors in projects

Mark Abkowitz, a Vanderbilt University engineering professor and author of the book Operational Risk Management: A Case Study Approach to Effective Planning and Response, has identified ten risk factors in projects. These are:

  1. Design and construction flaws (think laboratory building design, instrument design and project design)
  2. Deferred maintenance (a recent problem for some lab buildings due to budget cuts)
  3. Economic pressure (project budget cuts)
  4. Schedule constraints (which can result in rushing projects to completion prematurely)
  5. Inadequate staff training (a problem in some labs due to budget cuts and staff reductions)
  6. Not following procedures (usually due to pressure to save time and rush projects to completion)
  7. Lack of planning and preparedness (which can occur due to lab managers’ heavy workloads)
  8. Communications failures
  9. Arrogance resulting in overconfidence and a refusal to allow for the possibility of failure
  10. Political agendas that exist due to the desire to please major customers or finish projects on budget

The Gulf of Mexico well blowout

Scene at the blowout site on July 11, 2010: two drillships (foreground) drilling relief wells, a drilling rig working on the original well (background), and numerous workboats.
Photo courtesy of the U.S. Coast Guard

These risk factors don’t occur only in laboratory projects. Many, if not all, also occurred in the events leading up to the recent Gulf of Mexico well blowout. This led to a disastrous explosion resulting in the loss of 11 lives, the sinking of an expensive drilling rig, an environmental disaster caused by the flow of oil and gas into the Gulf of Mexico, and billions of dollars in economic damage to the Gulf Coast economy.

Abkowitz noted that risk factors often work together to generate an event with disastrous consequences. This was certainly the case in the blowout. BP’s own September 8, 2010, report identified eight interacting factors that together caused the well blowout. These were:

  1. The cement barrier did not block oil and gas from surging up the well.
  2. Downhole tools called shoe tracks did not block oil and gas from shooting up the well.
  3. Negative-pressure tests to determine whether oil and gas were entering the well were accepted despite anomalous results.
  4. Operators did not recognize the influx of oil and gas into the riser—the pipe between the top of the well and the drilling rig at the ocean surface.
  5. Well-control response actions failed to regain control of the well. These included activation of the blowout preventer.
  6. Oil, gas and drilling mud were not diverted overboard into the ocean. Instead, these fluids were diverted to a mud gas separator, resulting in natural gas venting onto the rig.
  7. Natural gas invaded areas of the rig not electrically classified as spark-free. The rig’s fire and gas system did not prevent hydrocarbon ignition from an electrical spark.
  8. Three methods for operating the blowout preventer in emergency mode were unsuccessful in sealing the well. The explosions and fire on the rig probably disabled the emergency operation sequence.

Had effective action been taken, or had effective preventive measures been in place, to deal with any of these eight factors, the disaster could have been prevented or its scale substantially reduced. Some experts have suggested an additional causal factor: the well design used by BP, thought by some to be inherently less safe than the well designs favored by other oil companies drilling in deep Gulf of Mexico waters.

Abkowitz has also noted that communication failure is a risk factor in every disaster. Congressional testimony and news media interviews with rig survivors suggest that this was the case in the well blowout. Poor communications also can result in ineffective action or no action being taken to manage risk factors in laboratory situations. This is why a busy work schedule shouldn’t prevent the documentation of lab results and their communication to all concerned parties in the form of meetings, meeting minutes, e-mails and formal laboratory reports.

Overconfidence can lead to underestimating risk factors, according to Abkowitz. While the facts are in dispute (due to poor and often undocumented communications), BP engineers may have ignored concerns expressed by rig operating personnel and other contractors in the two weeks or so leading up to the well blowout.

Another Abkowitz observation: It often takes a disastrous event to convince people that something needs to be done. This was certainly the case in the Gulf of Mexico well blowout. A six-month drilling moratorium was instituted to allow time to develop improved safety regulations.

Back to the laboratory

Abkowitz also noted that a lack of uniform safety standards across nations creates an uneven safety risk environment for international companies. This is certainly the case for laboratories as well as for offshore oil drilling. Laboratory managers, including research managers at the highest levels, need to be sure that safety measures are uniform in all their global laboratories. These measures have to be effectively communicated to laboratory staff members despite differences in language and culture.

Laboratory managers are fortunate that, while the business risk factors may be great, the risks to the health and safety of laboratory personnel and to the environment are less than those of the Gulf of Mexico well blowout. However, that does not mean that laboratory managers should not take their risk management responsibilities very seriously.