Simulation-Based Planning

by Mike Lickley and Jim Curry

Operational Results Include Reduced Cycle Times and Improved Equipment Utilization and Scientist Effectiveness

Simulation is growing in popularity as a best-practice tool that allows companies to move to the next stage of optimization in the pharmaceutical manufacturing, quality and laboratory environment. This growth is due both to the sophistication and robustness of the tools now available and to the need to optimize operations in response to changing market needs and financial pressures.

Simulation tools have been available for 40 years, but advances in computer technology have now made them truly practical for use in managing operations in laboratories, which by their nature are complex due to the mix of tests conducted, the variety of equipment involved and the scientist skill sets needed.

A simulation-based planning system has a variety of uses within a laboratory for Lean teams, Six Sigma experts, lab supervisors and management. This article describes our experience and learnings from the use of simulation over the past four years in a complex laboratory environment that conducts a variety of tests for in-process, product release, stability and raw materials as well as ad hoc research analyses.

This article describes how simulation can be used within a lab to predict the load on staff and equipment, using a “what-if” scenario-testing capability to evaluate shift changes, campaign sizes, and changes in test mix and volume. It also describes how the lab schedule can be integrated with the production stream model for a complete end-to-end flow.

The models used in both laboratories and production are OpStat’s Lean simulation models that use Excel inputs for test details, equipment inventory and shift skills assignments, as well as outputs for management summary and detailed reporting.

Description of facility

The J&J Alza facility in Vacaville, CA, has three laboratories, one primarily for raw materials and two for commercial products and controlled and noncontrolled substances. Organizationally, there are three commercial lab teams aligned with the different production streams in the facility and the raw materials team. The equipment in the raw materials lab can also be used by the commercial teams when needed.

Across the three labs, there are 155 different types of test equipment, totaling 2,035 pieces of inventoried items. As a representative year, in 2006 the facility completed a total of over 8,500 test requests, requiring over 70,000 individual attribute tests, i.e., 70,000 assays. Total staff in that year was 190 scientists and supervisors.

In the Quality organization, there is also a separate QA documentation department responsible for test documentation review and sign-off. This department’s resources and estimated process times are defined in the model, as are the testing labs’ skills and equipment.

Management vision

The Process Excellence, Quality and Production management at the Alza facility had the foresight to utilize best-practice tools to improve operations with sustainable processes that included making laboratories part of a seamless production flow. This included shared metrics, starting with ordering of raw materials and progressing through final product release.

The end-to-end view highlighted where some improvements in one area could cause problems in others. For instance, traditional Lean manufacturing concepts of smaller lots and more frequent ordering of raw materials can cause a workload problem for pharmaceutical labs that must test each lot received. The objective was to optimize the entire flow.

The vision also included applying an OpStat Lean Laboratory simulation model, similar to one already successfully part of the manufacturing planning process. This helped the team learn the facts and understand the dynamics in the laboratory. A laboratory is a type of “job shop” where the equipment and skills required vary depending on the test type and product.

Resource utilization metrics in a job shop tend to be lower than with a process flow, where there is a high degree of repeated operations. Capacity planning for staff and equipment, particularly expensive equipment such as GCs and HPLCs, relies on understanding the facts and dynamics in the operation.
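As a rough illustration of the capacity arithmetic involved, the sketch below computes equipment utilization as demanded hours against available hours. All instrument counts and times here are hypothetical, not figures from the Alza site.

```python
# Minimal capacity-utilization sketch (hypothetical numbers, not site data).
# Utilization = hours of test work demanded / hours the equipment is available.

def utilization(demand_hours: float, units: int, hours_per_unit: float) -> float:
    """Fraction of available instrument time consumed by demand."""
    return demand_hours / (units * hours_per_unit)

# Example: 12 HPLCs on two 8-hour shifts, 5 days a week = 80 h/unit/week.
weekly_demand_hours = 640.0  # assumed assay hours routed to HPLCs
print(f"HPLC utilization: {utilization(weekly_demand_hours, 12, 80.0):.0%}")
# -> 67%; in a job shop this sits well below what a repetitive process flow achieves.
```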

Objectives for system

The overall objectives within the context of the end-to-end system were to “Lean” the lab, i.e., eliminate variability and delays wherever possible and meet throughput commitments for all types of tests.

The complexities in the operation are similar to those faced in all test laboratory facilities: it is difficult to understand all the interactions that affect whether commitments to complete test requests are met. As requests arrive at the facility, a prescribed set and sequence of attribute tests is required. These specify:

  • The equipment that needs to be available
  • Skill sets that need to be available
  • Probabilities that certain conditional level two or three tests will need to be completed

The simulation model allowed different priority schemes to be tested, along with the rules defining who allocates equipment and skills and when they are allocated. The model also allowed the impact of equipment downtime to be measured, as well as the impact of process variability, using probabilistic Monte Carlo simulation techniques.
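To make the Monte Carlo idea concrete, here is a deliberately minimal sketch of a single-instrument queue with priority tests, random process times and unplanned downtime. It is an illustration of the technique only, not the OpStat model; every rate and time in it is an assumption.

```python
import random
import statistics

# Minimal Monte Carlo sketch of a single-instrument queue (illustrative only).
# Urgent tests jump the queue; the instrument occasionally goes down; process
# times vary between an assumed minimum and maximum.

def simulate_day(n_tests=20, p_urgent=0.2, p_downtime=0.1):
    """Return cumulative completion times (hours) for one simulated day."""
    # Build the day's work: (priority, processing hours); 0 = urgent, 1 = routine.
    tests = [(0 if random.random() < p_urgent else 1,
              random.uniform(0.5, 2.0)) for _ in range(n_tests)]
    tests.sort(key=lambda t: t[0])          # urgent tests are run first
    clock, cycle_times = 0.0, []
    for _, hours in tests:
        if random.random() < p_downtime:    # unplanned instrument outage
            clock += 4.0                    # assumed 4 h repair time
        clock += hours
        cycle_times.append(clock)
    return cycle_times

# Replicate many days to see the *range* of outcomes, not a single estimate.
avg_cycles = [statistics.mean(simulate_day()) for _ in range(1000)]
print(f"mean cycle time: {statistics.mean(avg_cycles):.1f} h, "
      f"P95: {sorted(avg_cycles)[949]:.1f} h")
```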

The model needed to be able to simulate the laboratory requirements for the distinctly different production streams, incorporate the analytical tests that were not as predictable as the regular production tests, and measure performance with metrics that were meaningful to the lab supervisors and Quality management.

Laboratory model overview

The model uses spreadsheets for data inputs, including actual test history, forecasts, test profiles, equipment inventory and the skills matrix. It then processes tests based on priorities and rules to determine lab capacity. Summary reports are output in spreadsheets and graphs. Figure 1 shows the flow of input, processing and outputs for the model. Once the model has completed a run, a detailed overview summary is provided, as shown in Figure 2. This one-page view provides a data-rich overview of lots completed, cycle times and resource loading. Drilling down into the model gives further details, and Excel reports are provided for further analysis.
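The input-process-output flow can be pictured with a short pandas sketch. The workbook names, column names and the 80-hour availability figure below are hypothetical placeholders, not the actual OpStat file layout.

```python
import pandas as pd

# Sketch of the input -> process -> output flow described above. File and
# column names are hypothetical placeholders, not the OpStat layout.

profiles = pd.read_excel("test_profiles.xlsx")         # tests, times, volumes
equipment = pd.read_excel("equipment_inventory.xlsx")  # instrument counts

# "Process" step (greatly simplified): weekly demand hours per equipment type.
profiles["demand_h"] = profiles["weekly_volume"] * profiles["hours_per_test"]
load = profiles.groupby("equipment_type")["demand_h"].sum()

# Compare demand to availability (80 h/unit/week assumed) and write a summary.
capacity = equipment.set_index("equipment_type")["units"] * 80.0
summary = pd.DataFrame({"demand_h": load, "capacity_h": capacity})
summary["utilization"] = summary["demand_h"] / summary["capacity_h"]
summary.to_excel("capacity_summary.xlsx")              # management summary output
```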

Using the model for planning

The sources of demand for the labs consisted of a variety of different types of tests and required turnarounds. Some in-process tests required completion within a few hours while the production stream waited; others had requirements of up to 30 days. In total, the mix of work included tests for lot clearance releases, multiple types of in-process tests, stabilities, customer complaint investigations, equipment swabs, validations, R&D analytical requests and raw materials receipts. Some of these demands could be predicted based on production schedules, while others were more random and used probabilistic distributions.
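A simple way to picture this split is sketched below: schedule-driven tests are deterministic, while ad hoc requests arrive at random, modeled here with exponential interarrival times. The lot counts and arrival rate are illustrative assumptions, not site data.

```python
import random

# Sketch of mixed demand generation (illustrative rates, not site data):
# production-driven tests follow the schedule; ad hoc requests arrive at
# random with exponential interarrival times (Poisson arrivals).

HOURS_PER_WEEK = 120.0  # assumed lab operating hours per week

def weekly_demand(scheduled_lots=30, tests_per_lot=8, adhoc_rate=0.5):
    """Return (scheduled_tests, adhoc_tests) for one simulated week."""
    scheduled = scheduled_lots * tests_per_lot   # deterministic, from the plan
    adhoc, t = 0, random.expovariate(adhoc_rate)
    while t < HOURS_PER_WEEK:                    # random arrivals, 0.5/hour mean
        adhoc += 1
        t += random.expovariate(adhoc_rate)
    return scheduled, adhoc

weeks = [weekly_demand() for _ in range(52)]
print("scheduled is fixed at", weeks[0][0], "tests/week;",
      "ad hoc ranged from", min(w[1] for w in weeks), "to", max(w[1] for w in weeks))
```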

Supervisors provided inputs on the test requirements for their areas of responsibility, down to the detailed task level and required resources. Figure 3 shows the sequence and requirements for test execution. Since batching of attribute tests and setting up level-loaded schedules where possible are important to throughput, batch quantities and frequency are entered in this Test Profile. Figures 4 and 5 show examples of the equipment inventory and skills scheduling inputs. The supervisors also performed a reasonableness check for each step in the process and were trained to use the system.
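As a hedged sketch of what batching and level-loading mean computationally: group pending assays into instrument batches, then spread the batches evenly across the planning cycle. The batch size and cycle length below are hypothetical.

```python
# Sketch of batching and level-loading (hypothetical figures): group pending
# assays into instrument batches, then spread the batches evenly across the
# days in the planning cycle so no one day is overloaded.

def level_load(pending_assays: int, batch_size: int, days: int) -> list[int]:
    """Batches to run per day after level-loading."""
    batches = -(-pending_assays // batch_size)     # ceiling division
    base, extra = divmod(batches, days)
    return [base + (1 if d < extra else 0) for d in range(days)]

# 130 pending assays, batches of 12, 5-day cycle -> 11 batches over 5 days.
print(level_load(130, 12, 5))   # [3, 2, 2, 2, 2]
```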

The model was first validated by using the volume history from the LIMS system; equipment outages from the metrology system were also input to the model. The volumes and actual process times from completed tests were run through the system, and output metrics for completion vs. target times were compared to actual management metrics that had been derived in the manual system.
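The validation step amounts to a side-by-side comparison of simulated and actual metrics. A minimal sketch of that comparison follows; the service-level figures and the three-point tolerance are invented placeholders, not the site's metrics.

```python
# Validation sketch: replay actual LIMS volumes through the model, then
# compare simulated vs. actual on-time completion by test type. Numbers
# below are placeholders, not the site's metrics.

actual    = {"release": 0.94, "stability": 0.97, "in_process": 0.91}
simulated = {"release": 0.92, "stability": 0.96, "in_process": 0.93}

for test_type, act in actual.items():
    sim = simulated[test_type]
    flag = "OK" if abs(sim - act) <= 0.03 else "INVESTIGATE"  # assumed tolerance
    print(f"{test_type:11s} actual {act:.0%}  simulated {sim:.0%}  {flag}")
```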

Once validated vs. management metrics, the model was ready for “what-if” scenario evaluations. Forecasted volumes and test/product mix were applied to see the resulting impacts on throughput performance and resource utilizations. Early on in its use, the model gained credibility when it showed dynamics that at first appeared counterintuitive. For example, the accepted thinking was that HPLCs were a critical bottleneck. However, the model showed that a type of balance was a bottleneck for a number of different products and tests. When these data were presented to the scientists in the lab, they confirmed the delays they were encountering on a regular basis.

The model was fed from Excel workbooks, which made the training easier. The inputs included the test matrices with min/max times for each activity, equipment inventory, skills and shift assignments, actual LIMS history, and forecasted volumes and mix changes. The model was also set up to feed an Excel output workbook that had previously been compiled manually, so the metrics were consistent with what management was used to. The probabilistic capability of simulation is a key benefit of the technology, since it provides metrics such as ranges of results and confidence intervals around service levels and utilization results. Probabilistic inputs may be based on minimum/maximum estimates or, as expertise in the use of the system grows, on specific probability distributions. Figure 6 shows an example of a probability distribution for demand derived from actual LIMS history.
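The sketch below shows the min/most-likely/max idea: draw process times from a triangular distribution, replicate the run, and put a confidence interval around the resulting service level. The distribution parameters and the 8-hour target are assumptions for illustration only.

```python
import random
import statistics

# Sketch of probabilistic inputs: min/most-likely/max estimates become a
# triangular distribution; replication yields a confidence interval around
# the service level. All figures are illustrative.

def replicate_service_level(n_reps=200, n_tests=500, target_h=8.0):
    """Service level (fraction of tests within target) per replication."""
    levels = []
    for _ in range(n_reps):
        # random.triangular(low, high, mode): 2 h min, 12 h max, 6 h most likely.
        times = [random.triangular(2.0, 12.0, 6.0) for _ in range(n_tests)]
        levels.append(sum(t <= target_h for t in times) / n_tests)
    return levels

levels = replicate_service_level()
mean = statistics.mean(levels)
half = 1.96 * statistics.stdev(levels) / (len(levels) ** 0.5)
print(f"service level: {mean:.1%} +/- {half:.1%} (95% CI)")
```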

Data management is an important aspect to using a system such as this. The four team supervisors and Process Excellence staff all had access to the model, so keeping data in sync required some up-front planning. A core copy of the latest configuration was maintained on a shared server, and the files were then available to whoever was going to run a scenario.

Results

The results of deploying capacity simulation cannot easily be quantified in isolation; they must be considered as part of our overall sitewide Lean initiative and as a contributor to our achieving year-over-year process improvement for multiple years while maintaining benchmark results in lab quality. The simulation was used to level the workload, integrate the labs into a rhythm cycle synchronized with production where possible, and incorporate standard work metrics to manage the staff.

Operational results were reduced cycle times and improved equipment utilization and scientist effectiveness. Perhaps more important, the simulation provided a needed understanding of the real capacity of the laboratory and insights into the drivers of capacity utilization and service level.

Summary

When we consider the complexity of most supply chains, our teams need tools to be able to predict future performance based on a wide variety of input variables, changing conditions, and demand uncertainty and volatility. The OpStat Lean Laboratory capacity simulator helped our site achieve the imperative—which is true of most supply chains—to reduce cost while improving customer service.

The OpStat simulator was used both in the development of staffing plans and in communication to supply chain partners on expected lab delivery performance. It was a vital tool in understanding and driving process performance improvement. By predicting lab capability and optimizing operations, the lab was able to participate in and contribute to overall supply chain synchronization, waste reduction and asset utilization while improving cost and delivery performance.