For a lab manager facing demands for more or faster output, automation is often the first approach to address these challenges. When you start to tackle capacity improvement challenges, think broadly: evaluate not only the core change but map the entire workflow to anticipate how automation will affect both upstream and downstream processes. You might discover that adding automation creates new bottlenecks of its own. This article walks through considerations around the core automation and offers guidance for addressing secondary bottlenecks.
Achieving higher sample throughput with automation
Primary bottlenecks often revolve around increasing the capacity of existing or new testing equipment. Many analytical techniques, such as gas/liquid/gel permeation chromatography, nuclear magnetic resonance, differential scanning calorimetry, and matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), routinely come with sample changer technology. Automation assumes the sample itself is pre-prepared and placed in a crimped vial, tube, sampling pan, or 96-well plate. Some technologies offer sampling for flexible sample types (gas, liquid, solid) to reflect your specific experimental needs. Automation capacity can vary from a handful of queued samples to a hundred or more when well plates serve high-throughput experimentation (HTE) needs.
In your decision process, evaluate the variability of your sample streams and your lab’s charter. Based on your mission, the focus can fall into one or more of the categories below:
- Quality control lab – Known sample type and controlled concentration. Known controls and standards are also tested in concert with the samples of interest.
- Research support – Sample types can vary greatly depending on the breadth of chemistries your team supports. The questions to be answered or experiment type might differ greatly (major component identification, impurity profile, quantitation).
- Forensic work – Unknown samples needing various sample prep and screening protocols.
- Method development work – Capacity for method development is often driven by the sample prep and experimental condition assessment needed to establish the final protocol.
This assessment comes down to the control and knowledge you have over the samples themselves. Can you set up the same prep and experimental protocol to cover a majority of your samples without doing some initial screening? Automation can increase capacity to perform more individual tests on multiple samples, but it can also allow you to explore the wider experimental space needed for method development or HTE support.
Ensuring sample suitability and integrity
Sample compatibility and stability in an automated environment should be clearly assessed as well. Are prepared samples stable or compatible with vials and sampling hardware? Understanding your material’s stability in the prep solvent and the timeline for degradation or interaction is vital to automating your processes without compromising quality. Here are some questions to consider:
- Do samples need to be stored at sub-ambient temperature for stability?
- Do they need storage at elevated temperature to ensure all components stay in solution?
- Will samples degrade or react with vials, tubes, seals, and caps while sitting in the automation queue?
- Do the samples carry over from run to run, requiring multiple cleaning injections or frequent column changes?
- Do the samples need storage in an inert environment, requiring specialized prep conditions such as a glove box?
Sample stability is a key parameter when understanding how best to utilize unattended automation for extended periods of time.
Finally, with automation in place, sample queue organization becomes a new factor you might not have faced previously. Group samples by solvent type, concentration, and experiment conditions in a testing queue to reduce the number of instrument set-ups. Longer run times or multiple injection volumes might be used on any individual sample to ensure sufficient sensitivity is achieved. It is inevitable that sample re-runs will be needed, whether because a sample was unexpectedly diluted, the instrument was contaminated, the material proved unstable, or the dreaded power outage struck. The additional capacity enabled by automation usually compensates for the re-runs the unexpected requires.
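The grouping idea above can be sketched as a simple sort on the set-up parameters; the field names and example samples here are illustrative assumptions, not any vendor's queue format:

```python
# Sketch: order a testing queue so samples sharing method, solvent, and
# concentration run back-to-back, minimizing instrument set-ups.
# The sample fields below are illustrative, not tied to vendor software.
from itertools import groupby

samples = [
    {"id": "S-103", "solvent": "THF", "conc_mg_ml": 2.0, "method": "GPC-A"},
    {"id": "S-101", "solvent": "water", "conc_mg_ml": 1.0, "method": "LC-B"},
    {"id": "S-102", "solvent": "THF", "conc_mg_ml": 2.0, "method": "GPC-A"},
]

def setup_key(s):
    """Samples with identical keys can share one instrument set-up."""
    return (s["method"], s["solvent"], s["conc_mg_ml"])

queue = sorted(samples, key=setup_key)
n_setups = len([k for k, _ in groupby(queue, key=setup_key)])
print(n_setups)  # 2 set-ups instead of 3 for the unsorted queue
```

Because Python's sort is stable, samples within each set-up group keep their original submission order.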
Adjusting staffing to accommodate automation
Automation has the greatest impact when there are multiple short runs during a 24-hour period, allowing for better utilization of overnight or weekend hours without operator intervention. Most organizations find that adding capital is easier to justify than increasing headcount; however, before investing in automation accessories, assess whether simple staffing shifts might get you the boost you need without any capital investment. For example, staggered start times can let you initiate 12-hour runs at both 7 AM and 7 PM instead of a single run per day, doubling your daily capacity. This is where a granular evaluation of the experiment and sample types ensures you make the most responsible decision.
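As a quick back-of-the-envelope check, the capacity gain from staggered shifts can be worked out directly; the 12-hour run length comes from the example above, and the single-run baseline is an assumption for illustration:

```python
# Sketch: compare daily run capacity for a single-shift vs. a staggered-shift
# schedule, using the 12-hour run length from the example in the text.
run_hours = 12

# Single shift: one operator starts one run per working day (assumed baseline).
runs_single_shift = 1

# Staggered shifts: starts at 7 AM and 7 PM launch back-to-back 12-hour runs,
# using the full 24-hour day without anyone working overnight.
runs_staggered = 24 // run_hours

print(runs_staggered / runs_single_shift)  # capacity doubles: 2.0
```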
Addressing post-automation bottlenecks
Following implementation, you might discover the bottleneck has not been removed but simply shifted, requiring additional creative solutions. Anticipating these shifts prior to implementation will help you manage customer expectations about the benefit. Two common areas relate to sample preparation and analysis capacity. For the front end, streamline the sample prep area: have all needed components (balances, solvents, waste stream containers, and more) in a well-organized space to reduce set-up and search time. Evaluate opportunities for batch prep of standards or samples to increase capacity as well. Get team buy-in on maintaining a clean and organized space, which you might find to be the biggest challenge of all!
Transcription errors are another potential pitfall of handling large numbers of samples. Sample parameters such as weight, ID, experimental conditions, and position in the sample queue are all spots for potential errors. Barcode systems can be beneficial, as can pre-prepared paper templates.
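One reason barcode systems catch transcription errors is that common symbologies embed a check digit, so a mis-typed ID fails validation. A minimal sketch using the standard Luhn (mod-10) algorithm, with a classic test number rather than any real sample ID:

```python
# Sketch: validate numeric sample IDs with the Luhn mod-10 check digit,
# the mechanism many barcode and ID schemes use to catch single-digit
# transcription errors.
def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in number]
    total = 0
    # Double every second digit from the right; subtract 9 if it exceeds 9.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(luhn_valid("79927398713"))  # True  (standard Luhn test number)
print(luhn_valid("79927398712"))  # False (one digit transcribed wrong)
```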
Processing data generated by automation
Finally, automation brings with it a plethora of information that needs processing, quality review, analysis, and interpretation. One of the keys to automation success is ensuring data quality and consistency throughout a long run. Use interim standard samples within a queue to ensure system cleanliness and stability throughout the long run. Establish a protocol to assess data quality before investing time in analyzing potentially corrupt runs. Junior team members can often play a pivotal role in this phase, allowing senior scientists to focus on more advanced interpretation.
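One way to make the interim-standard check concrete is to compare each standard's measured response against its expected value and flag drift before anyone analyzes the bracketed samples. The tolerance, expected response, and data layout here are assumptions for illustration:

```python
# Sketch: flag interim (bracketing) standards whose response drifts beyond
# a tolerance, so suspect segments of a long run get reviewed before the
# sample data between them is trusted. All values below are illustrative.
EXPECTED_RESPONSE = 100.0  # expected standard response (arbitrary units)
TOLERANCE_PCT = 5.0        # acceptance window, assumed for this sketch

interim_standards = [
    ("STD after sample 10", 101.2),
    ("STD after sample 20", 98.7),
    ("STD after sample 30", 91.4),  # drifting: system needs attention
]

def out_of_spec(measured, expected=EXPECTED_RESPONSE, tol=TOLERANCE_PCT):
    """Return True when the standard deviates more than tol percent."""
    return abs(measured - expected) / expected * 100.0 > tol

flagged = [name for name, value in interim_standards if out_of_spec(value)]
print(flagged)  # ['STD after sample 30']
```

A check like this is exactly the kind of screening task a junior team member can own before senior scientists invest interpretation time.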
Software can be your friend in this phase. Consult with instrument vendor scientists on large dataset management. Additional vendor or third-party software might be a must-have investment to help your team manage the mountain of data coming their way. Even readily available software such as spreadsheets can perform batch calculations and reduce errors.
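Even a short script can take over the kind of batch calculation a spreadsheet would handle, such as computing concentrations from recorded weights and volumes. The sample IDs and numbers below are made up for illustration:

```python
# Sketch: batch-calculate final concentrations from weighed mass and
# dilution volume, replacing per-sample hand calculations that invite
# arithmetic errors. All data below are illustrative.
samples = [
    # (sample id, mass weighed in mg, final volume in mL)
    ("S-201", 10.4, 25.0),
    ("S-202", 9.8, 25.0),
    ("S-203", 11.1, 50.0),
]

concentrations = {sid: round(mg / ml, 3) for sid, mg, ml in samples}
for sid, conc in concentrations.items():
    print(f"{sid}: {conc} mg/mL")
```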
Additionally, don’t overlook the role your internal customers play in the decision between speed of findings and formality of output. Many come to the workplace with experience in analyzing data, and they will favor early access to the processed results for self-analysis over waiting for a formal report.
Today, automation is considered a worthwhile investment for most labs in the analytical test world. Commercial automated systems are readily available, and the associated software gives users tremendous flexibility in setting up, monitoring, and re-organizing queue runs remotely. Even capabilities that would not be straightforward to automate, such as electron microscopy and Raman spectroscopy, have been adapted to allow multiple sample analyses using a smart combination of sample positioning and data collection software.
With the industry’s greater focus on high throughput screening, automation is the key to handling researchers’ needs for large sampling space. As you look to remove bottlenecks in your lab through automation, don’t overlook upstream and downstream processes or your desired outcome won’t be achieved.