Conference vendor exhibits show the latest wares in laboratory equipment. Exhibits focusing on the life sciences, including an October 2007 LRIG1 meeting in Cambridge, Massachusetts, displayed microplate-handling equipment for measuring bacterial growth, cell counters, physical sample management systems, and more. Most had one element in common: they were all microprocessor controlled and programmable.
Automation is an integral and inescapable part of laboratory work.2 Much of it is at the task level, built into instrumentation to carry out a fixed, but programmable, set of functions. Many offer some form of electronic output, some as files, others with USB3 connections. Using the instrument output is left as an exercise for the user.
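Since instrument output is "left as an exercise for the user," a small script is often the first step. The sketch below, with a made-up delimited export format and field names, shows the kind of routine a lab might write to turn an instrument's file output into usable records.

```python
import csv
import io

# Hypothetical instrument export: many instruments can write results as
# delimited text files; the field names here are illustrative only.
SAMPLE_EXPORT = """sample_id,analyte,result,units
S-001,glucose,5.2,mmol/L
S-002,glucose,4.8,mmol/L
"""

def parse_export(text):
    """Parse a delimited instrument export into a list of dicts,
    converting the numeric result field along the way."""
    rows = []
    for row in csv.DictReader(io.StringIO(text)):
        row["result"] = float(row["result"])
        rows.append(row)
    return rows

records = parse_export(SAMPLE_EXPORT)
print(records[0]["sample_id"], records[0]["result"])  # → S-001 5.2
```

Even a minimal parser like this makes the instrument's output available to downstream systems instead of leaving it locked in a file.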
Figure 1: Modeling laboratory processes results in a graphical description of how the lab works, what the major structural elements are, and how they are used.
Making effective use of that equipment and other automation options in the lab will require managers to take an active role in developing policies that guide automation projects, including the purchase of intelligent instrumentation, so that the next stage in automation can be addressed: systems integration and the improved utilization of the lab's intellectual property.
In addition to the items noted above, the term “automation” in a lab also covers automated pipettes, instrument data systems (e.g., chromatography, GC/LC-MS, etc.), LIMS, and electronic laboratory notebooks. Some are focused on making manual tasks more efficient, others completely offload what had been manual work to fully automated workflows. As workloads increase, more manual tasks are going to be shifted to electronic/mechanical devices, many with data capture capabilities and the ability to connect to computer systems. Management planning for each step in that transition is essential. This article serves as an introduction to management’s role in lab automation — setting the guidelines and expectations for project development and technology management.
These policy guidelines will help lab managers with two relevant issues in particular. A recent article4 covering a small survey (72 individuals in 37 companies) reported that:
- Only 56% of the automation projects “succeeded in delivering the expected results”
- An increasing dependence on outside sources for project development
The planning process described in this article and elsewhere5 will improve the success of projects by providing a strong architectural platform for their development. It will also provide a method for documenting and communicating project expectations and relationships between automation projects, giving product vendors and outside contractors an understanding of a laboratory’s current automation operations, and how new projects fit in.
Figure 2: This model represents three researchers working from a common "data" database, with each individual working on independent projects.
TECHNOLOGY PUSH SHIFTS TOWARD MANAGEMENT POLICY
In its early days, laboratory automation was a technology push — we learned how to acquire data, process, and report it. We also learned how to use robotics to off-load repetitive tasks. LIMS were developed to manage workflows in testing labs and electronic notebooks provided a means of helping researchers work with laboratory data.
The next stage in the development of laboratory automation is going to take us from the task-automation level to that of integrating systems so that the full benefit of the technologies and products that have been developed can be realized. That stage is going to require management to take a more active role in driving the application of technology to laboratory work.
The policies and practices developed by management are going to provide the infrastructure that will enable integration within the lab, cooperation between labs, and successful partnerships with Information Technology groups.
Those policies are critical to the successful development of systems that:
- Are effective and supportable
- Meet the lab's automation and regulatory needs
- Reduce the lab's overall operating costs and improve its return on investment
- Provide a basis for working with other groups including Information Technology departments
- Coordinate the work of outside contractors and equipment vendors
- Protect the intellectual property developed in the labs
- Make the most effective use of people’s talents
TRANSITIONING FROM INDEPENDENT PROJECTS TO COOPERATIVE AND INTEGRATED SYSTEMS
Whether you are working in research, testing, or quality control, the introduction of laboratory automation systems requires planning. That planning should include:
- The definition of policies and practices that govern the use of automation technologies and products within a lab
- The definition of relationships between labs (one laboratory supporting the operations of others) and groups (QC sending results to process control)
- The development of workflow models that describe how each lab does its work
Those three points apply to both existing facilities and those in the planning stages. Integrating systems may cause you to step back from the lab as it exists now, plan for the lab-as-you-need-it-to-be, and then develop a migration plan to get there.
A LABORATORY AUTOMATION ENGINEERING6 (LAE) METHODOLOGY
This approach differs considerably from current practice. Systems that work together are the result of planning and engineering.

Current practice is typically:

- Task-oriented, focused on bottlenecks and local work spaces
- A laboratory viewed as a collection of individual task stations
- Technology-driven: how can a product's technology be used to improve a given situation?
- Automation-as-an-extension-of-the-instrument: automation is used as a setup for instrumental analysis or as a post-run activity, with supporting the instrument as the center of focus

An LAE approach, by contrast, is:

- An engineered-systems point of view
- A top-down, structured approach
- Designed to protect the value of a laboratory's products (knowledge, information, and data) and to enhance the lab's (and the larger company's) ability to gain value from those products
The first stage in the planning process is the definition of policies and practices that all automation projects have to address. This is the equivalent of describing the infrastructure of the lab's operations. "Lab" may refer to a single room or a large complex; common infrastructure/automation policies should apply equally to both. Their purpose is to describe the parameters and standards that guide project development and implementation, and to provide a common basis of standardization for communication, software systems, and automation components so that the outcomes of those projects can work together while minimizing support costs.
Those policies should include (they are given a brief treatment here due to space limitations):
The management and protection of the lab’s intellectual property
The knowledge, information, and data that the lab produces (the primary products of the lab's work) need to be managed so that they are retrievable and accessible by anyone who needs them. This would include data file structures and requirements for cataloging, backup, and archiving. Technical details may be left to engineers; however, the expectations of how things should behave and function are management policies.
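A cataloging requirement like the one described above can be made concrete even before the engineering details are settled. The sketch below builds a minimal catalog record for a data file; the field names and layout are illustrative assumptions, not a standard.

```python
import hashlib
import json
import os
import tempfile
import time

def catalog_entry(path, project, operator):
    """Build a minimal catalog record for a data file: identity,
    provenance, and a checksum for later integrity checks.
    Field names here are illustrative, not a standard."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return {
        "file": os.path.basename(path),
        "size_bytes": os.path.getsize(path),
        "sha256": digest,
        "project": project,
        "operator": operator,
        "cataloged_at": time.strftime("%Y-%m-%dT%H:%M:%S"),
    }

# Demonstrate with a throwaway file standing in for an instrument data file.
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False) as f:
    f.write("sample,result\nS-001,5.2\n")
    path = f.name

entry = catalog_entry(path, project="stability-study", operator="jdoe")
print(json.dumps(entry, indent=2))
os.remove(path)
```

The point of such a record is the management policy it encodes: every data file is findable, attributable, and verifiable, whatever storage system eventually sits behind it.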
Control over access to the lab's facilities, protection against unwanted intrusion by malware (e.g., viruses, worms, etc.), and protection against power loss. (Note: protection against power loss is a major issue since instrumentation, computers, etc. may react differently to a power interruption; a couple of seconds' delay in switching from normal to backup power may result in equipment damage or unsafe operating conditions.) In some cases, uninterruptible power backup may be required. Management's responsibility is to make sure that these points are understood and addressed for each project. Where needed, facilities and IT groups may be called on to provide services and support.
Setting standards for documentation, testing, controlled release of software versions, and the use of spreadsheets and other easily modifiable software tools. This is a key consideration since those working on one project may not be the same people who work on a future version or a related project; you want to be sure that whoever works on a project at a later date can understand the details. Software development and customization is a significant portion of most automation projects, so the ability to understand earlier work is basic to a system's supportability and to its ability to exchange data with other automation projects.
Whether you operate in a regulated environment or not, the concepts and practices underlying systems validation are key to the success of an automation project. A system that has not gone through a validation process will be difficult to maintain, support, and upgrade. As managers, you are responsible for setting the criteria that validation protocols have to meet. In larger companies with formal regulatory oversight, those departments may provide insight into the specifications that have to be met.
INTEGRATION: MERGING PROCESSES AND INSTRUMENTS
Integration needs careful consideration. What is it that you are integrating? Unless you are an instrument vendor, you don't integrate instruments (hyphenated techniques such as ICP-MS, for example); you integrate processes that involve instruments and laboratory equipment. Process integration requires a thorough understanding and documentation of the processes, what "integration" means, and how that integration is going to occur. Will the integration be the result of communications via file transfers or message transfers, or will the systems be more tightly linked? Will there be a supervisory system in control, coordinating several devices? You also need to consider when the integration is going to occur: is it part of one project, or a set of separate projects that have to be connected at a later date? Many of the technical details may be left to engineers; however, ensuring that the means (technical and documentation) of carrying out the integration steps exist is management's responsibility and part of the system's acceptance criteria.
This also underscores the importance of software development documentation. If the process and the software supporting that process are not well documented and understood, the likelihood of a successful integration program is small.
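The file-transfer style of integration mentioned above can be sketched in a few lines. In this illustration, an instrument drops result files into a shared location and a small transfer layer parses each file and hands the records to a supervisory queue; the file layout and field names are assumptions for the example.

```python
import csv
import io
import queue

# In-process queue standing in for whatever supervisory system
# actually coordinates the devices.
results_queue = queue.Queue()

def ingest_result_file(text):
    """Parse one dropped result file and enqueue each record
    for the supervisory system."""
    for row in csv.DictReader(io.StringIO(text)):
        results_queue.put(row)

# Simulate a file appearing in the drop directory.
dropped_file = "sample_id,test,result\nS-010,assay,98.7\n"
ingest_result_file(dropped_file)
print(results_queue.get())
```

Even in a sketch this small, the design questions from the text show up: what the file format is, who owns the queue, and what happens when the formats on either side change, which is exactly why the documentation matters.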
In addition, policies should be developed on process management, change management (the human side of laboratory operations, keeping those in the lab informed and providing training to maintain skill sets), project scheduling, systems retention, etc.
A lab manager for a quality control laboratory in a manufacturing facility should be able to determine the policies for that lab’s operations. More complex organizations, such as research laboratories, will require more extensive discussions to provide for data exchanges between groups. A common set of policies and practices is useful since they can help ensure coordination of systems when needed, and reduce development costs.
Successful systems integration is not confined to operations within a lab; it should also take into account exchanges between labs and other departments. This makes the process more complex but may also open up additional funding if the efficiency and effectiveness of the overall facility are improved. It will also require the cooperation of Information Technology groups, since the implementation of intra- and inter-departmental communications is going to be built on corporate network infrastructures.
Once the policies for the lab(s) have been determined, each laboratory should examine the workflow model that describes how the lab should function. In a given facility, different labs will have different structures depending on their mission and how they decide to carry it out. Testing labs (analytical, quality control, etc.) are going to have a different structure than research labs. The need for collaborative work is going to create additional requirements for products and for how the knowledge, information, and/or data produced in the lab is managed.
Modeling laboratory processes results in a graphical description of how the lab works, what the major structural elements are (databases, workflow management systems, document management, etc.), and how they are used. The modeling exercise also provides a means of determining if a potential automation project is ready for automation. If it is not, the areas where issues exist can be analyzed to determine how much effort and cost it will take to overcome those issues and whether or not it is worth it (cost/benefit analysis). This type of workflow analysis will also enable you to determine the ability of each process to be part of an integrated workflow system.
You may also uncover common process elements that can be automated and benefit more than one experimental technique. Doing so will help justify the economics of proposed automation programs.
Figure 1 is an example of one such model.5 The ovals represent databases; the arrows are processes that operate on the database elements (each test/experimental protocol would have a separate process description, as would each analysis technique). In some cases "data" can be converted and still remain "data" (a simple temperature conversion, for example); in others, an analysis of data can yield information (for example, developing a calibration curve of peak area vs. concentration for chromatographic data and then applying it to samples).
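The calibration-curve example of turning data into information can be sketched directly. The values below are made up for illustration: a linear least-squares fit of peak area vs. concentration from standards, then inverted to report concentrations for unknowns.

```python
# Made-up standards: (concentration, peak area) pairs.
standards = [(1.0, 1520.0), (2.0, 3010.0), (5.0, 7480.0)]

def fit_line(points):
    """Ordinary least-squares fit: returns (slope, intercept) for y = m*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

m, b = fit_line(standards)

def concentration(peak_area):
    """Invert the calibration curve: turn a measured peak area
    (data) into a reported concentration (information)."""
    return (peak_area - b) / m

print(round(concentration(4500.0), 3))  # → 3.0
```

The raw peak areas remain "data"; the fitted curve and the concentrations it produces are the "information" that moves to the next database in the model.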
In a quality control lab, the “data” database may represent several instrument databases (e.g., chromatography, mass spec libraries, etc.). The “information” database may be a LIMS. The “knowledge” database would be a document management system that contained current test protocols, hazardous materials documents, etc. In a research system, an electronic lab notebook could combine both “data” and “information” databases into one system.
This is a simplified drawing. In the real world (your world) each process line in the diagram above may in fact be several parallel processes each representing a different experimental protocol or test procedure.
Figure 2 is a model variation that represents three researchers working from a common “data” database, a genome database for example, with each individual working on independent projects.
This model’s structure shows that in order for the lab to be successful, the “data” database needs some particular characteristics:
- The data format for data structure elements should be standardized so that common software routines can be used to read and write to the database
- It should be shareable with multi-user simultaneous read/write access (or people will become frustrated with delays in getting access to the data elements they need)
- It will require a cataloging system that is searchable
- In addition to each individual having their own backup/archiving system, the shared database needs its own backup/archive facility that will ensure that each user retains their access privileges
- Since the data is being shared, the database system has to be designed so that it can span multiple volumes as it grows; the database can only grow as new material is added, because older material has to be retained (its relevance to future projects is uncertain)
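The characteristics listed above can be sketched in miniature. In this illustration, SQLite stands in for whatever shared, multi-user database the lab actually chooses, and the schema and field names are assumptions; the point is a standardized record layout that common routines read and write, with a searchable catalog on top.

```python
import sqlite3

# In-memory database standing in for the shared "data" database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE data_catalog (
        record_id   TEXT PRIMARY KEY,
        project     TEXT NOT NULL,
        researcher  TEXT NOT NULL,
        data_type   TEXT NOT NULL,
        created_at  TEXT NOT NULL
    )
""")

def register(record_id, project, researcher, data_type, created_at):
    """Write one standardized catalog record; every project
    uses this same routine, so the layout stays uniform."""
    conn.execute("INSERT INTO data_catalog VALUES (?, ?, ?, ?, ?)",
                 (record_id, project, researcher, data_type, created_at))

register("G-0001", "genome-a", "researcher1", "sequence", "2007-10-01")
register("G-0002", "genome-b", "researcher2", "sequence", "2007-10-02")

# The catalog is searchable: each researcher finds only what they need.
rows = conn.execute(
    "SELECT record_id FROM data_catalog WHERE project = ?", ("genome-a",)
).fetchall()
print(rows)  # → [('G-0001',)]
```

Multi-volume growth, backup/archive, and true concurrent multi-user access are exactly the properties a production database product would be evaluated on; the sketch only shows the standardization and searchability requirements.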
The models can be used to generate product requirements that will be used in evaluating vendor offerings.
These same models, and their variations tailored to specific situations, can be linked to show how the work done in one lab is transferred to another lab or department. As a lab’s mission changes, the models and policies can be updated.
Rather than using automation products to fix bottlenecks, lab managers should look ahead and determine how they want their labs to operate. Setting policies and practices that govern all automation projects, and developing a workflow model that shows how each automation element fits into the larger picture, will result in a more stable and streamlined operation. Costs should be reduced since you avoid recreating existing projects to make them work with newer systems, and the benefits of purchases are more easily described.
1. LRIG – Laboratory Robotics Interest Group, http://lab-robotics.org.
2. The high level of automated equipment available to the biotech/pharma industries isn't as evident in others. There are several reasons for that, including the funding available. In addition, the experimental protocols are heavily dependent on fluid handling, small sample sizes, and the wide acceptance of microplates as an experiment platform; that standardization is key to the rapid development and easy acceptance of automation systems. Standardization of equipment, as well as data interchange standards, is an essential element of successful automation programs, something that should be evaluated carefully in other disciplines.
3. USB – universal serial bus.
4. Hamilton, S.D., "2006 ALA Survey on Industrial Laboratory Automation," Journal of the Association for Laboratory Automation, 2007, Vol. 12, No. 4, pp. 239-246.
5. A thorough description of these models and policies can be found in the "Manager's Survival Guide to Engineering Laboratory Automation," written by this article's author, Delphinus, Inc., 2007.
6. Liscouski, J., "Are You a Laboratory Automation Engineer?," Journal of the Association for Laboratory Automation, June 2006, Vol. 11, No. 3, pp. 157-162. (An expanded version of this material can be accessed at www.Delphinus-LAE.com/emailforms/RULAE.htm; note that the URL is case sensitive.)
Joe Liscouski is part owner of Delphinus, Inc. in Groton, MA. He is a specialist in the application of automation technologies in laboratories and is working to develop the field of Laboratory Automation Engineering. He is the author of the "Manager's Survival Guide to Engineering Laboratory Automation" and can be reached at JLiscouski@Delphinus-LAE.com; 978-448-2836; www.Delphinus-LAE.com.