
Integrating Laboratory Automation

Regardless of what stage your lab is in, planning is essential to the successful application of automation and information technologies.

by Joe Liscouski

Improving the Effectiveness of Laboratory Work

On January 20, 2009, Lab Manager posted an article titled “Knowledge Management” on its website. Within that piece is a quote from a 1998 British government white paper: “Our success depends on how well we exploit our most valuable assets: our knowledge, skills, and creativity. … They are at the heart of a modern knowledge-driven economy.” What does that mean in today’s laboratory, where automation and information technology are commonplace?

The primary products of laboratory work are knowledge, information, and data. If you believe the quote above, how do you change laboratory work and operations to more effectively produce and manage those products? The most common answer is to use automation to make producing data and information faster and less costly and information technologies to enable scientists to analyze, model, and produce better research.

That common answer, by itself, isn’t sufficient to meet the level of sophistication suggested by the quote. Today’s laboratories have a number of database systems for instrumental techniques, LIMS, and electronic lab notebooks. Some contain duplicate sets of information, and some are inaccessible except by using the tools the vendor supplies within a product. Many of the products you work with were designed as isolated entities without regard for your need to move the knowledge, information, and data they hold throughout your lab’s informatics network. Before we can fix the problem, we should understand how we got where we are.

Where we’ve been

The work in lab automation can be divided into two periods, separated by the introduction of programmable software systems in the days of the Intel 4004 (1971) and 8008 (1972) chips. While automation work using software had gone on before that, the cost of such systems inhibited widespread commercial use. The availability of inexpensive computing changed the industry.

Prior to that time, most commercial instrument automation relied on programmable controllers; the most common were timer-driven cams. Process chromatographs used timer cams to control automatic sampling, back-flush valves, and chart signal attenuation controls. With the advent of programmable systems, instrument control and data capture/processing capabilities became more extensive and began to significantly off-load post-processing work, yielding real benefits to lab operations.

The beginning of the digital age

The design of instrument support systems varied with the vendor’s primary commercial focus. Vendors that saw programmable systems as an extension of the instrument developed add-ons with limited functionality; in the chromatography market, that meant integrator-based products. Vendors with a stronger background in computer systems developed products that took advantage of storage, display, and control capabilities and, where available, multi-user, multi-instrument designs. Automation components were developed in response to bottlenecks in instrument use. For chromatography, a technique that received a good deal of attention because of its widespread use:

  •  Autosamplers were introduced to relieve the labor of injecting samples into the chromatograph and to allow the instrument to run unattended during off hours.
  •  Computer/integrator systems were developed to handle the data-processing volume, which became significant with off-hours instrument runs.
  •  Robotics systems were developed to prepare samples for instrument analysis.

The solution to one problem created new bottlenecks, which were solved with additional automation systems. This left us with a patchwork of products that often needed tinkering to work together effectively. Differing technologies from multiple vendors were put together in a way we might never have discovered if all the needed components had been available at the same time: a prototype for something better.

PerkinElmer, in particular, divided the laboratory world into two segments: instrument-computer data stations to service the instruments (data acquisition, analysis, and reporting) and Laboratory Information Management Systems (LIMS) to provide sample tracking/management information for testing laboratories. To a large extent, vendor products and capabilities led customers in the laboratory automation/computing market by trying to anticipate, or define, their needs.

At the same time, authors such as Ray Dessy (from VPI, now at Virginia Tech) taught courses and published a series of articles in Analytical Chemistry that gave the market an idea of what would be possible with these new tools. Jonathan Titus wrote a series of columns in American Laboratory that showed scientists how to program low-cost computers for lab use. Based on their work, experimenters learned how to digitize analog signals and then use software to acquire and process laboratory data.
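
For a sense of what those tutorials covered, here is a minimal sketch of the acquire-and-process pattern they taught, written in modern Python rather than the BASIC or assembler of the period. The read_adc() function is a hypothetical stand-in for a digitizer interface and simply simulates a detector trace; everything else shows the classic baseline-correct-and-integrate processing step.

```python
import numpy as np

def read_adc(n_samples, sampling_interval_s):
    # Hypothetical stand-in for an ADC driver call; a real system
    # would read these values from data-acquisition hardware.
    t = np.arange(n_samples) * sampling_interval_s
    trace = 0.02 * t + np.exp(-((t - 30.0) ** 2) / 8.0)  # peak on a drifting baseline
    return t, trace + np.random.normal(0.0, 0.005, n_samples)

def process(t, y):
    # Classic post-processing: estimate the baseline from the ends of
    # the trace, subtract it, then locate and integrate the peak.
    baseline = np.linspace(y[:20].mean(), y[-20:].mean(), y.size)
    corrected = y - baseline
    dt = t[1] - t[0]
    area = corrected[corrected > 0].sum() * dt  # rectangle-rule integration
    return t[corrected.argmax()], area

t, y = read_adc(n_samples=600, sampling_interval_s=0.1)
apex_time, peak_area = process(t, y)
print(f"peak apex at {apex_time:.1f} s, area ~ {peak_area:.2f}")
```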

During this time frame (1970s–1980s), computer vendors marketed computers, operating systems, and software to make it easier for researchers to connect experiments to computer systems and do their work. As a result, we saw two distinct lines of development emerge: the programmer-scientist and the instrument vendor (along with independent consulting companies), each developing products that reflected its own perspective on customers’ needs. Customers looked at products and tried to see how they would, or could, be made to solve the problems they were encountering: productivity, cost, sample throughput, and so on.

The situation today

Because of the history behind automation within the laboratory, we don’t have “laboratory automation,” but rather automation/computerization of functions and instrument stations within the lab. Vendors have a narrow focus on the techniques and products they specialize in, and the user community is similarly focused. There isn’t a universal view of lab operations except at the management level, where the concerns are human resources, fiscal responsibility, and productivity, not technology development, integration, and management.

Laboratories today, in both research and quality control, depend upon automation and computers to get their work done. In the life sciences, applications such as high-throughput screening and microtiter plate assays require automation systems. According to a 2008 survey by the Association for Laboratory Automation [2], an increasing need for automation is expected (section 3). The email announcing the survey’s availability, dated January 20, 2009, included a statement from Steve Hamilton of Sanitas Consulting in Boulder, Colo.: “Laboratory automation development is being increasingly outsourced to the commercial market,” he said.

Data life-cycle management also depends on information systems, as does the ability to meet the requirements of regulatory agencies. Vendors view data life-cycle management as a storage management issue; that is what they sell. The issue is much larger than that: labs have to maintain access to data a decade or more after experiments are run, yet because of the structure of lab data systems, those data files may be viewable only with the original application software.
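
One practical hedge against that lock-in is to export each run’s results into self-describing, plain-text formats alongside the vendor files. The sketch below illustrates the idea with hypothetical sample and instrument names, using only Python’s standard library; it is one possible convention, not a prescribed format.

```python
import csv
import json
from datetime import date

# Hypothetical record exported from an instrument data station.
record = {
    "sample_id": "QC-2009-0142",
    "instrument": "GC-7",
    "method": "residual-solvents-v3",
    "acquired": date.today().isoformat(),
    "sampling_interval_s": 0.1,
}
trace = [0.001, 0.004, 0.120, 0.950, 0.310, 0.015]  # detector readings

# Metadata as JSON, data points as CSV: both are plain text and
# need no vendor application to read a decade from now.
with open("QC-2009-0142.meta.json", "w") as f:
    json.dump(record, f, indent=2)
with open("QC-2009-0142.trace.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_s", "response"])
    for i, y in enumerate(trace):
        writer.writerow([i * record["sampling_interval_s"], y])
```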

Laboratories aren’t isolated organizations. They connect and interact with other groups. The complexity of today’s laboratory informatics environment requires lab management to pay more attention to product life-cycle planning, upgrade management, device-independent programming, data life-cycle management, and the technology implications of outsourcing.

How do we move forward?

The end result is that laboratory managers have to do a better job of technology planning and management within their labs. It isn’t enough to purchase products to solve a specific immediate problem. Managers have to look at product life cycles [1] to see where each product fits within a vendor’s plans and how it fits within the lab’s informatics planning. (Are you going to commit resources to a product past the midpoint of its life cycle, only to discover that you’ve boxed yourself into a system that may need replacement or updating?) Outsourcing isn’t going to be successful unless an overall plan [3] is in place to show how the components and systems in labs fit together; part of the work specifications will come from the need to provide connections to other systems. In addition, there is an increasing need to develop data interchange and communications standards to move the implementation of lab systems from isolated data stations and database systems to more effective, integrated, laboratory-wide systems, working toward integrated laboratory automation.

Three elements are needed to reach this integrated laboratory automation goal:

  •  Develop data interchange/communications standards. Unless we address this issue and make it possible to move data between products in the lab as easily as we can in an office environment, our ability to reach the goal of integrated laboratory automation will be frustrated. This capability exists in computer-aided design, document preparation, and graphics design. It is the basis of consumer choice and connectivity in telephone systems and consumer electronics in general. Why, then, don’t we have it in the lab? An attempt at data interchange standards for chromatography and mass spectrometry was made by the Analytical Instrument Association in the 1990s with the development of the ANDI program, which has effectively stalled. The project had been transferred to ASTM; the Army Corps of Engineers funded a similar project for ICP work. A new initiative with broad user support is needed (see the sketch after this list for what ANDI-style interchange looks like in practice).
  •  Laboratory management—and, in some cases, senior management at the director level—must take on the responsibility for laying down the policies and practices that provide the foundation for automation. This is on par with management oversight for any significant corporate information systems program. Once those policies and practices are in place, operational lab models can be developed that can be used as a basis of product selection and project design. The Institute for Laboratory Automation is working on this requirement.
  •  Develop laboratory automation engineers who are trained in the science and technology of laboratory automation systems. These people need to be able to understand the science that underlies the lab’s work, translate it into functioning systems, and coordinate those systems so that, where needed, they function as an integrated information/informatics environment. Again, this is part of the work that the Institute for Laboratory Automation has undertaken.
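
To make the interchange idea concrete, here is a minimal sketch of writing and reading a chromatogram as an ANDI-style netCDF file using the netCDF4 Python package. ANDI files are ordinary netCDF, so any netCDF reader can open them without the originating vendor’s software, which is exactly the vendor independence the standard was after. The file name is hypothetical, and the variable and attribute names follow the ANDI chromatography convention as best I recall it; check them against the ASTM specification before relying on them.

```python
import numpy as np
from netCDF4 import Dataset  # pip install netCDF4

trace = np.exp(-((np.arange(600) * 0.1 - 30.0) ** 2) / 8.0)

# Write the chromatogram as a classic-format netCDF file, the base
# format ANDI used, with assumed ANDI-style names.
with Dataset("sample.cdf", "w", format="NETCDF3_CLASSIC") as ds:
    ds.dataset_completeness = "C1"  # raw data only
    ds.detector_name = "FID"
    ds.createDimension("point_number", trace.size)
    ordinate = ds.createVariable("ordinate_values", "f4", ("point_number",))
    ordinate[:] = trace
    ordinate.uniform_sampling_flag = "Y"

# Reading it back needs nothing from the original vendor.
with Dataset("sample.cdf") as ds:
    y = ds.variables["ordinate_values"][:]
    print(len(y), "points from", ds.detector_name)
```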

A structured approach to the planning, design, and development of laboratory automation systems will benefit the lab and the organization that depends on it, reducing the cost of operations and streamlining workflow so that people can be more effective. Once we’ve begun to implement systems using a structured methodology, we can begin to design labs in which automation/computerization is no longer considered a replacement or upgrade technology but rather part of the basic underpinnings of the lab’s operations.

“But I already have a running lab”

Most readers of this article work in labs that already have some investment in automation and computing; do these ideas apply only to new labs? They can apply to any lab at any stage in its development. Your present lab automation/computing environment is not the one that will exist in the future. Things are going to change, and those changes will present opportunities to reevaluate what you need and where you are going. The first step is to ask yourself and those working with you two simple questions: “Knowing what we know now, if we had to do it over again, what would we change? How could we do it better?”

Assuming your answer doesn’t involve becoming a hermit or turning your hobby into a career, the place to start is with a plan: given a blank sheet of paper and a serious look at the questions above, how do you want your lab to operate? This is the point where you:

  •  Lay out a plan that describes the desired situation, based on your knowledge of the lab and any expected changes.
  •  Draft a document that gives the benefits of the new automation/computing topography: how will the lab’s operations improve? Will there be any cost reductions? Will productivity improve? Why would anyone invest in the new structure?
  •  Draft a plan for making the transition from the current situation to the new architecture.

The new architecture and the transition plan will change over time as the lab changes; tweaking and adjustments based on new information and new perspectives are part of the game. With these items in hand, you have a basis for questioning vendors (existing and prospective) about their product direction and for telling them what changes you need: YOU drive the process. You also have a basis for evaluating new products/technologies coming into the lab and for tailoring existing ones. How will they help you get to where you want to be?

Regardless of what stage your lab is in, planning is essential to the successful application of automation and information technologies.

References

[1] “Technology Management: Product Life Cycle,” Lab Manager Magazine, July/August 2008, pages 20–23.

[2] http://www.labautopedia.org/mw/index.php/ALA_Laboratory_Automation_Survey_2008:Section_3

[3] “Outsourcing Laboratory Work—Establishing the Necessary Policies and Practices,” April 2008 Supplement to BioPharm International, pages 34–39.