Five Necessary Elements for Integrating Lab Systems

Where We Are vs. Where We Need to Be

by Joe Liscouski

The lab systems we have today are not built for system-wide integration. They are built by vendors and developers to accomplish a set of tasks, and connections to other systems are either not considered or are avoided for competitive reasons. If we want to build integrated systems, the following five elements are needed:

• Education

• User community commitment 

• Standards—file format and messaging/interconnect 

• Modular components 

• Stable operating system environment 

Education 

Facilities with integrated systems are built by people trained to do it. But the educational issues don’t stop there. Laboratory management needs to understand its role in technology management. It isn’t enough to understand the science and how to manage people, as was the case 30 or 40 years ago; managers have to understand how the work gets done and the technology used to do it. The effective use, or misuse, of technology can have as big an impact on productivity as anything else. The science also has to be adjusted for advanced lab technologies: method development should be done with an eye toward method execution. Can this technique be automated?

User community commitment 

Vendors and developers aren’t going to provide the facilities needed for integration unless the user community demands them. Suppliers will have to spend resources to meet the demands for integration, and they aren’t going to do this unless there is a clear market need and users force them to meet that need. If we continue with “business as usual” practices of force-fitting things together and not being satisfied with the result, where is the incentive for vendors to spend development money? The choices come down to these: you purchase only products that meet your needs for integration, you spend resources trying to integrate systems that aren’t designed for it, or your labs continue to operate as they have for the past 30 years, with incremental improvements.

Standards 

Building systems that can be integrated depends on two elements in particular: standardized file formats and messaging/interconnect systems that permit one vendor’s software package to communicate with another’s. 

File format standards: The output of an instrument should be packaged in an industry-standard file format that allows it to be used with any appropriate application. The structure of that file format should be published and include the instrument output plus other relevant information such as date, time, instrument ID, sample ID (read via barcode or other mechanism), instrument parameters, etc.
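As an illustration of what such a packaged file might contain, consider the sketch below, which serializes a run to JSON. Every field name here is hypothetical; a published standard would define and version the actual structure (the sketch is in Python):

    import json
    from datetime import datetime, timezone

    # A minimal sketch of a vendor-neutral instrument data file.
    # All field names are hypothetical; a real standard would publish
    # and version this structure.
    record = {
        "format_version": "1.0",
        "instrument_id": "HPLC-07",          # hypothetical instrument ID
        "sample_id": "S-2012-00451",         # e.g., read from a barcode
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "parameters": {                      # instrument settings in effect
            "flow_rate_ml_min": 1.0,
            "column_temp_c": 35.0,
        },
        "data": {                            # the measured values themselves
            "time_s": [0.0, 0.5, 1.0],
            "absorbance_mau": [0.2, 15.7, 3.1],
        },
    }

    with open("run_00451.json", "w") as f:
        json.dump(record, f, indent=2)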

In the 1990s the Analytical Instrument Association (now the Analytical and Life Science Systems Association) had a program under way to develop a set of standards for chromatography and mass spectrometry. It was a good first attempt, but several problems with it bear noting. The first is found in the name: Analytical Data Interchange Standard. It was viewed as a means of transferring data between instrument systems and thus served as a secondary file format, with each vendor’s own format remaining primary. This has regulatory implications, since the FDA requires that primary data be stored and used to support submissions. It also means that files have to be converted between formats as they move between systems.

Ideally, the standard format would be THE format for an instrumental technique. Data collected from an instrument would be written in that format and read and used by every vendor’s software. In fact, it would be feasible to have a circuit board in an instrument that functions as a network node: it would collect and store instrument data and forward it to another computer for long-term storage, analysis, and reporting, thus separating data collection from data use. A similar situation already exists with instrument vendors that use networked data collection modules.
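To make that separation concrete, here is a minimal sketch of a collection node that only acquires readings and forwards them; storage, analysis, and reporting happen downstream. The queue stands in for whatever network transport a real module would use, and every name in it is hypothetical:

    import queue
    import threading

    # Hypothetical sketch: a collection "node" that only acquires and
    # forwards readings; storage, analysis, and reporting live elsewhere.
    outbound = queue.Queue()

    def collect(readings):
        """Acquire raw readings and hand them off without interpreting them."""
        for r in readings:
            outbound.put(r)
        outbound.put(None)  # sentinel: acquisition finished

    def consume():
        """Downstream computer: archives and analyzes whatever arrives."""
        while (r := outbound.get()) is not None:
            print("archived reading:", r)

    consumer = threading.Thread(target=consume)
    consumer.start()
    collect([0.2, 15.7, 3.1])  # stand-in for real detector output
    consumer.join()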

The issue is further complicated by the nature of analytical work. A data file is meaningless without its associated reference materials—standards, calibration files, etc.—that are used to develop calibration curves and evaluate qualitative and quantitative results. While file format standards are essential, so is a second-order description—sample set descriptors that provide a context for each sample’s data file. A sample set might be a sample tray in an autosampler; the descriptor would be a list of the tray’s contents (standards, sample ID, etc.). 
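A sample set descriptor could be sketched along the same lines. The structure below, again with invented field names, ties each tray position to its role and its data file so that the files can be interpreted in context:

    import json

    # Hypothetical descriptor for one autosampler tray. Each entry ties
    # a tray position to its role (standard, blank, sample) and to the
    # data file produced for it.
    tray_descriptor = {
        "tray_id": "TRAY-12",
        "positions": [
            {"position": 1, "role": "standard", "sample_id": "STD-100ppm",
             "data_file": "run_00449.json"},
            {"position": 2, "role": "blank", "sample_id": "BLANK-01",
             "data_file": "run_00450.json"},
            {"position": 3, "role": "sample", "sample_id": "S-2012-00451",
             "data_file": "run_00451.json"},
        ],
    }

    print(json.dumps(tray_descriptor, indent=2))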

The second issue with the AIA’s program was that it was vendor-driven with little user participation. The transfer to the ASTM should have resolved this, but by that point user interest had waned. People had to buy systems, and they couldn’t wait for standards to be developed and implemented. The transition from proprietary file formats to standardized formats has to be addressed in any standards program.

The third issue is standards testing. Before customers are asked to commit their work to a vendor’s implementation of a standard, they should have assurance from an independent third party that the implementation works as expected.

Messaging/Interconnect standards 

Developers and vendors design programs to be self-standing—the software works as though nothing else existed, and it is self-sufficient for all critical tasks. That is a reasonable viewpoint, since it may in fact be true: there is no standard suite of lab software. But software also exists and functions in concert with other programs, and those programs may need to exchange data elements. We need a standard for intertask communications. The advent of ELNs only raises the level of complexity. Files can be imported and exported, but if we want integration, we need communication between elements. That includes the modules used in sample preparation as well as large instrument data systems, LIMS, and ELNs. Some vendors use PDF files as a means of information exchange; while this works, it is not ideal for engineered message transfer.
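As a sketch of what an intertask message might look like, consider the minimal envelope below. The field names and the “result.available” event are invented for illustration; a real standard would define the vocabulary and the transport:

    import json
    import uuid
    from datetime import datetime, timezone

    def make_message(sender, event, payload):
        """Wrap a payload in a hypothetical standard envelope so any
        receiving application can route and interpret it."""
        return json.dumps({
            "message_id": str(uuid.uuid4()),
            "sender": sender,              # e.g., an instrument data system
            "event": event,                # e.g., "result.available"
            "sent_at": datetime.now(timezone.utc).isoformat(),
            "payload": payload,            # the data elements being exchanged
        })

    # A data system announcing a finished result to a LIMS or ELN:
    msg = make_message("cds.hplc07", "result.available",
                       {"sample_id": "S-2012-00451", "amount_mg_l": 12.4})
    print(msg)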

Modular systems 

The previous paragraph notes that vendors have to assume that their software may be running in a stand-alone environment in order to ensure that all the needed facilities are available to meet the users’ needs. This can lead to duplication of functions. A multiuser instrument data system and a LIMS both need a sample login. If both systems exist in the lab, you’ll have two sample login systems. The issue can be compounded with the addition of more multi-instrument packages. 

Why not break down the functionality in a lab and use one sample login module? It is simply a multiuser database system. If we were to do a functional analysis of the elements needed in a lab with an eye toward eliminating redundancy and duplication—designing components as modules—integration would be a simpler issue. 
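Such a shared login module might expose no more than an interface like the following hypothetical sketch, which a LIMS, an instrument data system, and an ELN could all call instead of each keeping its own sample table:

    # Hypothetical sketch of one shared sample-login module.
    # Both the LIMS and the instrument data system call this service
    # rather than maintaining duplicate sample tables.
    class SampleLogin:
        def __init__(self):
            self._samples = {}   # stand-in for a multiuser database

        def register(self, sample_id, submitter, tests):
            """Log a sample in once, for every system in the lab."""
            self._samples[sample_id] = {"submitter": submitter, "tests": tests}

        def lookup(self, sample_id):
            """Any module (LIMS, data system, ELN) reads the same record."""
            return self._samples.get(sample_id)

    login = SampleLogin()
    login.register("S-2012-00451", "jsmith", ["assay", "purity"])
    print(login.lookup("S-2012-00451"))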

A modular approach—login module, lab management module, modules for data acquisition, chromatographic analysis, spectra analysis, etc.—would provide a more streamlined design with the ability to upgrade functionality as needed. For example, a new approach to chromatographic peak detection and peak deconvolution could be integrated into an analysis method without having to reconstruct the entire data system. 
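The sketch below shows how such a swap might look in code: peak detection sits behind a small interface, and a new algorithm drops in without touching the rest of the data system. All names are invented for illustration:

    from typing import List, Protocol

    # Hypothetical module boundary: anything that finds peaks in a trace.
    class PeakDetector(Protocol):
        def find_peaks(self, trace: List[float]) -> List[int]: ...

    class ThresholdDetector:
        """Original module: flags points above a fixed threshold."""
        def find_peaks(self, trace):
            return [i for i, y in enumerate(trace) if y > 10.0]

    class SlopeDetector:
        """Replacement module: flags local maxima instead."""
        def find_peaks(self, trace):
            return [i for i in range(1, len(trace) - 1)
                    if trace[i - 1] < trace[i] > trace[i + 1]]

    def analyze(trace, detector: PeakDetector):
        """The rest of the data system is unchanged whichever detector is used."""
        return detector.find_peaks(trace)

    trace = [0.2, 15.7, 3.1, 0.4, 12.2, 0.3]
    print(analyze(trace, ThresholdDetector()))  # old algorithm
    print(analyze(trace, SlopeDetector()))      # upgraded algorithm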

When people talk about modular applications, the phrase “LEGO®-like implementation” comes up. It is a good illustration of what we’d like to accomplish: easily connectable blocks and components that can be assembled into a wide variety of structures, all based on a simple, standardized connection concept. There are two differences we need to understand. With LEGO, almost everything connects; in the lab, connections need to make sense. And LEGO is a single-vendor solution, which, unless you’re THE vendor, isn’t a good model. A LEGO-like multisource model (including open source) of well-structured, well-designed, and supported modules that could be connected and configured by the user would be an interesting approach to the development of integrated systems.

Modularity would also be of benefit when upgrading or updating systems. With functions distributed over several modules, the amount of testing and validation needed would be reduced, and it should be easier to add functionality. This is what systems engineering—laboratory automation engineering—looks like when you consider the entire lab environment rather than implementing products task by task in isolation.

Stable operating system environment 

The foundation of an integrated system must be a stable operating environment. Operating system upgrades that require changes to application code are disruptive and lead to a loss of performance and integrity. It may be necessary to forgo the bells and whistles of some commercial operating systems in favor of open-source software that provides the required stability. Upgrades should deliver improvements in quality and functionality, and only where the change has a clear benefit to the user.

Where do we go from here? 

At some point the steps described here are going to have to be taken. Until they are, labs will be committing the results of their work to products and formats they have little control over. Proprietary file formats that limit a company’s ability to work with its own data should be replaced with industry-standard formats that give users the flexibility to work as they choose with whatever products they need.

We need to renew the development of industry-standard file formats, not just from the standpoint of encapsulating data files but also to ensure that the data is usable. The initial focus for each technique needs to be a review of how laboratory data is used, particularly with the advent of hyphenated techniques, and that review should serve as the basis for defining the layers of standards needed to develop a usable product.

At a recent ELN and LIMS forum held in Milan, Italy (September 25–27, 2012), users expressed continued frustration with the lack of movement. The continued development of the AnIML (Analytical Information Markup Language) standard holds some promise, since it addresses both data formats and context, as noted above. In addition, a new organization, the Allotrope Foundation, has been funded by pharmaceutical companies and may provide some direction.

Overcoming the barriers to the integration of laboratory systems requires a change in mind-set on the part of lab management and those working in the labs. That change will result in a significant difference in the way labs work, yielding higher productivity, a better working environment, and an improved return on a company’s investment in its lab’s operations. Waiting for that change to occur isn’t going to produce the results needed. The user community needs to take a leadership role, coming together to provide direction to developers.