Lab Manager | Run Your Lab Like a Business

Guidelines to Ensure Lab Data Integrity

How to prevent data manipulation and other integrity issues in chromatographic integration

by Agilent Technologies

Data integrity has been a hot topic long enough that most GxP chromatography labs have begun to address the core issues, usually with good success. However, there are a few areas where ensuring data integrity remains an elusive goal. One such area is chromatographic integration and data interpretation.

How can labs control integration and avoid the risks of data manipulation—intentional or unintentional—such as changing integration parameters, or relabeling peaks so they aren’t integrated and included in calculations for impurities? What can be done to mitigate the risks, and what SOPs should be put in place? Read on for tips for addressing data integrity issues in chromatographic integration.

Follow these general guidelines

We offer the following principles as a starting point for creating SOPs in controlling integration:

  • Know how key parameters such as peak width and threshold impact the integration of a chromatogram.
  • Never use a default method. Always develop the integration specifically for an analytical procedure.
  • Have a robust and reliable analytical procedure; the purpose of a chromatography data system (CDS) is not to compensate for poor chromatography—good integration requires good chromatography.
  • All integration, in the first instance, must be performed automatically.
  • Chromatography is a comparative technique; therefore, standards and samples must be treated the same throughout the run.
  • Given the regulatory concerns, only perform manual integrations under conditions permitted in your firm's chromatography SOP or analytical procedure.
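To illustrate the first guideline, how parameters such as threshold and minimum peak width change what gets integrated, here is a minimal, generic peak-detection sketch in Python. This is not OpenLab CDS code; the trace, threshold, and width values are purely illustrative:

```python
def detect_peaks(signal, threshold, min_width):
    """Return (start, apex, end) index tuples for regions that rise above
    `threshold` and stay above it for at least `min_width` points."""
    peaks, i, n = [], 0, len(signal)
    while i < n:
        if signal[i] > threshold:
            start = i
            while i < n and signal[i] > threshold:
                i += 1
            end = i - 1
            if end - start + 1 >= min_width:
                apex = max(range(start, end + 1), key=lambda j: signal[j])
                peaks.append((start, apex, end))
        else:
            i += 1
    return peaks

# A toy chromatogram: one large peak and one small, narrow peak.
trace = [0, 0, 1, 5, 9, 5, 1, 0, 0, 1, 3, 1, 0, 0]

print(len(detect_peaks(trace, threshold=0.5, min_width=2)))  # 2 peaks found
print(len(detect_peaks(trace, threshold=2.0, min_width=3)))  # 1: small peak lost
```

Raising the threshold or the minimum width silently drops the small peak—exactly the kind of parameter change that, applied inconsistently between standards and samples, becomes a data integrity risk.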

Automate integration

Manual integration (meaning manual repositioning of baselines, as opposed to manual intervention, which refers to changing integration parameters) is acceptable if your process acknowledges the associated risk and adds a set of reviews commensurate with that risk. However, automation is far more efficient and useful in maximizing data integrity. Recent tests found that integrating each chromatogram automatically saved 15–20 minutes per injection. Integration Optimizer for OpenLab CDS ensures the best possible results by applying the best integration parameters.

Find the right method

In terms of best practices for ensuring data integrity in chromatographic integration, it is important to identify the right method first. Sometimes, the method is the problem—not the manner of integration. A metric that may be useful is simply “be right the first time.” How many times can you set up a run, do the run, process the run, and release data the first time? That eliminates having to go back and reinject something, for example, and having to reprocess it all over again. Metrics could be tracked on how often you get the run right the first time versus how often you must either reprocess or reinject something to get an acceptable set of results. By tracking that metric, you could quickly discover where data integrity problems are in the laboratory and where it would be worth devoting resources.
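The right-first-time metric described above can be tracked with something as simple as the following sketch. The record structure and field names here are hypothetical, not part of any particular CDS:

```python
def right_first_time_rate(runs):
    """Fraction of runs released without any reprocessing or reinjection.
    Each run is a dict with boolean 'reprocessed' and 'reinjected' flags."""
    if not runs:
        return 0.0
    clean = sum(1 for r in runs
                if not r["reprocessed"] and not r["reinjected"])
    return clean / len(runs)

runs = [
    {"reprocessed": False, "reinjected": False},  # right the first time
    {"reprocessed": True,  "reinjected": False},  # had to reprocess
    {"reprocessed": False, "reinjected": False},  # right the first time
    {"reprocessed": False, "reinjected": True},   # had to reinject
]
print(f"{right_first_time_rate(runs):.0%}")  # 50%
```

Trending this number over time, and breaking it down by method or instrument, points directly at where resources are best spent.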

Automate your calculations

When it comes to ensuring data integrity in the calculation of results, automation is once again a huge advantage. Laboratories should perform calculations in an automated format, as close to the source of data as possible. Sometimes, that is a challenge. For example, laboratory personnel sometimes extract data from a plate reader and push it over to a PC, perform calculations using software on the PC, then move the data set to another area to process it for a clinical trial, and so on. The Custom Calculator tool for OpenLab CDS is the perfect alternative to spreadsheets.
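As an illustration of keeping a calculation close to the data, the area-percent calculation below operates directly on integrated peak areas rather than on values copied into a spreadsheet. This is a generic sketch; OpenLab's Custom Calculator uses its own expression syntax, and the peak names and areas here are made up:

```python
def area_percent(areas):
    """Area-normalized percentage for each integrated peak,
    computed directly from the peak table (no spreadsheet hop)."""
    total = sum(areas.values())
    return {name: 100.0 * area / total for name, area in areas.items()}

# Hypothetical peak areas straight from the integration results.
peaks = {"main": 9500.0, "impurity_A": 300.0, "impurity_B": 200.0}
result = area_percent(peaks)
print(round(result["impurity_A"], 1))  # 3.0
```

Because the calculation consumes the peak table directly, there is no manual transcription step to audit and no intermediate file whose provenance must be defended.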

Secure your workflows

Moving data from place to place is a data integrity nightmare. In today's world, where you must be able to trust and defend data, you cannot perform so many maneuvers on it. You must deploy tighter workflows. For example, bring calculations to the data set whenever possible. Bring tools into the environment where data are located and protect them that way. Sometimes, people become enamored with the capabilities of tools and forget that it is really all about data. Protecting and defending the truthfulness of data should be the focus of everything you do.

Practical solutions for maximizing data integrity

The newest release of Agilent OpenLab CDS software includes innovative features and capabilities to address data integrity issues in chromatographic integration. For example:

The Integration Optimizer guides you to the best settings for your analysis and enables easy deployment of those settings to the lab, operationalizing faster, more consistent results. This new feature provides assisted optimization with representative data, so less experienced analysts can do it right and expert analysts can do it fast. Analysts can then save and deploy the optimized integration settings, so everyone uses the same optimized settings for their analysis. This capability can be set up to be role-based, which is important for ensuring data integrity.

Custom Calculator tool for OpenLab CDS. (Image: Agilent Technologies)

The Custom Calculator tool in OpenLab CDS automatically computes unique values directly within the software rather than requiring export to spreadsheets. This removes error-prone calculation steps, allowing you to meet the data integrity requirements of GxP regulations with less effort. You can also display the values within the software or in reports, which ensures they can be reviewed at the appropriate time for your lab's processes.

Harnessing both Integration Optimizer and the Custom Calculator tool within the secure environment of OpenLab CDS helps GxP chromatography labs minimize data integrity risks, particularly compared with the traditional use of spreadsheets. Together, these capabilities enable you to remove high-risk steps from your data analysis process and deliver results in a timely manner.