Lab Manager | Run Your Lab Like a Business

Challenges and Trends in Chromatography Data System Validation

Trends in chromatography data systems in regulated environments center on data integrity.

by
Rachel Muenz

Rachel Muenz, managing editor for G2 Intelligence, can be reached at rmuenz@g2intelligence.com.


Dr. Bob McDowall is an analytical chemist with over 40 years’ experience, including over 30 years of validation experience working with laboratory information management systems (LIMSs), chromatography data systems (CDSs), and other laboratory informatics systems. Bob writes the Focus on Quality column for Spectroscopy and the Questions of Quality column in LC-GC Europe. He received the 1997 LIMS Award from the LIMS Institute in recognition of his input on the subject and his teaching. He has written widely on the subject of electronic working and the use of electronic signatures in regulated laboratories, and he is author of the book Validation of Chromatography Data Systems, the second edition of which was recently published by the Royal Society of Chemistry.


Q: What are some of the recent trends in CDSs?

A: Trends in chromatography data systems in regulated environments center on data integrity. CDSs have been involved in a number of major falsification warning letters and fraud cases since 2005. Other instrument data systems have also been involved, but CDSs have a starring role. Many of the validations that people have done in the past are not adequate, and the reason is that we focus on the application and not on the protection of the underlying records. Basically, the problem is that many systems are stand-alone, which means that unless there’s adequate security, a user can have access to the system clock, the recycle bin, and the data files, which are stored directly within the operating system rather than within a database. So the architecture’s wrong. The other thing is that stand-alone workstations have only a single hard drive, which is a single point of failure, and many laboratories fail to back up their records adequately.


Q: What do those issues mean for labs?

A: One, labs have to revisit the validation to include the protection of the underlying electronic records that are generated during the analysis, but two, suppliers also have to address architecture issues. Stand-alone workstations in my view are inadequate for running chromatography data systems in a regulated laboratory. You need to transfer all the data to the network and back it up or store it on resilient hardware, so if there is a failure you don’t lose the data.
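One simple safeguard implied by this answer, confirming that data moved off the workstation to the network is byte-identical to the original before the local copy is relied on, can be sketched with a checksum comparison. This is an illustrative sketch, not a feature of any particular CDS; the file paths and function names are hypothetical.

```python
import hashlib
from pathlib import Path

def sha256sum(path):
    """Compute a file's SHA-256 digest, reading in chunks to handle large raw data files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_copy(local: Path, networked: Path) -> bool:
    """True if the networked copy matches the local original byte for byte."""
    return sha256sum(local) == sha256sum(networked)
```

A routine like this could run after each transfer so that a failed or partial copy is caught while the workstation's copy still exists.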

Q: What are the key benefits of CDS validation?

A: Most people’s perception of validation is that it’s a pain. They see it as a burden rather than a benefit. I take a completely different view—the process of validation is getting the right system for the right job. If you design the process right, you can get rid of a large amount of paper and recoup more than the cost of validating the system through increased efficiency and faster throughput in the laboratory.

Q: How long does CDS validation normally take?

A: The fastest I have done it is nine weeks. However, for most people, if you’ve got the right knowledge of the package, adequate resources, and you know what you’re doing, I would say the core system could be validated in about three months. If you don’t, then it takes a little bit longer, [even] up to 12 months, which is clearly inadequate because the company’s shelled out all this money to buy the system, the user licenses, and everything else. Then you spend an age fussing around trying to figure out what you’re going to do, or you don’t have knowledge of computer validation and you rely totally on your supplier to do some jobs. It may be that if you just have a simple IQ [installation qualification] and OQ [operational qualification] of the software, you think it’s validated, but nothing could be further from the truth. You have to write your own user requirements, have traceability for your requirements, document the configuration of the application and the security setup, and demonstrate intended use, which will be confirmed against your user requirements and configurations. It’s not rocket science, but if you’re not used to it, you can be a little out of your comfort zone.
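The traceability Dr. McDowall describes, linking each user requirement to the test evidence that demonstrates intended use, can be illustrated with a minimal sketch. The requirement and test identifiers below are hypothetical, invented purely for illustration.

```python
# Minimal sketch of a requirements traceability check.
# All IDs and wording are hypothetical examples.

requirements = {
    "URS-001": "System shall require unique user logins",
    "URS-002": "System shall record an audit trail of data changes",
    "URS-003": "System shall support electronic signatures",
}

# Test evidence gathered during qualification, keyed by requirement ID.
test_evidence = {
    "URS-001": ["OQ-TC-01"],
    "URS-002": ["OQ-TC-07", "PQ-TC-02"],
}

def untraced(reqs, evidence):
    """Return requirement IDs that have no linked test evidence."""
    return sorted(r for r in reqs if not evidence.get(r))

print(untraced(requirements, test_evidence))  # URS-003 lacks evidence
```

Even this toy version shows the point of the exercise: a gap in the matrix is a requirement whose intended use has not been demonstrated.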

Q: What are some of the biggest challenges labs face when validating CDSs?

A: First, understanding the regulations—which ones do you work to, what do they say, how do you interpret them? Second, effective risk management—how much do you need to do? One of the biggest challenges is trying to leverage what the software supplier has done in the development and testing process prior to release in your validation to reduce a lot of [your own testing]. What a lot of companies do, which can be wrong, is just send a standard paper questionnaire. I think you need to look at how the software is tested internally so you can justify reducing the amount of work you do in the regulated lab to demonstrate that it is fit for its intended use. The other big challenge, I find, is that if it’s a new data system for the lab, [users] must understand how the system works. If they don’t, the validation falls on its face, simply because the requirements will be incompletely specified, you think the data system will do one thing but [it] does something completely different, or you think it’s going to do one thing and it doesn’t do that at all.

Q: What are some ways lab professionals can handle those challenges?

A: Understand the regulations—a plain, simple interpretation. The problem here is that we all work to the same regulations and everyone interprets them differently. A CDS is in part an out-of-the-box solution and in part a configured application, but many companies apply a one-size-fits-all solution. They don’t carry out effective risk management and they apply over-the-top testing. It’s too much. The other thing is that they don’t often resource projects particularly well, and that’s one of the reasons why you get a long delay in the validation. It takes a long time because people are doing other things and they’re not focusing on that. And the failure to leverage supplier testing is [because] they basically just go through a paper-based process: send them the supplier questionnaire, [they check] all the boxes, and it’s done.

Q: What can happen if a CDS is not validated properly?

A: In the worst cases, you will get regulatory observations. You can get a 483 [if you follow FDA regulations] or a warning letter. Or in the worst case of falsification of data, you could get either a consent decree, a permanent injunction, or, if you’re a foreign company, an import ban of your products. The bottom line is that it can be very bad for business.

Q: Your book is one resource available to lab professionals on CDS validation. What are some other resources you have found to be helpful?

A: The most current one, though it was published in 2012, is the GAMP [Good Automated Manufacturing Practice] good practice guide for the risk-based validation of laboratory computerized systems. That was published by ISPE, the International Society for Pharmaceutical Engineering, and it has a comprehensive approach to the validation of laboratory instruments and systems.

Q: How do you expect CDS validation will change in the future?

A: I’d like to automate it, because at the moment it’s all paper-based, with Word and Excel. The problem is that, unless you’re going to execute it a sufficient number of times, an automated system would never pay for itself. The expectation always outstrips the ability to deliver. I don’t think it’s going to get any simpler, because the regulations and the data integrity guidances are getting more and more onerous.

Q: What should lab professionals look for when choosing a CDS for their lab(s)?

A: I wrote a four-part “Future of CDS” series with a colleague and friend of mine, Chris Burgess, and we came up with some 15 suggested areas. I think, first, the architecture. Even for a simple “one- or two-instrument system,” you don’t have stand-alone workstations. You must have the ability to collect the data directly onto a network server. And in this day and age, with the ability to get virtual servers built for essentially nothing, the argument that “Oh, it’ll cost money” is, quite frankly, [ridiculous]. So, no stand-alone workstations: acquire to a virtual server on the network, even for a single chromatograph. That gets backup out of the lab and into the IT department, which is what they’re paid to do in the first place.

The other thing is around the ability to have more effective audit trails. The issue here is that there’s an implicit requirement from the FDA to review audit trails, and that’s now appeared in its draft data integrity guidance. There’s also an explicit requirement in the European regulations for regular review of audit trails. The way that many data systems are set up, this is not easy. You have to trawl through loads of data, whereas the system should actually identify if something has been modified or—if you allow it—deleted. The next thing is to make certain that you can use the system in an electronic way, so it needs to have electronic signatures. The other thing I would like to see that I don’t think is available now would be automatically generated column logs and instrument logs, so rather than writing what you’ve injected, the data’s already in the CDS. You’ll also want to make sure that it controls your instrumentation, because some data systems will control only a specific supplier’s chromatographs and others will have a wider range.
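The kind of review this answer calls for, surfacing modified or deleted records rather than trawling through every entry, could look like the following sketch. The record layout and field names are hypothetical; a real CDS audit-trail export will differ.

```python
# Hypothetical audit-trail entries; field names are invented for illustration.
audit_trail = [
    {"time": "2016-03-01T09:14", "user": "jdoe",   "action": "create", "record": "INJ-0041"},
    {"time": "2016-03-01T09:52", "user": "jdoe",   "action": "modify", "record": "INJ-0041"},
    {"time": "2016-03-02T14:03", "user": "asmith", "action": "delete", "record": "INJ-0040"},
]

def review_candidates(entries, actions=("modify", "delete")):
    """Filter the trail to the entries a reviewer must examine."""
    return [e for e in entries if e["action"] in actions]

for e in review_candidates(audit_trail):
    print(e["time"], e["user"], e["action"], e["record"])
```

The design point is the one made above: the system, not the reviewer, should do the filtering, so that a regular audit-trail review focuses only on changes and deletions rather than on every routine acquisition.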