Lab Manager | Run Your Lab Like a Business

Method Development and Validation for Pharmaceuticals

Associate editor Lauren Everett speaks with Dr. Mehdi Yazdi of SEQENS North America on the process of method validation in the pharma industry

by
Lauren Everett

Lauren Everett is the managing editor for Lab Manager. She holds a bachelor's degree in journalism from SUNY New Paltz and has more than a decade of experience in news...


Dr. Mehdi R. Yazdi, Director of Analytical Services, SEQENS

Dr. Mehdi Yazdi has more than 25 years of experience, with a solid understanding and knowledge of all aspects of the pharmaceutical business in the areas of analytical chemistry, product development, product formulation, process chemistry, CMC, and FDA filings. He has managed or worked with method development and method validation departments, QC departments, formulation departments, stability study programs, and outsourcing to various organizations, all under cGMP, FDA, DEA, ISO-9000, EPA, and OSHA regulations. He has also established several highly productive, compliance-driven analytical departments at various organizations from the ground up.

He started his career at Ciba-Geigy, a large specialty/pharmaceutical company. His background spans the pharmaceutical industry, including bulk pharmaceutical manufacturers, contract manufacturing organizations, contract research organizations, specialty pharmaceutical companies, API manufacturers, and finished-dose pharmaceutical manufacturers. Dr. Yazdi joined SEQENS (formerly PCI Synthesis) in May 2011. He received his PhD in analytical chemistry in 1989, in the areas of chromatography and spectroscopy.


Q: How have method development and validation evolved in recent years as regulations change and instrumentation/technology improves?

A: The concept of developing or validating a method for a sample of drug substance (DS) or drug product (DP) has changed drastically, due both to new sets of FDA and ICH guidelines and to advances in analytical instrumentation and its ease of use. What used to be an acceptable method, or an acceptable validation protocol, no longer meets the standards. One has to think outside the box and understand that the traditional approach to filing DMFs (Drug Master Files), NDAs (New Drug Applications), or ANDAs (Abbreviated New Drug Applications) is no longer acceptable. FDA and ICH guidelines are only the foundation for a task, not the whole concept for meeting the compliance level or the regulation of a process. Each organization also needs adequate numbers of the right instruments in place in order to comply with the regulations. For example, one may need to monitor the content of a genotoxin or a residual solvent at 1 ppm to sub-ppm levels on a routine basis. Does the facility have the right equipment to accurately analyze the sample for that analyte?

Each organization must understand that when it comes to compliance and regulations, the FDA would like to know:

• Do you have control over the whole process? 
• Do you understand and monitor the whole process? 
• Do you have the right analytical instrumentation to monitor the process? 
• Do you have a system in place to ensure the integrity of the data, and do they align with the latest regulations? 

That is why it is so important to understand a CDMO’s (contract development and manufacturing organization) capabilities and experience before awarding a project. The one-size-fits-all concept is no longer applicable in a cGMP manufacturing facility (especially in a CDMO) because of each product’s specific chemistry, route of administration, maximum daily dose, and duration of drug administration. Now more than ever, each organization must have a group of well-qualified analytical and organic scientists, chemical engineers, and process chemists, and a QC (quality control) department fully equipped with the latest technology, in order to deliver a product that meets the specification and satisfies the agency.

Q: What sort of questions should lab managers/researchers ask, or what key information do they need, before undertaking the planning of analytical method validation?

A: As I indicated earlier, the one-size-fits-all concept no longer applies when one would like to develop or validate a method for any task. When developing a method, the scientist first has to:

1. Understand the process chemistry, the components that may possibly exist in the sample, the physicochemical characteristics of the molecule, the impurity profile of the sample, and whether the sample contains any genotoxic component. 
2. The scientist next needs to understand the likely fate of each of these components. Do they remain in the sample, or will they be purged out? 
3. The scientist needs to identify the specification for each sample, aligned with ICH and regulatory guidelines. A key question: does it demonstrate process control? 
4. Next, the analytical scientist will develop a method or methods for each step of the process, and most often, it will be a chromatographic technique, mainly due to method sensitivity, specificity, and the need for tracking the process chemistry each step of the way.

When it comes to method development, the firm has to have experienced analytical and organic scientists, engineers, and regulatory staff, and a well-equipped analytical department, in order to design a suitable specification for each step and develop methods for their intended use. That is why it is so crucial to find an appropriate facility for the manufacture of the DS or DP, and why it is so important to focus on the strength of the analytical department for any project. If one can’t measure it, one can’t optimize the process. And if the process is not optimized, the FDA filing will be delayed, and delays are often very costly. Once the methods have been developed, the next task is to determine with what degree of certainty they are suitable and reliable for their intended use.

After all, the entire concept behind method validation is to ensure the safety and efficacy of the drug, and to have a system in place that demonstrates the integrity of the data. The success of each method validation depends on a well-thought-out process, the specification, an experienced and capable analytical department, the method, material balance, and data integrity. If any of these elements is missing, there is a good chance that the product launch will experience costly delays.

Q: How can poor validation design affect or stall the drug development process?
A: What used to be an acceptable DMF, NDA, or ANDA filing is no longer acceptable. This is mainly because regulatory agencies now want to see a process that is fully under control and well understood, with checks and balances in place to prove that control and the integrity of the data, all under full cGMP compliance. Why are these new requirements so important? To ensure drug safety and efficacy. Having said that, method validation is one of the key components of this equation.
A successful method validation depends on several factors: 

1. Is the method developed for its intended use? 
2. Is the sample specification designed for its purpose and justifiable? 
3. Is the process chemistry well understood, and is it under control? 
4. Does the specification comply with ICH and FDA guidelines? 
5. What is the maximum daily dose (MDD) of the drug product? 
6. Are the validation protocol and acceptance criteria designed around the sample specification? 
7. Have meaningful forced degradation studies been conducted for the product? 

If any of these factors are not considered carefully, the FDA will request further justification, additional method development/validation, changes to the specifications, and/or process changes. Poorly designed handling of these tasks will therefore delay, or sometimes even stall, product development or product launch, either of which is very costly. 

In one example of poor validation design, based on the MDD of the product, the concentration of each unknown impurity should be not more than 0.05% in the sample. However, the validation protocol and the method are based on a 0.10% specification (which typically is the norm in the industry). This shortcoming will certainly be followed by an FDA deficiency letter, ensuring a filing delay while the poorly executed validation is revisited.
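The link between MDD and the unknown-impurity limit can be sketched numerically. The Python snippet below is a simplified illustration of the threshold logic in ICH Q3A(R2) for drug-substance impurities; the function name is mine, and any real specification must be taken from the current guideline text, not from this sketch.

```python
def q3a_thresholds(mdd_g: float) -> dict:
    """Illustrative ICH Q3A(R2)-style impurity thresholds (% of drug substance)
    for a given maximum daily dose (MDD) in grams per day."""
    if mdd_g <= 2.0:
        # 1.0 mg/day intake expressed as a percentage of the MDD
        intake_pct = 0.1 / mdd_g
        return {
            "reporting": 0.05,
            "identification": min(0.10, intake_pct),  # 0.10% or 1.0 mg/day, whichever is lower
            "qualification": min(0.15, intake_pct),   # 0.15% or 1.0 mg/day, whichever is lower
        }
    # MDD above 2 g/day: tighter limits apply
    return {"reporting": 0.03, "identification": 0.05, "qualification": 0.05}
```

For an MDD above 2 g/day, `q3a_thresholds(3.0)` returns an identification threshold of 0.05%, which is exactly the situation in the example above: a protocol built around the industry-standard 0.10% would be inadequate for that product.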

In a second case, let’s say a molecule has component X, and the specification for the analyte is 6.5 to 7.5%. Upon close examination of the validation protocol, one finds that the percent RSD for the precision study is set at NMT five percent (which, again, is typically an acceptable precision value in industry at this concentration level). Unfortunately, based on these acceptance criteria, if component X in the sample is close to the upper or lower specification limit, one can fail a good batch, or pass a failed batch, upon testing. This can happen when the method precision is not tight enough for its intended use. 
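The risk in this second case is easy to make concrete with a little arithmetic. The sketch below (a hypothetical helper, using the interview's numbers and a normal-distribution assumption) converts the allowed RSD into the spread of a single measurement and shows why a 5% RSD is too loose for a 6.5 to 7.5% specification window:

```python
def measurement_band(true_value: float, rsd_pct: float, k: float = 2.0):
    """Interval within which roughly 95% of single results fall (at k = 2,
    assuming normally distributed error), given the true content and the
    method's relative standard deviation."""
    sigma = true_value * rsd_pct / 100.0  # RSD is sigma as a % of the value
    return (true_value - k * sigma, true_value + k * sigma)

# A batch truly at 7.4% component X: in spec, but near the 7.5% upper limit.
low, high = measurement_band(7.4, 5.0)
# low = 6.66, high = 8.14: a single result can easily read above 7.5%,
# so an in-spec batch can fail on testing, and the reverse can pass a bad one.
```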
The point of these two examples is that the validation protocol and the acceptance criteria must be carefully designed and scientifically driven, rather than based on standard criteria values used for similar studies.

Q: What steps can labs take to avoid delays and ensure they are adequately planning and executing analytical method validation?

A: There is no substitute for experience, know-how, understanding of the regulations, and scientifically driven validation protocols. Having a reliable method is the first step, and everything else relies on that basis (what good does it do to build a house on a shaky foundation?). Sample specifications and impurity profiles must be well defined, method robustness has to be meaningful, all activities must be conducted under cGMP and regulatory guidelines, and a proper forced degradation study must be conducted (a separate article will be written on this subject).

Q: What are some of the biggest challenges labs face when it comes to analytical method validation? And what can labs do to better handle these challenges? 

A: Unfortunately, some analytical testing labs frequently have limited knowledge and resources to dedicate to a project, which can lead to inconsistent results. To do a better job handling challenges, keep in mind the following three things: 

1. Conduct a forced degradation study prior to method validation if the method is a stability-indicating method. If faced with a poor material balance, determine why it was obtained and how one can detect and quantify the missing mass. If material balance is not addressed, the validation can’t claim the method is “stability-indicating.”
2. Choosing an appropriate technique is the foundation for successful method development and method validation. A method has to be scientifically sound, and justifiable; the specification has to fit the purpose and the protocol acceptance criteria need to be justifiable. 
3. Design a realistic method robustness study. A robustness or ruggedness study can determine the margin of error in your method, to avoid problems when the method is replicated elsewhere or by someone else. Outcomes can change under a variety of conditions, including different equipment and instruments, reagents, temperatures, elapsed times, mobile phase preparation, and other factors. It’s better to conduct the method robustness study earlier rather than later in the process. I have seen methods that worked only in the originator’s lab, and sometimes even the originating lab can’t duplicate what it did six months earlier. These types of shortcomings will certainly delay process chemistry optimization and product launch, and increase project cost.  
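The material-balance check in point 1 is, at its core, simple arithmetic: after stress, the remaining assay plus the detected degradants should account for (nearly) all of the starting material. The sketch below is illustrative; the function name and the 95% flag level are my assumptions, not guideline values.

```python
def mass_balance_pct(initial_assay: float, stressed_assay: float,
                     total_degradants: float) -> float:
    """Percent of the initial material accounted for after forced degradation:
    (remaining assay + detected degradants) relative to the initial assay."""
    return (stressed_assay + total_degradants) / initial_assay * 100.0

balance = mass_balance_pct(100.0, 92.0, 5.0)  # 97.0% accounted for
if balance < 95.0:  # illustrative flag level, not a regulatory threshold
    print("Investigate the missing mass before claiming a stability-indicating method")
```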

Q: What changes or trends do you expect to see in the future for analytical method validation?
A: The FDA’s expectations have certainly changed, even compared to a couple of years ago. The regulator expects the testing lab or manufacturing facility to operate under full compliance, the facility to be well suited for the task, a system in place to demonstrate data integrity, validation acceptance criteria that are meaningful and justifiable, and the whole process to be under control. 
Having said that, all pharma manufacturing or testing facilities must put a heavy emphasis on understanding each compound’s impurity profile, mapping out the entire synthetic route, and having qualified and experienced scientists in place. The firm also needs to know the quality of its incoming raw materials and be well equipped to monitor them. 

Although method validation is one important part of this equation, if the foundation of process knowledge is not there, what good does it do to validate a method? All analytical testing labs must stop the cookie-cutter approach of using the same testing designs and acceptance criteria for every method. Each validation protocol needs to be unique, designed around the specific goals and acceptance criteria that apply to that analysis, and a group of well-qualified scientists is needed to design such a protocol. Testing facilities can no longer ignore the importance of designing a unique validation protocol for each unique product. Additionally, with advances in analytical instrumentation, old practices that lack method specificity or accuracy need to be replaced by modern techniques, which certainly enhance the quality of the results and the product. If one ignores these factors, the method may later have to be redeveloped and revalidated, which will be very costly, lead to delays, and possibly kill the project.