A conversation with Pfizer Chemistry VP Mark Noe, PhD
As a process, drug discovery relies on myriad complex and sometimes interdependent inputs and outputs related to chemical compounds, the biological target, and diseases. These include:
- Target selection and validation
- Molecular design, including computational methods
- In silico and in vitro screening
- Informatics for managing and visualizing data
- Chemical synthesis and molecular elaboration
- Biochemical, cell-based, and animal testing
- Collaboration with internal and external experts
- Go/no-go economic decisions
- Instrumentation and analytics
- Human resource utilization
- Acquisition of molecules and expertise vs. in-house development
and many others. Given a typical drug discovery group’s workload of up to three projects at any given time, these factors are multiplicative.
“The magnitude of our concern and effort around these challenges depends on how prominently they relate to a given project,” says Mark Noe, PhD, VP of the Groton Center of Chemistry Innovation at Pfizer (Groton, CT).
Most drug discovery efforts begin with a biological target—the molecule inside the body whose activity the drug is expected to enhance or diminish. Assurance that the target is pharmacologically accessible and responsible in some way for the disease in question is based on target validation studies. “If you don’t have that confidence at the beginning of a discovery project, all subsequent efforts will be wasted,” Noe says. “A productive drug discovery effort depends on building a strong scientific case.”
Pieces of the validation puzzle might include uncovering genetic mutations that activate or inhibit the target, suggesting that modulating the target in some way will positively affect the disease. Another, more theoretical approach involves understanding the impact of target modulation on larger cellular systems and the resulting impact on the disease state. Genetic evidence and theoretical evidence are often combined.
A further aspect involves access to an appropriate disease model incorporating genetic or theoretical components discovered earlier. Treating cells or organisms in this way further validates the target and establishes basic pharmacokinetic and pharmacodynamic parameters: the level and time course of drug administration required to ameliorate the disease state.
Assessing target safety based on relationships between adverse effects and activation or inactivation of genes coding for the target in humans is also critical. Lacking this information, discovery scientists may turn to test animals, for example through gene knockout strategies.
“Drug discovery chemists and biologists are equally concerned about building a strong case for particular targets,” Noe tells Lab Manager. Chemists generate pharmacological tools to test mechanisms, while biologists contribute to the understanding of effects on biological systems. “We think of these elements of target validation very early in projects, and as the project advances we continue to build our case—or reprioritize our efforts if studies suggest otherwise.”
After confidence is established in the target, new challenges emerge related to “lead matter”—compounds likely to interact with the target to effect a desired clinical outcome. A number of strategies are available for identifying early-stage hits: screening archival molecules that are active against related targets, testing compounds from the literature, high-throughput screening of large compound libraries, or fragment-based screening through an appropriate biophysical or biochemical assay.
Turning hits into leads is aided by identifying the target’s binding site and mode of action using structural biology and biophysics, protein crystallography, nuclear magnetic resonance spectroscopy, and surface plasmon resonance for binding kinetics.
“The orthogonality of these methods is a positive attribute,” Noe says. “Since all assays are prone to artifacts, good drug discovery practice involves employing several of these assays in parallel.” Biochemical assay artifacts are the most common, especially at high concentration, due to test compounds’ interference with optical assay readouts. For example, a biochemical assay may couple two or three enzymatic reactions together because the protein one is looking to inhibit lacks a convenient assay readout. Test compounds inhibiting either target or reporter proteins may produce a positive readout, which must be de-convoluted through further assays. Observing activity in the enzyme assay but no target binding in the biophysical assay strongly suggests that an artifact is at work.
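The cross-checking logic Noe describes, that a hit in the enzyme assay without binding in the biophysical assay points to an artifact, can be sketched as a simple decision table. This is an illustrative simplification, not an actual triage workflow, and the category labels are invented:

```python
def classify_hit(enzyme_assay_active, biophysical_binding):
    """Cross-check two orthogonal assay readouts (simplified decision logic).

    enzyme_assay_active: activity seen in the coupled biochemical assay
    biophysical_binding: direct target binding seen by, e.g., SPR or NMR
    """
    if enzyme_assay_active and biophysical_binding:
        return "likely genuine inhibitor"
    if enzyme_assay_active and not biophysical_binding:
        # Activity with no target binding suggests the compound hits a
        # reporter enzyme or interferes with the optical readout
        return "suspected artifact"
    if biophysical_binding and not enzyme_assay_active:
        return "binder without functional effect"
    return "inactive"

print(classify_hit(enzyme_assay_active=True, biophysical_binding=False))
```

Running several orthogonal assays in parallel effectively populates this table for each compound, which is why artifacts surface faster than with any single readout.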
Noe believes that drug discovery would benefit by the more routine, side-by-side application of biophysical and biochemical assays in as close to real time as possible. This would reduce the likelihood that scientists working in one area or the other would waste significant time on artifacts, and would allow scientists to better understand the kinetic and thermodynamic parameters that underpin compound activity. “We strive to do that within Pfizer but there’s sometimes variability with regard to the close alignment of those activities across teams,” Noe admits.
Sometimes it comes down to assay availability. The biochemical test is available but the biophysical test is still in development, and teams may lack the luxury of waiting for cross-validation.
In Silico Modeling
Drug discovery chemical space is vast, comprising something like 10^40 possible “Rule of Five”-compliant compounds. Since synthesis is costly and time-consuming, the primary challenge is determining which compounds are best to synthesize. The more empirical that decision-making is, the longer a discovery project takes. Some critical properties model quite well, others not so easily.
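Lipinski’s Rule of Five, which defines the “compliant” space mentioned above, is itself a simple descriptor filter. A minimal sketch, assuming the four descriptors have already been computed elsewhere (the function name and example values are illustrative):

```python
def passes_rule_of_five(mol_weight, logp, h_donors, h_acceptors):
    """Return True if a compound violates at most one of Lipinski's criteria."""
    violations = sum([
        mol_weight > 500,   # molecular weight over 500 Da
        logp > 5,           # calculated logP over 5
        h_donors > 5,       # more than 5 hydrogen-bond donors
        h_acceptors > 10,   # more than 10 hydrogen-bond acceptors
    ])
    return violations <= 1  # the rule tolerates a single violation

# Illustrative descriptor values for a small, drug-like molecule
print(passes_rule_of_five(mol_weight=180.2, logp=1.2, h_donors=1, h_acceptors=4))
```

In practice, such filters are an early coarse cut; the harder problem, as the article notes, is ranking the compounds that survive it.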
Drug discovery labs have benefited from advances in hardware and software, particularly through application of in silico models for estimating molecular properties of putative molecules before they are synthesized. According to Noe, permeability and oxidative metabolic stability models are reliable, but a great deal of improvement is needed for predicting aqueous solubility. Software works well enough within a series of related molecules but struggles with compounds that are structurally quite different.
One reason solubility-predicting software lacks reliability is because predicting thermodynamic solubility requires knowledge of the compound’s crystal packing, the lattice energy associated with that crystalline form, and solvation energy. Development scientists determine these values later, but the real power is in using this information to prioritize which compounds are synthesized in the first place. “To do experimental screens you have to first make the molecule,” Noe says.
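The thermodynamic cycle Noe alludes to can be written down directly: the free energy of dissolution is approximately the lattice (sublimation) free energy plus the solvation free energy, and solubility follows from ΔG = −RT ln S. A crude numerical sketch, with invented placeholder energies rather than measured values:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def log10_solubility(dG_sublimation, dG_solvation, temperature=298.15):
    """Very crude log10 aqueous solubility (mol/L) from a sublimation +
    solvation thermodynamic cycle. Energies in kJ/mol."""
    dG_solution = dG_sublimation + dG_solvation  # kJ/mol
    # Solid <-> solution equilibrium: dG = -RT ln S
    return -dG_solution * 1000 / (R * temperature * math.log(10))

# Placeholder values: a stable crystal (costly lattice to break up)
# partially offset by favorable solvation
print(round(log10_solubility(dG_sublimation=50.0, dG_solvation=-30.0), 1))  # -3.5
```

The difficulty Noe points to is visible here: the sublimation term depends on which crystal form packs out, which is exactly what is unknown before a compound is made.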
Solubility predictions are further thwarted by the existence of several crystalline forms—polymorphs—for many molecules. Aqueous solubility, stability, and manufacturability may vary significantly among polymorphs, and the thermodynamically most stable form can be difficult to predict.
In silico coverage for the range of properties on a medicinal chemistry wish list is incomplete but continues to develop. Lipophilicity is addressed adequately. pKa is more variable and depends on the nature of the ionizable group and how well-trained the model is on particular functionalities. Similarly, drug potency models can only anticipate large changes in a molecule’s potential effectiveness, on the order of a factor of ten or more. That’s fine for very early-stage work but inadequate for subsequent lead optimization.
Additionally, Pfizer has developed models for oxidative stability, drug-drug interaction potential, plasma protein binding, blood-brain barrier penetration, and safety end points including hepatic safety.
“Medicinal chemists certainly know how to apply structure-activity relationships within a target’s particular potency range, and they can come up with ideas that retain activity within that potency range. But what they really need are more robust in silico tools to assess potential ADME, safety, potency, and selectivity versus different targets,” Noe explains. This would allow scientists to more confidently prioritize a list of virtual compounds for synthesis.
In silico models that predict pharmacological and chemical properties are statistical models based on tens of thousands of data points residing, in the case of large pharmaceutical companies, in their internal databases. But for startup companies lacking that data, the range of in silico modeling capabilities will likely be much narrower.
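At heart, such property models are regressions fit to archival measurements and then applied to not-yet-synthesized compounds. A toy one-descriptor illustration with invented training data (real models use thousands of descriptors and far richer statistics):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Invented archival data: (logP descriptor, measured log solubility)
logp = [1.0, 2.0, 3.0, 4.0]
logS = [-1.1, -2.0, -2.9, -4.0]
slope, intercept = fit_line(logp, logS)

# Predict the property of a virtual compound from its descriptor alone
print(round(slope * 2.5 + intercept, 2))
```

This is also why the article’s point about startups holds: without tens of thousands of in-house data points to fit against, the trained model simply covers less chemical space.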
Despite incomplete coverage, modeling software has advanced tremendously in the past decade, Noe says, “both in capability and chemists’ willingness and desire to use them in the design phase.”
A related challenge is managing the vast quantities of data generated during drug discovery. Due to space considerations we have covered only a handful of the data inputs lab managers must deal with in decision-making. And we have not given proper due to the biological work that predates early-stage discovery, supports it, and continues after viable candidate molecules emerge. For example, modern biology is increasingly concerned with pathways, systems, and the “omics” that regulate and define them.
Even sticking to pure chemistry considerations, drug discovery organizations like Pfizer maintain vast databases on historical targets, hits, leads, etc. “Success depends in part on leveraging data we’ve collected on our compounds across targets, and also integrating physicochemical properties into our analyses,” Noe says.
As understanding of cellular physiology and the relationship between cellular systems and health has improved, drug discovery labs have come to rely less on animal screening. While animal testing will never be completely eliminated, more compounds than ever experience early triage based on cell-based safety and efficacy assays, thus reducing the need for or postponing more costly, time-consuming animal tests.
Labs increasingly rely on smaller, simpler animal systems as well, for example, zebrafish for early safety assessments.
“One big issue with animal testing is the quantity of compound you have to supply,” Noe observes. It is also resource-intensive: animal assays for safety and efficacy require determining a test compound’s plasma concentration over time, which taxes bioanalytical resources, and in vivo assays demand significant effort from pharmacologists as well. “The more we can do in vitro, the more efficient we will be in terms of cost and time. And the more we can do in silico during the design phase, the more efficient we will be in terms of cost and time by prioritizing the correct molecules for synthesis.”
Noe believes that animal testing will remain a permanent fixture in bringing quality drug candidates forward, if for no other reason than regulators demand animal data on safety, and often for efficacy as well. For example, infectious disease animal studies are imperative for developing new antibiotics. The humane aspects of reducing animal testing notwithstanding, the strategy reduces overall development costs and reduces time to market.
Weeding Out Discovery Labs' Inefficiencies
Pharmaceutical manufacturing has long been criticized for lack of sophistication compared with even low-tech process industries. Angelo Filosa, PhD, Global Head of Scientific Services at PerkinElmer (Waltham, MA), believes the same is true for pharmaceutical research labs, which lack what he calls a “value-added culture.” According to Filosa, drug discovery scientists spend a quarter of their time on activities that have little to do with their job titles, for example, paperwork, waiting for test results, or tending to instruments.
The shrinking size and growing accessibility of modern instrumentation are partly to blame. Mass spectrometry (MS) devices used to take up an entire lab run by degreed specialists as a core facility. Today, groups or individual workers have their own MS instruments. High-performance liquid chromatography has become a routine analysis as well. With democratization, however, came the responsibilities of cleaning, performing routine maintenance, acquiring or formulating reagents, stocking consumables, and conducting quality tests.
“These tasks are absolutely essential, but they take time and don’t directly contribute to the science,” Filosa notes. “There’s no reason why everybody needs to know how to do them. Drug discovery scientists should instead focus on running the right assay or designing the next molecule in a series.” For this reason, Filosa says, companies are returning to centralized core functions for some tasks.
With drug discovery and clinical trials often lasting a decade, it’s difficult to convince medicinal chemists that saving a few days or even one week matters. “With those timelines, delays don’t always translate to something tangible as they would with a 60-day project,” Filosa observes. The timeliness imperative becomes more obvious when stated in terms of sales: a one-day delay costs close to $3 million. “From that perspective, almost everyone realizes that a day or week really does count.”