INSIGHTS on Metabolomics

Within the past few years, genomics, proteomics, and metabolomics have created new disciplines within biology while inspiring techniques and methods that take life science analytics to new levels of sensitivity, specificity, and understanding.

by
Angelo DePalma, PhD

Angelo DePalma is a freelance writer living in Newton, New Jersey. You can reach him at angelodp@gmail.com.

Overcoming chemical diversity, concentration ranges

Metabolon (Research Triangle Park, NC) specializes in high-throughput metabolomics screening for clinical and research studies. Its core capabilities are based on platforms consisting of three Thermo Fisher Q Exactive™ hybrid quadrupole-orbitrap mass spectrometers with Waters UHPLC front ends. One LC-MS system is dedicated to negative electrospray, one to positive electrospray, and one to highly polar compounds. For fully quantitative targeted assays, Metabolon employs Agilent UHPLC systems connected to triple-quad MS from SCIEX.

Metabolon has experience with more than 350 analytical matrices, from human and animal biofluids to plant extracts and nonliving materials. The diversity of the company’s more than 650 global clients confirms metabolomics’ reach in research and clinical investigation. Half are academic and government organizations, and the next-largest group consists of pharmaceutical and biotechnology companies, but the roster also includes organizations with interests in food, agriculture, consumer products, and even archeology. The company’s platforms and informatics have also led to the development of diagnostics for obesity-related diseases and to partnerships in genomics-based health initiatives with groups like Dr. Craig Venter’s Human Longevity, Inc.

Data interpretation is huge for a lab that operates in the molecular space of 4,000 known compounds and an additional 8,000 identifiable unknowns in a typical biological matrix. Rather than rely on third-party software to help correlate metabolite profiles with biological states, Metabolon developed its own in-house program consisting of two million lines of code. The software performs an initial screen, but scientists confirm those results and ascribe them to the particular biological pathway under investigation.
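
The details of Metabolon’s two-million-line program are proprietary, but the initial screening step it describes can be illustrated in miniature: match each observed LC-MS feature against library entries by mass and retention time, then hand the candidate hits to a scientist for confirmation. The library values, tolerances, and feature data below are illustrative assumptions, not Metabolon’s actual parameters.

    # Minimal sketch of an initial metabolite library screen: match observed
    # LC-MS features to library entries by m/z and retention time.
    # All entries, tolerances, and feature values are illustrative.

    LIBRARY = [
        {"name": "citrate",    "mz": 191.0197, "rt": 1.8},  # [M-H]-, minutes
        {"name": "glucose",    "mz": 179.0561, "rt": 1.2},
        {"name": "tryptophan", "mz": 203.0826, "rt": 4.6},
    ]

    def screen_features(features, mz_ppm=10.0, rt_window=0.3):
        """Return candidate identifications for observed (mz, rt) features."""
        hits = []
        for mz, rt in features:
            for entry in LIBRARY:
                ppm_error = abs(mz - entry["mz"]) / entry["mz"] * 1e6
                if ppm_error <= mz_ppm and abs(rt - entry["rt"]) <= rt_window:
                    hits.append({"mz": mz, "rt": rt,
                                 "candidate": entry["name"],
                                 "ppm_error": round(ppm_error, 2)})
        return hits

    # One feature matches citrate; the other remains an unknown.
    print(screen_features([(191.0199, 1.75), (250.1000, 3.0)]))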

Each analytical platform runs approximately 180 samples per day, a figure the company is looking to expand by acquiring more instrumentation. This reflects one of the company’s top priorities: throughput.

“It’s imperative for us to maximize capacity and throughput while maintaining data quality,” says Luke Miller, PhD, VP of laboratory operations. For each sample, this means calling out the largest number of identifiable compounds and pinpointing the greatest number of confirmed metabolites. Human plasma, for example, contains as many as 1,300 known metabolites.

Expect the unexpected

Sensitivity issues are usually related to low-abundance analytes, particularly in the presence of very high-concentration molecules: concentration dynamic ranges may be as high as 10¹⁴. Miller notes that “unexpected situations” arising from matrix effects can also diminish sensitivity by suppressing chromatographic regions or causing outright column failure. “Lab managers must be prepared to detect diminished sensitivity and try to fix it.”
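
Detecting diminished sensitivity lends itself to simple automated checks. One common approach, sketched here under assumed thresholds rather than Metabolon’s actual QC rules, is to track an internal standard’s peak area across injections and flag any sustained drop against a running baseline.

    # Illustrative check for diminished sensitivity: track an internal
    # standard's peak area across injections and flag a sustained drop.
    # The window and drop threshold are assumptions, not published QC rules.

    from statistics import median

    def flag_sensitivity_loss(areas, window=6, drop_fraction=0.5):
        """Return injection indices where the area falls below
        drop_fraction of the median of the preceding `window` runs."""
        return [i for i in range(window, len(areas))
                if areas[i] < drop_fraction * median(areas[i - window:i])]

    # Response collapses at injection 8 (e.g., a fouled column):
    areas = [1.00e6, 0.98e6, 1.02e6, 0.99e6, 1.01e6, 0.97e6,
             0.99e6, 1.00e6, 0.31e6, 0.29e6]
    print(flag_sensitivity_loss(areas))  # -> [8, 9]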

Metabolon employs standard sample preparation methods designed for workflow efficiency. But when applied to novel matrices, these techniques may allow certain column-degrading compounds to slip through. Citrate in plasma, for example, negatively affects sensitivity, and in a high enough concentration will kill an HPLC column.

Columns are not the only sources of sensitivity problems. Compounds so abundant they fall above the linear dynamic range of the mass detector can overwhelm signals around them, but unlike HPLC-related failures, these events are transient.

Metabolon relies on long, automated, overnight LC-MS runs. When a column fails due to contamination or some other issue, subsequent data cannot be trusted until the problem is resolved. Thanks to modern communications, project leaders are alerted, sometimes in the middle of the night. Problems that can be solved remotely are handled that way. For the remainder, staff are expected to go on-site.

Hold time is an interesting topic with metabolites. We know that some sample types do not age well, as when bacteria in an environmental sample multiply or digest analyte molecules. While less prevalent in metabolomics, these effects nevertheless occur. Metabolon therefore asks clients to flash-freeze samples upon collection and transport them frozen. Samples are inventoried and stored at -80°C upon receipt and kept as cool as possible during sample prep. “We know that certain metabolites, particularly lipids, change upon storage, even at minus eighty,” Miller says. “We preach that as long as we have sufficient sample numbers to get a statistically valid readout, and as long as each sample is treated identically and consistently, we can overcome slight changes that occur naturally.”

With 20-hour chromatography runs and instruments often operating seven days a week, Miller’s biggest worry is the unexpected. LC column failures lengthen or invalidate individual runs, but instrument failure and laboratory contamination have the potential to shut down operations. Metabolon uses risk-mitigation steps like providing clean, well-defined voltage to instruments; maintaining a constant temperature in the lab; and running a quality control check every sixth LC injection.
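
That last step, a QC check every sixth injection, amounts to building the run list so that every sixth position holds a quality control sample. A minimal sketch, with an assumed naming scheme:

    # Build an injection sequence in which every sixth injection is a
    # pooled QC sample. The naming scheme is an assumption for illustration.

    def build_sequence(samples, qc_every=6):
        """Return a run list with a QC in every qc_every-th position."""
        sequence, qc_count = [], 0
        for sample in samples:
            sequence.append(sample)
            if (len(sequence) + 1) % qc_every == 0:  # next slot is a QC slot
                qc_count += 1
                sequence.append(f"QC-{qc_count:02d}")
        return sequence

    print(build_sequence([f"S{i:03d}" for i in range(1, 14)]))
    # ['S001'..'S005', 'QC-01', 'S006'..'S010', 'QC-02', 'S011'..'S013']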

Holistic systems view

Yingying Huang, PhD, senior marketing manager for metabolomics and lipidomics at Thermo Fisher Scientific (San Jose, CA), identifies four persistent metabolomic workflow issues:

  • Separations: How to cover the entire chemical space for small molecules.
  • Signal detection: Noise elimination once separation is achieved.
  • Compound identification: A significant bottleneck, particularly when peaks change from sample to sample.
  • Data integration: Combining metabolomic data with other ’omics to create a holistic view of systems biology.

“The separation challenge is not knowing how many molecules you need to solve the problem. You don’t know what you don’t know,” Huang says.

One thing we do know is that separating very polar molecules, such as phosphate sugar isomers, is tough on conventional C18 and HILIC HPLC columns. The presence and position of phosphate groups reveal much about a sugar molecule’s history and the biological pathway from which it emerged.

Labs turn to ion chromatography (IC) for such molecules, but until recently they could not access MS detection because the potassium hydroxide elution buffer is incompatible with the spectrometer. Thermo Fisher Scientific has recently developed a suppression technique that removes the interfering ions and allows mass detection for IC, thus opening the door to confirmatory analysis of thousands of highly polar metabolites. A Thermo Fisher application note demonstrates separation of 11 monophosphate sugar isomers and nine diphosphates, quantitative over five orders of magnitude in concentration and down to the femtogram level.

“With this new capability we can more closely study the TCA cycle and glycolysis,” Huang says.

In many instances multiple pathways produce identical sugar phosphates. Discriminating pathways that generate the same signal requires single- or double-isotope labeling of nutrient precursors and following the labels through critical transformations. Before widespread adoption of MS, labs relied on radioactive isotopes. Today MS easily discriminates between stable isotopes of carbon, for example, and the labeled species are relatively inexpensive.
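
The mass arithmetic behind such tracing is simple: each ¹²C-to-¹³C substitution adds roughly 1.0034 Da, so a fully labeled precursor produces an unmistakable shift. A short illustration (the glucose example is illustrative, not from the interview):

    # Stable-isotope tracing arithmetic: the m/z shift expected when n
    # carbons in a metabolite are replaced by 13C.

    MASS_12C = 12.0        # exact, by definition
    MASS_13C = 13.003355   # monoisotopic mass of carbon-13, Da

    def label_shift(n_labels):
        """Mass shift (Da) from n 12C -> 13C substitutions."""
        return n_labels * (MASS_13C - MASS_12C)

    # Fully labeled [U-13C6]glucose vs. unlabeled glucose:
    print(round(label_shift(6), 4))  # 6.0201 Da, easily resolved by MS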

Improving signal to noise

Signal-to-noise issues arise in metabolomics because the molecular targets are of low molecular weight and therefore singly charged. Proteomics targets, by comparison, are multiply charged. The more charge a molecule carries, the lower the mass/charge ratio and the higher the sensitivity and resolution.
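
The underlying relationship for a protonated positive ion is m/z = (M + z·m_H)/z, so extra charges pull an ion down the m/z axis; small metabolites, capped at a single charge, gain no such benefit. A worked illustration with assumed masses:

    # m/z of an [M + zH]z+ ion: extra charges lower the observed m/z.
    # Masses below are illustrative.

    M_PROTON = 1.007276  # proton mass, Da

    def mz(neutral_mass, charge):
        return (neutral_mass + charge * M_PROTON) / charge

    print(round(mz(2000.0, 1), 2))   # peptide at z = 1 -> ~2001.01
    print(round(mz(2000.0, 3), 2))   # same peptide at z = 3 -> ~667.67
    print(round(mz(180.063, 1), 2))  # small metabolite, only z = 1 -> ~181.07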

Abundant sources of noise include high-concentration molecules, solvents, materials leaching off columns, and impurities at the ionization source or in the autosampler, all of which are likely to be singly charged as well but present at significantly higher concentrations than many analytes.

Compound identification relates directly to chromatographic resolution, but this circles back to the question of noise. “Going to higher resolution will separate three molecules that previously eluted as one peak, but once you have that data you still must decide if peaks come from your sample or some other source,” Huang explains. “That’s where sophisticated data processing comes in.”

One relatively straightforward precaution involves strategic use of internal standards and injection of blanks, whose traces are subtracted from sample runs.
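
In code, the idea reduces to comparing each sample feature against the corresponding blank feature and keeping only those comfortably above it. The keying scheme and threshold below are illustrative assumptions:

    # Illustrative blank subtraction: keep sample features whose intensity
    # exceeds min_ratio times the matching blank feature's intensity.
    # Features are keyed by (rounded m/z, rt); the 3x cutoff is assumed.

    def subtract_blank(sample_features, blank_features, min_ratio=3.0):
        return {key: inten for key, inten in sample_features.items()
                if inten > min_ratio * blank_features.get(key, 0.0)}

    sample = {(191.02, 1.8): 5.0e5,   # real analyte
              (149.02, 2.4): 1.1e4}   # suspect background ion
    blank  = {(149.02, 2.4): 1.0e4}   # also present in the blank
    print(subtract_blank(sample, blank))  # background ion removed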

Quantifying the various ’omics enables reconstruction of gene activity, which reflects the up- and down-regulation of critical pathways. But integrating data from metabolomics, proteomics, genomics, and lipidomics, and deconvoluting biological systems based on those interactions, has been difficult. Huang says that while researchers are investing a lot of effort in integration, “we’re not there yet.” The key will be employing a systems or pathway approach, in which genes represent an organism’s potential, proteins its machinery, and metabolites the end product, or phenotype.

Archiving consistent data

The laboratory of Professor Timothy Garrett in the University of Florida’s Department of Pathology uses triple-quad and, occasionally, orbitrap MS for targeted metabolomics and a Thermo Fisher Q Exactive orbitrap for global metabolomics. Both systems have UHPLC front ends.

Garrett’s most serious challenges are compound verification and throughput, both related in their own way to large numbers: the former to the large number of metabolites in typical samples, the latter to the lab’s sample capacity. Like Metabolon’s Luke Miller, Garrett recognizes that instrument robustness is critical for keeping things moving smoothly. “Instrument performance issues can turn a 20-hour run into a 40-hour run,” he says.

In Garrett’s opinion, instrument capabilities have for the most part kept up with the severe demands of metabolomics research. “Our triple quads are the most robust systems in our lab, requiring very little maintenance or calibration. The high-resolution [orbitrap] spectrometers are more finicky because they require more frequent calibration.” And because they’re high-performing, orbitraps generate large quantities of data that give rise to what Garrett calls “communication failures”—mysterious mid-run crashes that arise when instrument and computer stop sharing bits and bytes.

Garrett runs all samples in positive and negative ion modes to obtain the most information from each LC injection. This brings the number of metabolites of interest for his typical workflows into the 4,000 to 5,000 range. “Not all are endogenous metabolites, and not all are real metabolites,” Garrett admits. “It’s not like electron ionization on GC-MS, where you get most of what you want by running everything in positive mode.”
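
Running both polarities means the same metabolite can show up twice, once as [M+H]+ and once as [M-H]-, so feature lists are typically merged on inferred neutral mass. A sketch under the simplifying assumption that only protonation and deprotonation adducts occur:

    # Sketch of merging positive- and negative-mode feature lists on
    # inferred neutral mass, assuming simple [M+H]+ / [M-H]- adducts.
    # Tolerances and feature values are illustrative.

    M_PROTON = 1.007276

    def neutral_mass(mz, polarity):
        return mz - M_PROTON if polarity == "+" else mz + M_PROTON

    def merge_polarities(pos_features, neg_features, mass_tol=0.005, rt_tol=0.2):
        merged = [(neutral_mass(mz, "+"), rt) for mz, rt in pos_features]
        for mz, rt in neg_features:
            mass = neutral_mass(mz, "-")
            if not any(abs(mass - m) <= mass_tol and abs(rt - r) <= rt_tol
                       for m, r in merged):
                merged.append((mass, rt))
        return merged

    pos = [(181.0707, 1.2)]                    # glucose, [M+H]+
    neg = [(179.0561, 1.2), (191.0197, 1.8)]   # glucose [M-H]-, citrate [M-H]-
    print(merge_polarities(pos, neg))          # glucose counted once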

For metabolomics work, when applicable, GC-MS is simpler and more reproducible than LC-MS. Compound mass libraries are exhaustive and reliable to the point where a library search is nearly always confirmatory. GC is limited, however, by molecular weight and by whether analytes can form derivatives that don’t stick to the column. LC-MS, Garrett says, provides much greater metabolomic coverage.

LC is not without shortcomings, particularly when analyzing isomers with similar fragmentation patterns. Matched internal standards are an established workaround, but large isomers like prostaglandins can be “all over the place,” according to Garrett. “But you can quantify them as long as the fragments are in the database.”
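
Quantification against a matched internal standard reduces, in its simplest one-point form, to a ratio calculation. The response factor and numbers below are illustrative, not values from Garrett’s lab:

    # One-point internal-standard quantification: analyte concentration
    # from the analyte/IS peak-area ratio, a known IS concentration, and
    # a response factor from calibration. All numbers are illustrative.

    def quantify(analyte_area, is_area, is_conc, response_factor=1.0):
        """Estimate analyte concentration (same units as is_conc)."""
        return (analyte_area / is_area) * is_conc / response_factor

    # Deuterated IS spiked at 10 ng/mL; analyte/IS area ratio of 0.42:
    print(round(quantify(4.2e4, 1.0e5, 10.0), 2))  # -> 4.2 ng/mL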

Anyone who has run LC columns over the past several years recognizes how far LC-MS has progressed in ruggedness and capability. Those improvements have come at a price, however: the more unique the combination of LC system, column, and detection platform, the more difficult it becomes to make sense of retention times and masses. MS platforms are innately inconsistent because of widely varying collision energies and fragmentation patterns, and the LC system is a separate variable. Each combination, or system, in effect becomes its own standard.

It is this lack of inter-instrument uniformity that leads Garrett to wish that “lots of people would collect consistent spectra on the same instrument.”

“Many instrument companies are putting in that effort, but it will take a very long time to [ensure] that every spectrum they publish is consistent. In the meantime, buying one of each company’s mass spectrometers to be able to match with their fragment libraries would be quite expensive.”

Metabolomics has been a rich area of research, but its ultimate goal is to create a systematized foundation for personalized medicine. “We need to be ready for anything: rare diseases, novel matrices, or some issue in the lab that affects the day’s run,” says Miller. “Eventually we hope to reach the point where we can do with metabolomics what we hoped to achieve with genomics and proteomics.”