Sensitive Techniques Follow Contamination from Fracking to Pharmaceuticals and Beyond
It’s a war out there when it comes to organic contaminants in the environment. “Carcinogens are assumed to have no safe levels of exposure and are therefore of concern at any level,” says Elise Elliott, a doctoral student at the Yale University School of Public Health in New Haven, Connecticut. “However, technological advances have enabled us to analyze increasingly lower concentrations of chemicals, which has made it clear that other non-carcinogenic organic contaminants are also associated with adverse health effects at very low levels.” To study such concerns, scientists seek increasingly sensitive and robust methods of analysis.
“The techniques that work best depend on the matrix or sample type, the physical properties of the contaminant, as well as the desired detection limit,” says Richard Jack, senior director for the environmental and industrial markets for the chromatography and mass spectrometry businesses at Thermo Fisher Scientific in Waltham, Massachusetts. “Methods used for trace analysis typically involve two parts: contaminant extraction and compound analysis.” The extraction tends to be more complicated in soil versus water samples. The detection can involve various techniques, from ultraviolet and fluorescence detectors to mass spectrometry (MS). “Generally, mass spec is most often used for trace analysis due to low concentrations of the contaminants and the presence of interference from the sample matrix,” Jack says. “While providing lower detection limits, in most cases, MS also provides a measure of confirmation through mass-to-charge ratio assignment for each contaminant.”
Some applications require novel methods. In a 2016 issue of the Bulletin of Environmental Contamination and Toxicology, a research team—Amber Russell, David Martin, Michael Cuddy, and Anthony Bednar—from the U.S. Army Engineer Research and Development Center’s (ERDC’s) environmental laboratory in Vicksburg, Mississippi, points out that the United States alone suffers an average of more than 30,000 oil spills a year. As they write: “The detection of trace quantities of petroleum products in soil and water samples is necessary to identify the areas impacted by a spill, and in doing so help guide efforts to mitigate the environmental impact.” To study such contamination, these researchers developed fluorescence-detection equipment that can be used in the field. As Bednar, analytical geochemistry team leader at ERDC, says, “We developed this technique as a rapid screening technique for detection of petroleum hydrocarbons in soil and water matrices.” He adds, “The initial impetus for this technique was an oil spill on the lower Mississippi River in 2008, where dredging operations were impacted by the inability to detect oil in near real time, and thus impacted dredging decisions.”
As we shall see, different organic contaminants in different environments fuel different approaches.
Improving the Preparation
Over time, the ERDC team kept improving its technology. They included calibration standards and added a digital fluorometer. These improvements, Bednar says, moved “the technique from screening level to semi-quantitative to essentially quantitative.”
Working in the field, though, adds complications to any technique. “The limitation on the quantitative nature is more related to challenges associated with rapid extraction of hydrocarbons from the soil or water matrix in the field, where you don’t readily have enhanced extraction techniques, such as sonication, pressure, or heat,” Bednar explains. “In its present state, we use hexane as an extraction solvent, and simply hand shake the sample for a minute or two.” The scientists let the phases separate, remove the hexane, and then analyze the sample with, as Bednar explains, a digital or human optical fluorometer to measure fluorescence from polycyclic aromatic hydrocarbons present in the petroleum.
Several variables—weathering of the petroleum, where it came from, and the calibration standards being used—impact the accuracy of this technology. Still, Bednar says, “In most cases, the data obtained is in rough agreement with more sophisticated laboratory analyses, as described in the paper.” He adds, “The primary benefit of the technology is the time from collection to data in hand—a few minutes—at a minimal cost, compared to traditional laboratory techniques.”
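The progression Bednar describes, from screening-level to essentially quantitative results, rests on a calibration curve: fluorescence readings from standards of known concentration define a line, and a field sample’s reading is interpolated against it. The sketch below illustrates that general idea only; the function names, standard values, and units are hypothetical, not taken from the ERDC method.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def concentration(intensity, slope, intercept):
    """Invert the calibration line to estimate a sample's concentration."""
    return (intensity - intercept) / slope

# Hypothetical calibration standards:
# (petroleum concentration in mg/L, fluorescence counts)
standards = [(0.0, 12.0), (5.0, 118.0), (10.0, 221.0), (20.0, 430.0)]
slope, intercept = fit_line([c for c, _ in standards],
                            [i for _, i in standards])

# A field sample reading of 325 counts interpolates to ~15 mg/L.
print(round(concentration(325.0, slope, intercept), 1))  # prints 15.0
```

The same inversion applies whether the intensity is read from a digital fluorometer or estimated by eye against the standards; the digital readout simply tightens the precision.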
Already, this technology is being used in civilian and military applications. Bednar describes the military uses as “related to rapid detection of petroleum hydrocarbons in austere field conditions, where data is difficult to obtain.”
Safe to Drink?
Although any environmental contamination can stir large concerns, people really worry about contaminated water. “The U.S. Environmental Protection Agency (EPA) has set legally enforceable public water system standards—maximum contaminant levels—to limit the levels of contaminants in drinking water,” Elliott explains. “These drinking water standards give an indication of which organic contaminants are of greater public health concern at lower levels.” Although the EPA sets these levels in hopes of protecting everyone, sometimes even a trace isn’t safe. As Elliott says, “Out of the 53 organic contaminants with a drinking water standard, 23 have a maximum contaminant level goal that indicates there are no assumed safe levels of exposure for that chemical.” Some examples include benzene and polychlorinated biphenyls (PCBs), both of which are known carcinogens.
In a 2016 issue of the Journal of Exposure Science and Environmental Epidemiology, Elliott and her colleagues explored the potential dangers to drinking water posed by increasingly common approaches to developing oil and natural gas sources, such as hydraulic fracturing, or fracking. They wrote: “Hydraulic-fracturing fluids and wastewater from unconventional oil and natural gas development contain hundreds of substances with the potential to contaminate drinking water.” The team evaluated 1,021 chemicals identified in fracking fluids and the wastewater from the process, and found that toxicity information does not exist for 76 percent of them. Among the chemicals that do have toxicity data, 43 percent are implicated as reproductive toxins and 40 percent as developmental toxins; 17 percent are implicated as both. The authors concluded that “carefully designed, rigorous exposure, and epidemiologic studies are urgently needed to investigate public health uncertainties and form a scientific basis for appropriate evidence-based policies.”
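The reported percentages combine by simple inclusion-exclusion: if 43 percent of the chemicals with toxicity data are reproductive toxins, 40 percent are developmental toxins, and 17 percent are both, then 66 percent (43 + 40 - 17) fall into at least one category. A minimal sketch of that arithmetic, with counts rounded and variable names of our own choosing:

```python
total = 1021                            # chemicals evaluated in fracking fluids and wastewater
with_data = round(total * (1 - 0.76))   # ~24% have any toxicity information
repro, devel, both = 0.43, 0.40, 0.17   # fractions of the chemicals *with* data
either = repro + devel - both           # inclusion-exclusion: toxic in at least one category

print(with_data)         # prints 245
print(round(either, 2))  # prints 0.66
```

In other words, of roughly 245 chemicals with any toxicity data, about two-thirds are implicated as reproductive or developmental toxins.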
In this research, says Drollette, the team “focused on hydrophobic organic compounds in shallow groundwater near areas of unconventional natural gas production.” They took samples from more than 60 residential wells and analyzed them for organic and inorganic compounds. “Our findings indicated that there was a significant relationship between the levels of diesel-range organic compounds in the groundwater and the location of natural gas wells,” Drollette says. “Furthermore, we saw this relationship between the organic compound levels and gas wells that have had environmental health and safety violations in the past, indicating that the natural gas operations may be impacting local groundwater.”
This work depended on very sensitive analysis, because the contaminants were in the parts-per-billion range. So the team used two-dimensional gas chromatography (GCxGC) plus time-of-flight MS (TOFMS). According to Drollette, this technology is “one of the most exciting advances in our field for analyzing trace organic contaminants.” He points out that conventional GC doesn’t adequately separate complex organic mixtures. “In GCxGC analyses,” he says, “we are able to separate compounds in the second dimension that would typically co-elute in the first dimension.” The GCxGC approach can separate the thousands of organic compounds that can be in petroleum samples, and, as Drollette says, it “actually increases the signal-to-noise ratio of analytes.” With that separation and TOFMS, he says, they can “detect trace amounts of organic compounds, typically on the order of picograms, with greater confidence.”
This work indicated that “the organic compounds were the result of releases at the ground surface near the gas wells,” Drollette concludes.
Advancing the Detection
“Technologies are constantly evolving to keep pace with new contaminant analysis challenges,” Jack says. “For example, the growing variety of analytes and the increasing stringency on limits of detection and quantitation requirements are all driving technology development.” As an example, he mentions the Thermo Scientific Orbitrap, where “the combination of high resolution and accurate mass for targeted and non-targeted analysis increases selectivity while potentially lowering sample prep requirements.” He adds, “Thermo Fisher now has Orbitrap technology for both liquid and gas chromatography to obtain a more comprehensive picture of the contaminants present in the environment—capabilities [that] previously have not existed.”
Beyond being sensitive, the detector must also be adaptable. For example, the growing list of potential contaminants includes pharmaceuticals and personal care products. “These include thousands of compounds found in surface waters along with their breakdown products, which can also have toxic effects,” Jack says. “Because standards are virtually impossible or too costly for all these compounds, HRAM—high-resolution accurate-mass MS, like the Orbitrap—is the ideal tool for identifying unknowns in the environment.”
So from fracking to pharmaceuticals and from water to soil samples, environmental scientists constantly need new methods to extract and analyze samples, which harbor a growing list of contaminating culprits. In addition, some samples must be assessed in the field, and that requires portable, field-worthy equipment that can detect and identify traces of organic compounds that could endanger people, animals, and plants. It’s not a simple task, but one well worth the effort.