
Using ICP-MS for Environmental Trace Metal Detection

A case study shows how ICP-MS played a vital role in the investigation of lead levels in a public school system's drinking water supply.

by Sam Richardson, Zoe Grosser, and Robert Thomas

Breakthroughs in atomic spectroscopy, and in particular ICP-MS, have led researchers to a better understanding of environmental pollution and the effects of trace metals on humans. Regulatory toxicity levels for lead have been lowered as new, more sensitive instrumentation has been developed. This case study exemplifies how one water municipality used ICP-MS to analyze over 60,000 drinking water samples for Pb in an investigation into the impact of plumbing systems on the water supplies of local public schools.

The development of analytical instrumentation over the past 40 years has allowed us not only to detect trace metals at the parts-per-quadrillion (ppq) level, but also to determine their valency state, biomolecular form, elemental species, or isotopic structure. We take for granted all the powerful, automated analytical tools we have at our disposal to carry out trace elemental studies on clinical and environmental samples. However, this was not always the case. As recently as the early 1960s, trace elemental determinations were predominantly carried out by traditional wet chemical methods such as volumetric-, gravimetric-, or colorimetric-based assays. It was not until the development of atomic spectroscopic (AS) techniques in the early 1960s that the clinical and environmental communities gained a highly sensitive and flexible trace-element technique. Every time there was a major development in atomic spectroscopy — such as flame atomic absorption (FAA), electrothermal atomization (ETA), inductively coupled plasma optical emission spectrometry (ICP-OES), and inductively coupled plasma mass spectrometry (ICP-MS) — trace-element detection capability, sample throughput, and automation dramatically improved. There is no question that developments and breakthroughs in atomic spectroscopy have directly impacted our understanding of environmental contamination and the way trace metals interact with the human body.

Lead

Take, for example, the toxic effects of lead (Pb), especially in young children. It can damage a child's central nervous system, kidneys, and reproductive system and, at higher levels, can cause coma, convulsions, and even death. Lead has no known biological or physiological purpose in the human body, but is avidly absorbed into the system by ingestion, inhalation, or skin absorption. Children are particularly susceptible because of their playing and eating habits. Lead is absorbed more easily if there is a calcium or iron deficiency, or if the child has a high-fat, mineral-inadequate, and/or low-protein diet. Once absorbed, lead is distributed within the body in three main areas: bones, blood, and soft tissue. About 90% is contained in the bones, while most of the remainder enters the bloodstream, where it is taken up by porphyrin molecules (complex nitrogen-containing organic compounds that provide the foundation structure for hemoglobin) in the red blood cells. It is clear that the repercussions and health risks are potentially enormous if children are exposed to abnormally high levels of lead.

The level of lead in someone’s system is confirmed by a blood-lead test, which by today’s standards is considered elevated if it is in excess of 10 μg/dL (100 ppb) for children and 40 μg/dL (400 ppb) for adults. However, since 1970, our understanding of childhood lead poisoning has changed substantially. As investigators have gained more sensitive techniques and designed better studies, the toxicity levels for lead have progressively shifted downward. Before the mid-1960s, a level above 60 μg/dL (600 ppb) was considered toxic, and by 1978 the defined level of toxicity had declined 50% to 30 μg/dL (300 ppb). In 1985, the CDC published a threshold level of 25 μg/dL (250 ppb), which it eventually lowered to 10 μg/dL (100 ppb) in 1991. It is well recognized that the development of flame atomic absorption in 1962, the detection-limit breakthrough of electrothermal atomization in 1971, and the staggering sensitivity and sample throughput of ICP-MS in 1983 have all played an integral part in reducing these toxicity levels.
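The unit conversions quoted in parentheses above follow directly from the definitions: 1 μg/dL is 1 μg per 100 mL, or 10 μg/L, and in dilute aqueous solution 1 μg/L corresponds to roughly 1 ppb. The short Python sketch below, with hypothetical helper names that are not part of any published method, simply makes that arithmetic explicit.

```python
# Convert blood-lead values between ug/dL and ppb (ug/L), using the
# dilute-solution approximation that 1 ppb is roughly 1 ug/L.

def ug_per_dl_to_ppb(value_ug_dl: float) -> float:
    """1 dL = 0.1 L, so 1 ug/dL = 10 ug/L, i.e. about 10 ppb."""
    return value_ug_dl * 10.0

def ppb_to_ug_per_dl(value_ppb: float) -> float:
    """Inverse conversion: 10 ppb corresponds to 1 ug/dL."""
    return value_ppb / 10.0

# The thresholds cited above:
print(ug_per_dl_to_ppb(10))  # 100.0 ppb -- elevated level for children
print(ug_per_dl_to_ppb(40))  # 400.0 ppb -- elevated level for adults
print(ug_per_dl_to_ppb(60))  # 600.0 ppb -- pre-mid-1960s toxicity level
```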

Toxicity levels

Even though the majority of sources of lead contamination (e.g., paint, pipes, gasoline, pottery, smelters) have essentially been removed from the environment, there still remains a potential threat from lead-containing plumbing materials in drinking water supplies. This is definitely the case with older buildings, but it can also be a problem with newer homes that use copper pipes and fittings connected with lead-based solders.

This has been recognized by regulatory standards as far back as the early 1960s, when the Surgeon General, under the direction of the U.S. Public Health Service (USPHS), set a mandatory limit for Pb levels in drinking water. Unfortunately, at that time lead assays were carried out using the dithizone colorimetric method, which was very sensitive but extremely slow and labor intensive. The analysis became more automated when anodic stripping voltammetry was developed, but lead determination was not considered a truly routine method until AS techniques became commercially available in the early 1960s. It was not until 1974, when Congress passed the Safe Drinking Water Act (SDWA), that the National Primary Drinking Water Regulations set a Maximum Contaminant Level (MCL) of 50 ppb for Pb. By this time, ETA was firmly established as the dominant ultra-trace element analytical technique, and even though it could detect these kinds of levels with relative ease, it was extremely slow and labor-intensive. This did not pose a real problem for small numbers of samples, but as a duplicate analysis might take five to seven minutes per sample, it became very time-consuming in high-workload environments.

When amendments were made to the SDWA in 1996, the MCL for Pb was reduced from 50 to 15 ppb. At that time, emphasis was placed more on regulating a small number of contaminants that posed the highest risk to public health. The regulation became known as “the lead and copper rule” and, instead of specifying an MCL for these two elements, it set an “action level”: some action had to be taken if more than 10% of the drinking water samples taken at the tap or faucet (i.e., the 90th-percentile sample) exceeded 15 ppb for Pb or 1.3 ppm for Cu. In the case of Pb, this meant that some kind of corrosion control procedure had to be put into place, together with a public education and awareness campaign. The result was that state, county, and city water municipalities and public health authorities were mandated not only to monitor lower levels in drinking water but also to sample it on a more regular basis. This produced a significant increase in the number of samples, which led to a more rapid acceptance of ICP-MS for analyzing drinking water samples, not only for Pb but also for the entire suite of environmentally significant elements. It is no coincidence that the increased acceptance and use of ICP-MS for environmental monitoring has helped drive the lowering of blood lead thresholds in children and of MCLs in drinking water.
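In practice, the compliance check under the lead and copper rule reduces to a percentile calculation over the tap samples collected in a monitoring period. The Python sketch below is a hypothetical illustration of that logic, not WSSC's or the EPA's software: it computes the 90th-percentile concentration of a set of tap samples and flags whether the 15 ppb Pb or 1.3 ppm Cu action level is exceeded.

```python
# Hypothetical sketch of a lead-and-copper-rule style check: action is
# triggered when the 90th-percentile tap-sample concentration exceeds the
# action level (15 ppb for Pb, 1.3 ppm = 1300 ppb for Cu).
from statistics import quantiles

PB_ACTION_LEVEL_PPB = 15.0
CU_ACTION_LEVEL_PPB = 1300.0  # 1.3 ppm expressed in ppb

def ninetieth_percentile(samples_ppb: list[float]) -> float:
    """90th-percentile concentration of the measured tap samples."""
    # quantiles(n=10) returns the nine cut points between deciles;
    # the last one is the 90th percentile.
    return quantiles(samples_ppb, n=10)[-1]

def exceeds_action_level(samples_ppb: list[float], action_level_ppb: float) -> bool:
    """True if the 90th-percentile sample exceeds the action level."""
    return ninetieth_percentile(samples_ppb) > action_level_ppb

# Example: ten tap samples (ppb Pb) from one building
pb_samples = [2.1, 3.4, 1.8, 25.0, 4.2, 6.7, 3.0, 2.2, 18.5, 5.1]
print(ninetieth_percentile(pb_samples))                       # ~24.35 ppb
print(exceeds_action_level(pb_samples, PB_ACTION_LEVEL_PPB))  # True -> corrosion control needed
```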

ICP-MS in action

A recent example of how ICP-MS has met the challenge of monitoring Pb in drinking water samples is an investigation carried out by the Washington Suburban Sanitary Commission (WSSC). A study carried out by the District of Columbia Water and Sewer Authority (DCWASA) in early 2004 found elevated levels of Pb in the drinking water supply of many of the schools in the District. Although this was not totally unexpected considering the age of the schools, it set off alarm bells at the public health departments in nearby Montgomery and Prince George's Counties in Maryland. Though they were expecting 20-30 drinking water samples per school, the investigation turned out to be far bigger than they ever imagined. The magnitude of the problem soon became evident when one school uncovered abnormally high Pb levels of up to 40 ppm – approximately 3,000 times the U.S. EPA action level. As a result, it was clear that the frequency and number of samples tested was going to increase dramatically. In fact, in some of the larger schools, the water supply needed to be sampled at over 500 different locations to fully understand the severity of the problem.

So here was a large suburban water municipality, with literally thousands of drinking water samples coming into the lab to be analyzed for lead. There were two school superintendents, in charge of almost 400 public schools in the area, who were worried about a potential Pb contamination problem in their drinking water supply… not to mention the thousands of extremely concerned parents. As if this were not enough, the local TV station heard about the problem and wanted an interview. An added complication to an already stressful situation was that WSSC did not have a suitable instrument to carry out the investigation.

They knew that their current ETA system and an older ICP-MS instrument would not be able to handle the expected workload, and the only way the lab was going to get through this number of assays was to invest in a new instrument. So the go-ahead was given by the lab director to purchase an ELAN® DRC-e (PerkinElmer SCIEX™). As a result, they were fully operational just one week after installation — making a dent in the huge backlog of drinking water samples, which were arriving at a rate of 500 per day.

In a typical day, they would fill the autosampler and run a QC standard, spiked sample, and blank every ten samples. If the QC standard fell outside the U.S. EPA drift specification of 15%, recalibration automatically took place. They were able to carry out unattended overnight runs, and quickly analyzed over 60,000 samples in less than six months.
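This daily QC routine, a check standard run every ten samples with a 15% drift window, lends itself to simple automation. The following Python sketch is a hypothetical outline of that decision flow, not the ELAN instrument software: it walks the autosampler queue, measures a QC standard after every ten samples, and triggers a recalibration whenever the QC result drifts by more than 15% from its expected value.

```python
# Hypothetical sketch of the QC logic described above: after every ten
# samples, measure a QC check standard; if its result drifts more than
# 15% from the expected concentration, recalibrate before continuing.

QC_INTERVAL = 10    # samples between QC checks
DRIFT_LIMIT = 0.15  # U.S. EPA drift specification (15%)

def qc_passes(measured_ppb: float, expected_ppb: float,
              limit: float = DRIFT_LIMIT) -> bool:
    """True if the QC standard result is within the allowed drift window."""
    return abs(measured_ppb - expected_ppb) / expected_ppb <= limit

def run_batch(sample_ids, measure, measure_qc, recalibrate, qc_expected_ppb):
    """Walk the autosampler queue, checking the QC standard every QC_INTERVAL samples.

    `measure`, `measure_qc`, and `recalibrate` are caller-supplied callables,
    stand-ins for the instrument driver in this illustration.
    """
    results = {}
    for i, sample_id in enumerate(sample_ids, start=1):
        results[sample_id] = measure(sample_id)
        if i % QC_INTERVAL == 0 and not qc_passes(measure_qc(), qc_expected_ppb):
            recalibrate()  # drift exceeded 15%: recalibrate, then carry on
    return results
```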

Unfortunately, the data generated from the investigation have posed almost as many questions as answers. Evidently, when the drinking water lies stagnant in the pipes for a period of time, it leaches Pb out of the brass pipes and fittings or the Pb-based solders used in the plumbing joints. That is why some of the older schools in the area that still had lead pipes showed extremely high levels — in some cases as high as 40 ppm lead. However, when the water system is flushed for a few minutes, the Pb levels are dramatically reduced, and resampling has shown that they quickly fall back below the U.S. EPA action level of 15 ppb. The long-term solution to the problem will focus on three main areas: adding corrosion inhibitors to the water supply, coating the inside of the pipes with some kind of resin to reduce corrosion, and/or replacing lines and fittings that contain lead or lead-based solders in the plumbing system.

The final analysis

It is well documented that ICP-MS, with its detection capability, has contributed to a better understanding of environmental contamination and the way trace metals interact with the human body. This particular lead-in-drinking-water investigation at WSSC also demonstrates that the technique is now a truly routine analytical tool that can be used for ultra-high-throughput analysis.


Sam Richardson is a member of the Washington Suburban Sanitary Commission, Silver Spring, MD.

Zoe Grosser is employed by PerkinElmer Life and Analytical Sciences, Shelton, CT.

Robert Thomas is with Scientific Solutions, Gaithersburg, MD.