Today's Chemist at Work
January 2002, Vol. 11, No. 1, pp 37–38, 41–42

FEATURE: Pharmaceuticals

Clinical atomic spectroscopy

Determining the link between trace metals and human disease

Understanding the effects of trace metals on human health is as complex as it is fascinating. We know that too low or too high a concentration of trace elements in our diet can affect our quality of life. Equally, industrial metallic contamination of the air, soil, and water supplies can have a dramatic impact on our well-being. The toxic effect of lead, particularly on young children, is well documented, but is it possible to pinpoint the source of the lead poisoning? The movie Erin Brockovich alerted us all to the dangers of hexavalent chromium (Cr VI) in drinking water, but how many in the audience realized that chromium, in its trivalent form, is necessary for metabolizing carbohydrates and fats? Selenium, which is found in many vegetables including garlic and onions, has important antioxidant properties, but do we know why some selenium compounds are essential and others are toxic? Clearly, these are complex questions that must be answered to fully understand the role of trace elements in the mechanisms of human diseases.

Atomic Spectroscopy
The development of analytical instrumentation over the past 30–40 years has allowed us not only to detect trace metals at the parts-per-quadrillion (ppq) level, but also to determine their valence states, biomolecular forms, elemental species, and isotopic composition. We take for granted all of the powerful and automated analytical tools we have at our disposal to carry out trace elemental studies on clinical and environmental samples. Accurate analysis at trace levels, however, was not always so easy. As recently as the early 1960s, trace elemental determinations were predominantly carried out by traditional wet chemical methods such as volumetric, gravimetric, or colorimetric assays. It wasn't until the development of atomic spectroscopic (AS) techniques in the early to mid-1960s that the clinical community gained a highly sensitive and versatile trace element technique that could be automated. Every time there was a major development in AS, trace element detection capability, sample throughput, and automation dramatically improved (1). There is no question that developments and recent breakthroughs in atomic spectroscopy have directly affected our understanding of the way trace metals interact with the human body.

Lead Poisoning
Lead has no known physiological purpose in the human body, but it is avidly absorbed into the system by ingestion, inhalation, or through the skin. The playing and eating habits of children make them particularly susceptible to absorbing lead. Lead is absorbed more easily in cases of calcium and iron deficiency or when a child has a high-fat, inadequate-mineral, and/or low-protein diet. When absorbed, lead is distributed to three main areas of the body—bones, blood, and soft tissue. About 90% settles in the bones, whereas most of the rest is absorbed into the bloodstream, where it gets taken up by porphyrin molecules (complex nitrogen-containing organic compounds that provide the foundation structure for hemoglobin) in the red blood cells.

The Centers for Disease Control and Prevention (Atlanta) recently reported that nearly 1 million children living in the United States have blood lead levels high enough to cause irreversible damage to their health. Lead poisoning often occurs with no distinctive symptoms. It can damage a child's central nervous system, kidneys, and reproductive system and, at high levels, can cause coma, convulsions, and even death. Even low levels of lead are harmful and are associated with lower intelligence, reduced brain development, decreased growth, and impaired hearing.

Lead levels are monitored with a blood lead test, which, by today's standards, is considered elevated if it exceeds 10 µg/dL (100 ppb) for children and 40 µg/dL (400 ppb) for adults. But our understanding of the long-term effects of lead poisoning has not always been as thorough: in the early to mid-1960s, a blood lead level above 60 µg/dL (600 ppb) was considered elevated. As investigators developed more sensitive detection systems and designed better studies, the generally recognized level for lead toxicity progressively shifted downward and was eventually set at 10 µg/dL in 1991. However, as our understanding of the disease improves and measurement technology is refined further, this level could be pushed even lower.
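To keep the two unit systems in this article straight, the short sketch below converts the blood lead thresholds quoted above from µg/dL to ppb (taken here as µg/L, assuming a specimen density of roughly 1 g/mL); the factor of 10 reproduces the parenthetical values in the text.

def ug_per_dl_to_ppb(conc_ug_per_dl):
    """Convert a blood lead concentration from µg/dL to ppb (µg/L)."""
    return conc_ug_per_dl * 10.0  # 1 dL = 0.1 L, so multiply by 10

# Child action level, adult action level, and the 1960s "elevated" level
for level in (10, 40, 60):
    print(f"{level} ug/dL = {ug_per_dl_to_ppb(level):.0f} ppb")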

The major source of lead poisoning among children is lead-based household paint, which was banned by the Consumer Product Safety Commission in 1978. Other potential sources include leaded gasoline, lead pipes, water supplies, airborne lead from smelters, clay pots, pottery glazes, lead batteries, and household dust and soil. According to a recent National Health and Nutrition Examination Survey, however, awareness of the problem, preventive care, and regular monitoring have reduced the percentage of children in the United States with elevated blood lead levels from 88.2% in the late 1970s to 4.4% in the early 1990s.

Detection Methods
There is no question that the routine monitoring of children has gone a long way toward reducing the percentage of children with elevated lead levels. Lead assays were initially carried out using the dithizone colorimetric method, which was adequate for its time (the late 1950s and early 1960s) but very slow and labor-intensive. The assay became more automated with the development of anodic stripping voltammetry (2), but blood lead analysis was not considered a truly routine method until AS techniques became available.

When flame atomic absorption (FAA) was first developed, an elevated blood lead level was considered to be 60 µg/dL (600 ppb), well above the FAA detection limit of 2 µg/dL (20 ppb) at the time. But when preparation of the blood samples was taken into consideration, FAA struggled to meet this level. Preparation typically involved dilution with a weak acid followed by centrifuging and filtering; acid digestion followed by dilution, centrifuging, and filtering; or, more recently, dilution with a strong base such as tetramethylammonium hydroxide (TMAH). Sometimes a surfactant was also added to allow for easier aspiration. Once this dilution was factored into the equation, the 60 µg/dL elevated blood lead concentration corresponded to only 2–5 µg/dL (20–50 ppb) in the prepared solution, virtually identical to the FAA detection limit.
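The arithmetic behind that squeeze is simple, as the sketch below illustrates. The dilution factors shown are assumptions chosen only for illustration; the article gives the end result (60 µg/dL in whole blood corresponding to 2–5 µg/dL in solution) but not the exact preparation recipe.

FAA_DETECTION_LIMIT_PPB = 20.0   # ~2 ug/dL, the early FAA limit quoted above
ELEVATED_BLOOD_LEAD_PPB = 600.0  # 60 ug/dL, the 1960s "elevated" level

# Hypothetical preparation dilutions (not taken from the article)
for dilution_factor in (12, 20, 30):
    in_solution = ELEVATED_BLOOD_LEAD_PPB / dilution_factor
    verdict = ("measurable" if in_solution > FAA_DETECTION_LIMIT_PPB
               else "at or below the detection limit")
    print(f"1:{dilution_factor} dilution -> {in_solution:.0f} ppb in solution ({verdict})")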

An accessory called the Delves Cup was developed in the late 1960s to improve the detection limit of FAA (3). The Delves Cup approach used a metal crucible or boat, usually made from nickel or tantalum, which was positioned over the flame. A 10- to 100-µL sample was pipetted into the cup, and the heated sample vapor passed into a quartz tube that was also heated by the flame. The ground-state atoms generated from the heated vapor were concentrated in the tube and therefore remained in the optical path for a longer period of time. This resulted in much higher sensitivity and lower detection limits, which meant that the elevated blood lead level of 60 µg/dL could be detected with comparative ease. Because of its relative simplicity and low cost of operation, the Delves Cup remained the standard method for carrying out blood lead determinations for many years.

Unfortunately, the Delves Cup approach was found to be very operator-dependent and not very reproducible; sometimes, it involved complicated sample preparation and required calibration with blood matrix standards. For these reasons, the approach became less attractive after the commercialization of electrothermal atomization (ETA) in the early 1970s. This new approach offered detection capability for lead of ~0.1 ppb, roughly 200-fold better than FAA. However, its major benefit for the analysis of blood samples was the ability to dilute and inject the sample automatically into the graphite tube with very little off-line sample preparation. In addition, because the majority of the matrix components were driven off before atomization at 2700 °C, interferences were generally less severe than with the Delves Cup, which used a much cooler acetylene flame to generate the atoms. This breakthrough meant that blood lead determinations, even at extremely low levels, could now be carried out routinely in an automated fashion.

The next major milestone in AS was the development of Zeeman background correction (ZBGC) in 1981, which compensated for the nonspecific absorption and structured background produced by complex biological matrices like blood and urine. In conjunction with the stabilized temperature platform furnace (STPF) concept, ZBGC allowed virtually interference-free graphite furnace atomic absorption (GFAA) analysis of blood samples (4). The ZBGC/STPF approach succeeded largely because many different kinds of samples could be analyzed against simple aqueous standards, and it quickly became the standard way of handling most types of complex matrices by GFAA.

Although GFAA had been the accepted way of doing blood lead determinations for more than 15 years, the commercialization of inductively coupled plasma–mass spectrometry (ICP–MS) in 1983 gave analysts a tool that was not only 50–100 times more sensitive but also less prone to matrix-induced interferences than GFAA. In addition, ICP–MS offered multielement capability and much higher sample throughput, making it very attractive to the clinical community. Figure 1 shows the improvement in detection capability of ICP–MS compared with the other AS techniques. It must be emphasized that these are approximate aqueous lead detection limits and are shown for comparative purposes; they do not represent detection levels achievable directly in blood.

Isotopic Fingerprinting
An added benefit of the ICP–MS technique is that it also offers isotopic measurement capability. This is a very attractive feature for many clinical labs, because it gives them the ability to carry out isotopic tracer (5), dilution (6), and ratio (7) measurements, which are beyond the reach of the other traditional AS techniques. In fact, the isotopic measurement capability allows researchers to pinpoint the source of lead poisoning by comparing the isotope ratios of blood lead samples with those of possible sources of lead contamination. The principle behind this approach, known as isotopic fingerprinting, is based on the fact that lead has four naturally occurring isotopes: 204Pb, 206Pb, 207Pb, and 208Pb. Thus, when lead is ionized in the plasma, it generates four ions, all with different atomic masses. This can be seen in Figure 2, which shows a mass spectrum of the four lead isotopes and their relative natural abundances.
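As a concrete illustration, the sketch below derives isotope ratios from an ICP–MS scan of the lead mass region. The ion counts are hypothetical numbers invented for the example; the natural abundances are approximate textbook values and are listed only for comparison.

# Hypothetical ion counts for the four lead isotopes from a single ICP-MS scan
counts = {204: 14200, 206: 241500, 207: 221800, 208: 524900}
natural_abundance = {204: 1.4, 206: 24.1, 207: 22.1, 208: 52.4}  # percent, approximate

total = sum(counts.values())
for mass in sorted(counts):
    measured_pct = 100.0 * counts[mass] / total
    print(f"{mass}Pb: measured {measured_pct:5.1f}%  (natural ~{natural_abundance[mass]:.1f}%)")

# Ratios relative to the non-radiogenic 204Pb, the basis of isotopic fingerprinting
for mass in (206, 207, 208):
    print(f"{mass}Pb/204Pb = {counts[mass] / counts[204]:.2f}")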

All the lead isotopes, with the exception of 204Pb, are radiogenic; that is, they are products of the radioactive decay of either uranium or thorium. Therefore, their relative abundances vary depending on the rock type and geological area. This means that in all lead-based materials and systems, 204Pb is the only isotope that has remained essentially unchanged, at about 1.4%, since the earth was first formed. The ratios of the isotopic concentrations of 208Pb, 207Pb, and 206Pb to that of 204Pb will therefore vary depending on the source of lead. This fundamental principle is used to match lead isotope ratios in someone's blood to a particular environmental source of lead contamination, as sketched below.
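The matching step, in its simplest form, compares the 206Pb/204Pb, 207Pb/204Pb, and 208Pb/204Pb ratios measured in a blood sample against candidate sources and picks the closest. All of the ratio values below are hypothetical, and a real study would also propagate measurement uncertainties rather than rely on a single distance number.

import math

def distance(a, b):
    """Euclidean distance between two sets of isotope ratios."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

# Hypothetical ratio sets for a blood sample and two candidate lead sources
blood = {"206/204": 18.7, "207/204": 15.6, "208/204": 38.6}
sources = {
    "pottery glaze":   {"206/204": 18.8, "207/204": 15.6, "208/204": 38.7},
    "leaded gasoline": {"206/204": 17.9, "207/204": 15.5, "208/204": 37.6},
}

for name, ratios in sources.items():
    print(f"{name:16s} distance from blood = {distance(blood, ratios):.3f}")
best_match = min(sources, key=lambda name: distance(blood, sources[name]))
print(f"Closest isotopic match: {best_match}")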

A recent study of a group of people living in a small village near Mexico City provides an excellent example of using isotope ratios to pinpoint the source of lead poisoning (8). Many of the residents had abnormally high levels of lead in their blood. Researchers identified two potential sources of lead poisoning: leaded gasoline that had contaminated the village's soil, or the glazed ceramic pots that residents used for food preparation. For this study, the lead isotope ratios were measured using an electrothermal vaporization (ETV) sampling accessory coupled to the ICP–MS. An ETV system uses a heated graphite tube (similar to that used in GFAA) to thermally pretreat the sample. But instead of using the tube to produce ground-state atoms, its main function is to drive off the bulk of the matrix before the analytes are vaporized into the plasma for ionization and measurement by the mass spectrometer. The major benefit of ETV-ICP–MS for this application is that complex matrices like blood, gasoline, and pottery clay can be analyzed with very little interference from the matrix components. An additional benefit with regard to taking blood samples is that typically only a 20- to 50-µL aliquot is required for analysis. Figure 3 shows a schematic of how the ETV-ICP–MS system works, with its two distinct steps: prevaporization to drive off the matrix components, and vaporization to sweep the analyte vapor into the ICP–MS for analysis.
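To make the two-step sequence concrete, the following is a hypothetical ETV temperature program written as plain data. The step names mirror the description above, but the temperatures, hold times, and the extra drying step are assumptions for illustration only; they are not taken from the study or from any particular instrument.

# Hypothetical ETV program: only the final step sends vapor to the plasma
etv_program = [
    {"step": "dry",             "temp_C": 120,  "hold_s": 30, "vapor_to_plasma": False},
    {"step": "prevaporization", "temp_C": 700,  "hold_s": 45, "vapor_to_plasma": False},  # drive off matrix
    {"step": "vaporization",    "temp_C": 2400, "hold_s": 6,  "vapor_to_plasma": True},   # analyte to ICP-MS
]

for step in etv_program:
    destination = "swept into the ICP-MS" if step["vapor_to_plasma"] else "vented (matrix removal)"
    print(f'{step["step"]:16s} {step["temp_C"]:>5} degC for {step["hold_s"]:>2} s -> {destination}')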

Using this approach, the lead isotope ratios in blood samples from a group of residents were compared with those of the two likely sources of lead contamination. When the data were analyzed, the isotope ratios for both the blood and the cookware clustered tightly around the "primeval" lead value (the original composition, assuming no additions from radioactive decay), whereas the gasoline data, which have a very different lead isotopic signature, grouped on their own. These data therefore provided very convincing evidence that the residents of this village were being poisoned by eating from the clay pots they used for cooking, not by exposure to soil tainted with leaded gasoline.

Final Analysis
There is no question that developments in AS have helped us to better understand the toxic effects of lead over the past 30 years. AS has allowed us to lower the level considered dangerous in young children from 60 to 10 µg/dL, helped to reduce the percentage of U.S. children with elevated blood lead levels from 88.2% to 4.4%, and allowed us to pinpoint with a high degree of certainty the environmental sources of lead contamination. Such is the power and versatility of modern AS instrumentation and its accessories, however, that it has also dramatically improved our understanding of other trace metal-related human diseases. The toxic effects of arsenic and hexavalent chromium, and the nutritional benefits of iron and selenium, would still be relatively unknown if it weren't for the continual improvements in AS instrumentation. Although the technique has been successfully applied to many other application areas, there is no question that its use as a biomedical and environmental research tool has had a direct impact on the quality of many people's lives.

References

  1. Thomas, R. J. Today's Chemist at Work 1999, 8, 42.
  2. Constantini, S.; Giordano, R.; Rubbing, M. Microchem. J. 1987, 35, 70.
  3. Delves, H. T. Analyst 1970, 95, 431.
  4. Slavin, W. Sci. Total Environ. 1988, 71, 17.
  5. Ting, B. T. G.; Janghorbani, M. Anal. Chem. 1986, 58, 1334–1340.
  6. McLaren, J. W.; Beauchemin, D.; Berman, S. S. Anal. Chem. 1987, 59, 610.
  7. Manton, W. I. J. Toxicol. Clin. Toxicol. 1998, 36, 705–706.
  8. Chaudhary-Webb, M.; Paschal, D. C.; Elliott, W. C.; Hopkins, H. P.; Ghazi, A. M.; Ting, B. C.; Romieu, I. At. Spectrosc. 1998, 19, 156.


Robert J. Thomas is a consultant with Scientific Solutions, which specializes in technologies related to trace element analysis. Send your comments or questions regarding this article to tcaw@acs.org or the Editorial Office, 1155 16th St N.W., Washington, DC 20036.
