
Government & Policy

June 22, 2009
Volume 87, Number 25
pp. 30-33

Next-Generation Risk Assessment

EPA’s plan to adopt in vitro methods for toxicity testing gets mixed reviews from stakeholders

Britt E. Erickson

PARADIGM SHIFT Advances in genomics and computational toxicology are paving the way for new approaches to chemical risk assessment. Shutterstock/C&EN

HIGH THROUGHPUT Robotic technologies make it possible to screen the biological activity of more than 1 million chemicals per day. NIH

NO MORE MICE New in vitro assays for predicting toxicity of chemicals in humans may eventually eliminate the need for studies using mice. Maggie Bartlett/National Human Genome Research Institute

A series of reports from the National Research Council (NRC), advances in high-throughput screening, and a deluge of data anticipated from the European Union's program for Registration, Evaluation, Authorization & Restriction of Chemicals (REACH) have spurred the Environmental Protection Agency to strongly consider revamping how it performs chemical risk assessments.

With more than 80,000 chemicals currently on the market, and about 700 new ones added each year, a lot is at stake, observers say. The challenge is getting policymakers, Congress, and the courts to accept a new approach—one that would involve shifting from expensive and time-consuming whole-animal studies to in vitro tests with human cell lines.

A paradigm shift in chemical risk assessment will not be easy. It will require a sustained commitment of resources, collaboration, and the political will to push the effort forward. Many advocates for change say the current system for ensuring the safety of chemicals is broken and that EPA needs to do something radical to fix it. Others advise caution.

In a strategic plan released earlier this year, EPA laid out a framework for incorporating advances in genomics and computational sciences into toxicity testing and risk assessment. Over the next 10 years, the agency plans to rely increasingly on so-called toxicity pathways in its risk assessments. The approach entails examining the network of pathways formed when genes, proteins, and small molecules interact. The goal is to understand how chemical exposures can perturb such pathways and lead to a cascade of events that ultimately cause adverse health effects.
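The pathway concept can be pictured as a small interaction network in which a perturbation at one node propagates downstream. The Python sketch below is purely illustrative; the node names, edges, and the single linear cascade are hypothetical simplifications, not taken from EPA's plan.

```python
# Illustrative toy network: a chemical perturbation propagates from a
# receptor through signaling steps toward a downstream adverse outcome.
# Node names and edges are hypothetical, not from EPA's strategic plan.
from collections import deque

EDGES = {
    "chemical_X": ["receptor_A"],               # exposure binds a receptor
    "receptor_A": ["kinase_B"],                 # receptor activates a kinase
    "kinase_B": ["transcription_factor_C"],
    "transcription_factor_C": ["stress_response_genes"],
    "stress_response_genes": ["adverse_outcome"],
}

def downstream_of(source):
    """Breadth-first walk: every node reachable from the perturbed one."""
    seen, queue = set(), deque([source])
    while queue:
        for nxt in EDGES.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print("adverse outcome reached:", "adverse_outcome" in downstream_of("chemical_X"))
```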

Because the science behind this new approach is rapidly evolving, EPA asked the National Academies to convene a public symposium where all stakeholders could provide input and help EPA define this next generation of risk assessment.

“It’s been very clear for some time now that enormous changes are taking place in the arena of risk assessment, yet we are not able to pull all the threads together at this point,” Peter Preuss, director of EPA's National Center for Environmental Assessment, said as he kicked off the symposium last month at the National Academies, in Washington, D.C.

Preparing for a paradigm shift in risk assessment is like trying to turn around an aircraft carrier with two tsunamis approaching, Preuss suggested. Those tsunamis, he said, are the enormous quantities of data on tens of thousands of chemicals that will be coming in under REACH and from hundreds of recently developed high-throughput tests.

“We can’t ignore either one,” Preuss said. “But we obviously can’t fully incorporate everything that is coming at us.”

One set of tools EPA will use to study toxicity pathways is high-throughput-screening assays. Because these tests are fast and can analyze many samples simultaneously, they will allow researchers to examine the adverse effects of chemicals, and even chemical mixtures, under a slew of different exposure conditions in a fraction of the time that animal studies require. And because the assays are done with human cell lines, results do not have to be extrapolated from laboratory animals to humans.
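Each such assay typically yields a concentration-response curve for every chemical, from which a potency value can be estimated. Below is a minimal sketch of that step, assuming synthetic data and a standard Hill-equation fit with SciPy; the numbers are invented for illustration.

```python
# Minimal sketch: fit a Hill curve to synthetic concentration-response
# data from one high-throughput assay and report the AC50 (the
# concentration producing half-maximal activity). Data are invented.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, top, ac50, slope):
    """Hill equation: assay response as a function of concentration."""
    return top / (1.0 + (ac50 / conc) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # micromolar
resp = np.array([2.0, 5.0, 12.0, 30.0, 55.0, 80.0, 92.0])  # % activity

(top, ac50, slope), _ = curve_fit(hill, conc, resp, p0=[100.0, 1.0, 1.0])
print(f"AC50 ~ {ac50:.2f} uM (top = {top:.1f}%, Hill slope = {slope:.2f})")
```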

Current animal-testing methods have enormous uncertainties, Lynn R. Goldman, professor of environmental health sciences at Johns Hopkins Bloomberg School of Public Health, noted during the symposium. The high doses used on animals in many tests are not necessarily relevant to the kinds of exposures humans have every day, she said. “We are extrapolating in ways that we all know are quite shaky,” she added.

Goldman and many others at the meeting also emphasized the need to do a better job of predicting risk to sensitive populations, such as children, the elderly, or people with genetic polymorphisms that make them more susceptible to a particular chemical. The use of in vitro tests with human cell lines could help shed light on this area, too.

Momentum appears to be building in support of a drastic shift in risk assessment, spurred in part by several high-profile NRC reports, including the often-cited 2007 report “Toxicity Testing in the 21st Century: A Vision and a Strategy.” The reports generally conclude that current toxicity testing can’t provide all the data needed for risk assessment.

Preuss echoed this message at the symposium. “There are simply too many chemicals, too many tests, and too many questions,” he said. Using animals to test chemicals thoroughly is time-consuming and expensive, and “we just need to move on to something else,” he added.

Anticipating that bioinformatics will be a heavy component of any new risk-assessment approach, EPA created the National Center for Computational Toxicology (NCCT) in November 2004. The center plans to use a program called ToxCast, launched in 2007, to rapidly predict whether chemicals are toxic and to help prioritize which ones should be targeted for testing (C&EN, Aug. 13, 2007, page 36).

ToxCast relies on high-throughput-screening tools developed by the pharmaceutical industry for drug discovery and combines those with predictive computational methods. The idea is to cast a broad net and cover as many different toxicity pathways as possible, Robert J. Kavlock, director of NCCT, explained at the symposium.

To test the ToxCast system, about 300 pesticides and a handful of other chemicals, including endocrine disrupters such as phthalates and bisphenol A, were put through about 500 high-throughput assays. The chemicals were chosen because the agency has a plethora of toxicological data on them from whole-animal studies in mice and rats, Kavlock noted.

As a result, NCCT is now swimming in data from the high-throughput assays, and it is trying to determine whether the data agree with toxicity studies conducted with animals.
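One simple way to frame that question is a concordance check: for each chemical, compare a binary in vitro call against the in vivo finding and count agreements. The sketch below is hypothetical; the chemical names and calls are invented, and real ToxCast analyses are far more elaborate.

```python
# Hypothetical concordance check between binary in vitro "hits" and
# in vivo animal findings. All chemicals and calls below are invented.
records = [
    # (chemical, in_vitro_hit, in_vivo_toxic)
    ("pesticide_1", True,  True),
    ("pesticide_2", True,  False),   # in vitro false positive
    ("pesticide_3", False, False),
    ("pesticide_4", False, True),    # in vitro false negative
    ("pesticide_5", True,  True),
]

agree = sum(1 for _, vitro, vivo in records if vitro == vivo)
print(f"concordance: {agree}/{len(records)} = {agree / len(records):.0%}")
```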

The next phase is to test additional chemicals of different structures and uses that don’t have as much toxicity data available. NCCT also plans to test drugs that failed in human clinical trials because of toxicity. “That starts to get us to predicting toxicity in humans rather than in rats,” Kavlock noted.

“Pfizer has agreed to supply us with in excess of 100 drugs that have gone into clinical trials and have exhibited human toxicity,” Kavlock said, referring to an agreement NCCT signed with the company in May. For each drug, Pfizer will provide NCCT with the compound, its chemical structure, and its preclinical and clinical data. Each chemical will be run through ToxCast, and the information will be made publicly available, he added.

NCCT is also collaborating with the National Toxicology Program (NTP) and the National Institutes of Health's Chemical Genomics Center to develop a library of about 10,000 previously screened chemicals, Kavlock noted. Approximately one new assay will be tested against that library each week, he said.

“We’d like to bring in the Food & Drug Administration,” he noted, because the agency’s Center for Drug Evaluation & Research has extensive human and animal toxicity data.

Although the new methods are promising, a lot still needs to be done, scientists say. The use of in vitro methods with human cell lines will eliminate one extrapolation but will create others: from in vitro to in vivo systems, and from simple cells to complex organisms, Kim Boekelheide, a professor of medical sciences at Brown University, pointed out during the symposium.

To address the issue of extrapolating from simple cells to complex organisms, EPA’s NCCT is developing so-called virtual tissues (v-Tissues). These v-Tissues are designed to computationally simulate key molecular and cellular processes that occur in normal biological tissues. NCCT is actively working to develop a virtual liver as well as a virtual embryo, Kavlock noted.
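As a cartoon of what such a simulation computes, consider a single well-stirred liver compartment that takes up a chemical from blood and clears it by first-order metabolism. The sketch below integrates that toy model with SciPy; the rate constants are arbitrary placeholders and bear no relation to NCCT's actual v-Liver.

```python
# Toy "virtual liver": one well-stirred compartment takes up a chemical
# from blood and clears it by first-order metabolism. Rate constants
# are arbitrary placeholders, not parameters from NCCT's v-Liver.
import numpy as np
from scipy.integrate import solve_ivp

K_UPTAKE = 0.5  # per hour, blood -> liver transfer
K_MET = 0.2     # per hour, metabolic clearance in the liver

def rates(t, y):
    blood, liver = y
    uptake = K_UPTAKE * blood
    return [-uptake, uptake - K_MET * liver]

sol = solve_ivp(rates, (0.0, 24.0), [100.0, 0.0],
                t_eval=np.linspace(0.0, 24.0, 7))
for t, amount in zip(sol.t, sol.y[1]):
    print(f"t = {t:4.1f} h   amount in liver = {amount:6.2f}")
```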

Another issue raised by Boekelheide is the need to determine how many toxicity pathways exist. Anywhere from 10 to 200 cancer pathways have been identified, he said, adding that a similar range may be reasonable for toxicity-pathway-based risk assessment.

Boekelheide served on the NRC committee that produced the “Toxicity Testing in the 21st Century” report. He told participants at the symposium that he had come to the committee as a skeptic but was transformed into a believer and advocate of the new approach. To highlight the need for change, he cited the example of how sunscreens are tested for toxicity by giving them orally to laboratory animals. “How does that tell us about the way humans use it in sunlight?” he asked.

Several people at the meeting urged EPA to be cautious when moving forward with a new risk-assessment approach because the methods under consideration are still new and much of the current modeling effort has focused on data-rich compounds such as pesticides. It is hard to say whether those models would accurately predict the toxicity of data-poor chemicals.

One of the biggest debates about using toxicity pathways for risk assessment will be how to differentiate between adaptive and adverse responses, many experts predicted. When a chemical agent perturbs a pathway, an adverse effect does not always occur; often, cells will adapt to the perturbation. Until gene-expression data can be directly correlated with adverse health outcomes, they will need to be used in conjunction with traditional toxicological end points, EPA officials acknowledged at the symposium.
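One crude way to encode that logic is to treat a pathway perturbation as potentially adverse only when it exceeds some adaptive capacity and a traditional end point corroborates it. The decision rule and threshold below are illustrative assumptions, not an accepted criterion.

```python
# Illustrative decision rule only: combine a pathway-perturbation score
# with a traditional toxicological end point. The threshold is made up.
ADAPTIVE_THRESHOLD = 2.0  # hypothetical fold change that cells can buffer

def classify_response(pathway_fold_change, traditional_endpoint_positive):
    """Label a response adaptive or adverse (toy rule, not EPA policy)."""
    if pathway_fold_change <= ADAPTIVE_THRESHOLD:
        return "adaptive"          # cells likely buffer the perturbation
    if traditional_endpoint_positive:
        return "adverse"           # corroborated by a classical end point
    return "flag for follow-up"    # perturbed, but not yet correlated

print(classify_response(1.5, False))  # adaptive
print(classify_response(3.2, True))   # adverse
print(classify_response(3.2, False))  # flag for follow-up
```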

EPA has tried to incorporate new assays into some of its work in the past with little success. Of particular relevance to the current debate is the Endocrine Disrupter Screening Program (EDSP), which has been plagued with controversy. Many participants urged EPA to heed the lessons from its attempts to implement a paradigm shift in that program.

"Using animals to test chemicals thoroughly is time-consuming and expensive, and “we just need to move on to something else.”"

When Congress directed EPA in 1996 to develop a process for screening and testing chemicals for their ability to disrupt the endocrine system, the agency responded by establishing EDSP. For this new toxicity-testing program, EPA decided to use what were, at the time, new high-throughput assays. But 13 years later, EDSP has yet to get off the ground. According to EPA, the delay occurred because the agency needed more time to validate the assays, which are now about 15 years old.

Nevertheless, EDSP is finally poised to take off. Last April, EPA issued a list of 67 chemicals, consisting of pesticides and inert ingredients used in pesticide formulations, that will undergo the first round of testing. It also issued the policies and procedures it will use to order testing under the program, and the agency expects to send the first test orders to pesticide manufacturers next month.

EDSP has little support from the scientific and stakeholder communities, even among environmental groups, largely because it has dragged on for so long without producing any results. “After a certain amount of time, people just get tired,” Johns Hopkins’ Goldman said. It is so important to have “a sustained and long-term commitment” to change any government process, she emphasized. Otherwise, “you won’t get over the finish line.”

Another lesson from EDSP is that EPA can expect legal challenges to any new risk-assessment approach, particularly if it hasn’t been fully validated, said Gina Solomon, a senior scientist with the environmental group Natural Resources Defense Council and an associate clinical professor of medicine at the University of California, San Francisco.

At the symposium, Solomon noted that the pesticide industry is busily coming up with ways to challenge EPA’s EDSP test orders. Indeed, at a meeting hosted by the pesticide trade group CropLife America in April, several people pointed out that after more than a decade, the assays selected by EPA for EDSP are still insufficiently validated.

As EPA moves forward with EDSP using 15-year-old assays, newer ones, such as those being discussed for pathway-based risk assessment, are bound to hit similar roadblocks. But a top official at NTP dismissed the need to formally validate in vitro assays for risk assessment at this stage.

“We want to reconcile results from our new data-rich technologies—genomics, proteomics, and high-throughput screens—with the existing testing information that we have from our program, for what I call ‘conceptual validation,’ ” explained John R. Bucher, associate director of NTP. “If we are not convinced that this is worth anything, nobody else is going to be convinced.” Once the conceptual validation is complete, he said, the tests should be formally validated, as required by law.

Solomon agreed and suggested that EPA “go out on a limb” and start applying pathway-based approaches to risk assessment. “The hormonally active agents are a great place to start,” she said. “We know a fair amount about those pathways.” Another good starting point would be to look at carbon nanotubes and asbestos together because they stimulate the same oxidative stress response in macrophages in the lungs, she noted. EPA needs to start pushing the envelope because otherwise “we are going to keep grinding over and over again the same data-rich chemicals,” she said.

Modernizing EPA’s risk-assessment methods will also need a sustained investment, participants noted. When it comes to adopting new methods for risk-assessment purposes, “we have failed to present a compelling need for resources,” emphasized Tina Bahadori, managing director of the Long-Range Research Initiative at the American Chemistry Council (ACC), an industry group. The U.S. hasn’t invested in this area, she said at the symposium.

On the other hand, ACC has chosen to invest in three areas related to modernizing risk-assessment approaches, Bahadori noted. They include developing the science of interpretation to make better sense of data, looking at exposure science to help prioritize chemicals for testing, and understanding susceptibility.

Most people agree that it will be many years before EPA can rely solely on new high-throughput methods for risk assessment. In the meantime, new technologies will likely be used in tandem with conventional rodent-based methods for predicting toxicity.

As a first step, EPA is planning to test the new approach with three case studies—ambient air pollutants, phthalates, and biomonitored mixtures. “Several air pollutants act via the same mode of action to produce the same disease” and therefore can be viewed as a family, EPA’s Preuss noted. Likewise, a recent NRC report recommended that EPA consider exposure to all androgen-disrupting chemicals when assessing the risks of phthalates (C&EN, Dec. 22, 2008, page 9), which are common plasticizers found in numerous consumer products.

Now is the time for EPA to address the issue of modernizing risk assessment, Preuss stressed. “We can’t continue to wait until someone shows us the way.”

Chemical & Engineering News
ISSN 0009-2347
Copyright © 2011 American Chemical Society