Government and Industry Solutions to Global Toxicity

In most industries, the onus is on the manufacturer to research its product and provide evidence that it is safe for the public. Surprising to many, the one exception is the chemicals industry. For decades, outdated laws have allowed chemical manufacturers to sell products by default UNLESS those products are proven unsafe. Such lax regulation has allowed a plethora of untested chemicals onto the market: over 80,000 chemicals are in everyday use, and fewer than 300 have been fully tested and restricted. The global toxicity problem is a direct result of this lack of oversight. There are now a number of attempts to tackle the problem and protect the public.

The Human Toxome Project

Modern manufacturing has produced over 80,000 chemicals, most of them untested; until recently, it was estimated that only about 300 had been tested. New robotic technology developed by the EPA now offers a way to test all of these chemicals rapidly and exhaustively. The Human Toxome Project is an international scientific collaboration that aims to map human toxicology effects systematically, employing tools such as the EPA's robotic testing system to bring order to this vast number of untested chemicals and establish, once and for all, their human health impact. It is envisioned to do for toxicology what the Human Genome Project did for genetics.

From the Human Toxome Project page:

Toxicity testing typically involves studying adverse health outcomes in animals subjected to high doses of toxicants with subsequent extrapolation to expected human responses at lower doses. The system relies on the use of a 40+ year-old patchwork of animal tests that are expensive (costing more than $3B per year), time-consuming, low-throughput and often provide results of limited predictive value for human health effects. The low-throughput of current toxicity testing approaches (which are largely the same for industrial chemicals, pesticides and drugs) has led to a backlog of more than 80,000 chemicals to which humans are potentially exposed whose potential toxicity remains largely unknown.

In 2007, the National Research Council (NRC) released the report “Toxicity Testing in the 21st Century: A Vision and a Strategy”, that charted a long-range strategic plan for transforming toxicity testing. The major components of the plan include the use of predictive, high-throughput cell-based assays (of human origin) to evaluate perturbations in key toxicity pathways, and to conduct targeted testing against those pathways. This approach will greatly accelerate our ability to test the vast “storehouses” of chemical compounds using a rational, risk-based approach to chemical prioritization, and provide test results that are hopefully far more predictive of human toxicity than current methods.

Although a number of toxicity pathways have already been identified, most are only partially known and no common annotation exists. Mapping the entirety of these pathways (i.e. the Human Toxome) will be a large-scale effort, perhaps on the order of the Human Genome Project. Here, we propose to comprehensively map pathways of endocrine disruption, representing a first step towards mapping the human toxome. We will leverage our rapidly evolving scientific understanding of how genes, proteins, and small molecules interact to form molecular pathways that maintain cell function, applying orthogonal “omics” approaches (transcriptomics, metabolomics) to map and annotate toxicity pathways for a defined set of endocrine disruptors.

Following the identification of toxicity pathways, we will conduct a series of stakeholder workshops to enable the development of a consensus-driven process for pathway annotation, validation, sharing and risk assessment, and develop a public database on toxicity pathways, providing a common, community-accessible framework that will enable the toxicology community at large to comprehensively and cooperatively map the human toxome using integrated testing strategies. Finally, we will validate the identified pathways of toxicity and extend the concepts to additional toxicants, cell systems and endocrine disruptor hazards as well as to additional omics platforms and to dose response modeling.

The Human Toxome Project will begin by comprehensively mapping pathways of endocrine disruption (ED), for the following reasons:

  1. Several large-scale testing programs, whose results can be leveraged for this project, are already under way.
  2. These testing programs have led to the prioritization of cellular assays and reference compounds that will be used in the project: by EPA, NIEHS and the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) at the national level, and by the OECD (Organisation for Economic Co-operation and Development) at the international level.
  3. The physiological pathways of the endocrine system are relatively well understood, making Pathways of Toxicity (PoT) identification simpler for ED than for other potential target organs.

Goals

The central goal of this project is to define, experimentally verify and systematically annotate ED PoT, as a proof of concept of mapping PoT by systems toxicology. Beyond that, this project will develop a common, community-accessible framework and databases that will enable the toxicology community at large to comprehensively and cooperatively map the human toxome using integrated testing strategies that combine “omics” data with computational models.

Its specific goals are:

  • Use complementary “omics” approaches (transcriptomics, metabolomics) to map and annotate PoT for a defined set of endocrine disruptors.
  • Complete the development of software and visualization tools to enable the integration, analysis and visualization of data across multiple omics hardware platforms.
  • Identify PoT and develop a consensus-driven process for pathway annotation, validation and sharing, and establish a public database on PoT (a minimal record sketch follows this list).
  • Validate PoT and extend the PoT concept to additional toxicants.
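
To make the "common, community-accessible framework" concrete, here is a minimal sketch in Python of how a single PoT database record might be structured. The field names and example values are illustrative assumptions, not the project's actual schema.

    from dataclasses import dataclass

    @dataclass
    class PoTRecord:
        # Hypothetical pathway-of-toxicity (PoT) database entry.
        # All field names are illustrative assumptions, not the
        # Human Toxome Project's actual schema.
        pathway_id: str                 # e.g. "PoT-ED-0001"
        name: str                       # human-readable pathway name
        reference_compounds: list       # compounds used to perturb the pathway
        omics_evidence: dict            # omics platform -> dataset accession
        validation_status: str = "candidate"   # "candidate" or "validated"

    # Illustrative entry for an endocrine-disruption pathway.
    example = PoTRecord(
        pathway_id="PoT-ED-0001",
        name="Estrogen receptor alpha activation",
        reference_compounds=["bisphenol A", "17beta-estradiol"],
        omics_evidence={
            "transcriptomics": "dataset-TX-042",   # assumed accession
            "metabolomics": "dataset-MB-017",      # assumed accession
        },
    )
    print(example.pathway_id, example.validation_status)

Under this sketch, the consensus-driven annotation process in the third goal would amount to agreed rules for when a record's validation_status may move from "candidate" to "validated".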

Research Publications for the Human Toxome Project

Hartung T, van Vliet E, Jaworska J, Bonilla L, Skinner N and Thomas R. Food for thought… systems toxicology. ALTEX 2012, 29: 119-128.
http://altweb.jhsph.edu/altex/29_2/FFTHartung.pdf

Bouhifd M, Hartung T, Hogberg HT, Kleensang A and Zhao L. Review: Toxicometabolomics. J. Appl. Toxicol. 2013 May 30. doi: 10.1002/jat.2874.
http://caat.jhsph.edu/media/Bouhifd_2013_Review_Toxicometabo.pdf

Hartung T, Luechtefeld T, Maertens A and Kleensang A. Food for thought… Integrated Testing Strategies for Safety Assessments. ALTEX 2013, 30: 3-18.
http://altweb.jhsph.edu/altex/30_1/FFTHartung.pdf

Hartung T. Pathways of Toxicity Mapping Center (PoToMaC). In: Spielmann H and Seidle T. Alternative Testing Strategies: Progress Report 2012 & AXLR8-3 Workshop Report on a “Roadmap to Next Generation Safety Testing Under Horizon 2020.” 165-169.
http://axlr8.eu/assets/axlr8-progress-report-2012.pdf

Ramirez T, Daneshian M, et al. t4 Metabolomics in Toxicology and Preclinical Research. ALTEX 2013, 30(2): 209-225.
http://caat.jhsph.edu/media/Ramirez-ALTEX-2013.pdf

Policy Reform and Business Intervention




[Image: EDF toxic chemical poster]

The Environmental Defense Fund addresses the toxicity problem in two ways:

  1. Policy reform via the US Chemical Safety Improvement Act (CSIA)
  2. Working with the private sector to identify and remove unsafe chemicals from store shelves

Policy reform via the US Chemical Safety Improvement Act (CSIA) 

  • The CSIA will mandate safety reviews of all chemicals currently on the market and require that a new chemical be found likely to meet a safety standard before approval.
  • Some amendments to the CSIA are still required, including explicit protections for the most vulnerable, such as infants and children, and a significant narrowing of the bill’s preemption of state laws. If these fundamental improvements are made, the bill would finally allow the EPA to do its job of protecting the US public from toxic chemicals.

Working with the private sector to identify and remove unsafe chemicals from store shelves

The EDF works directly with businesses to identify and remove the most hazardous chemicals from consumer products. For example, the EDF works with Walmart, the world’s largest retailer, to

  • establish a chemicals policy that covers tens of thousands of personal care and household products, with the objective of removing priority toxic chemicals from them
  • establish a policy of safer substitutes, to ensure that removing one hazardous chemical does not lead to its replacement with another of equal or greater toxicity.

Because of Walmart’s size, whenever a name-brand manufacturer must change its product formulations to comply with Walmart’s new policy, the effect ripples across the global supply chain, helping to protect consumers all around the world.

EPA Robotic Testing Tool

Of the more than 80,000 chemicals used in the U.S., only 300 or so have ever undergone health and safety testing by the EPA; in fact, only five have ever been restricted or banned. A large part of the problem is the lack of resources to test such a large backlog manually. The EPA’s new Computational Toxicology Research division has developed an innovative high-speed robotic assay-testing system designed to rapidly process the backlog of 87,000 untested chemicals in its database.

“We are screening 10,000 chemicals using these rapid tests to characterize the bioactivity of the chemicals to predict their hazard and to use that information to prioritize for further screening and testing,” says biologist David Dix, deputy director of EPA’s National Center for Computational Toxicology. “We can test a lot of chemicals with a lot of repetitions at a lot of different concentrations.”

The program, initially started at EPA as ToxCast to assess 1,000 chemicals (and known as Tox21 in its expanded form), employs a robot to speed chemical screening. Each plastic plate holds 1,536 tiny wells, and the robot deposits varying amounts of different chemicals onto human cells and human proteins in each well. “In a stack of 100, we have 150,000 combinations of chemicals and targets,” Dix says.
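
As a quick sanity check, the arithmetic works out: a stack of 100 plates at 1,536 wells each yields 153,600 wells, which matches Dix’s 150,000 figure once some wells are set aside for controls. The control fraction below is an assumption for illustration.

    # Scale of the Tox21 robotic screen, from the figures quoted above.
    wells_per_plate = 1536       # wells per plastic plate
    plates_per_stack = 100       # "In a stack of 100 ..."

    total_wells = wells_per_plate * plates_per_stack
    print(total_wells)           # 153,600 wells per stack

    # Dix's ~150,000 chemical-target combinations is consistent with
    # reserving a few percent of wells for controls (the 2% figure
    # here is an assumption, not a published number).
    usable_wells = int(total_wells * (1 - 0.02))
    print(usable_wells)          # 150,528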

The robotic system is revolutionary. Using it, the EPA and its partner agencies will generate more data about chemical toxicity in the next few years than has been generated in the past century, promising to convert a reactive science into a predictive one. The project’s successes to date include:

  1. screening more than 2,500 chemicals, including the dispersants employed to clean up BP’s 2010 oil spill in the Gulf of Mexico;
  2. creating a predictive model for liver and reproductive toxicity that accurately forecast tumor formation in rats and mice exposed to certain chemicals for two years (see the sketch after this list);
  3. modeling vascular development and endocrine disruption, an area of keen interest for human exposure to chemicals such as bisphenol A (BPA).
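
The predictive modeling in item 2 can be illustrated with a short, hedged sketch: treat each chemical’s bioactivity scores from the rapid assays as features and the two-year rodent outcome as the label, then train a standard classifier to forecast risk for untested chemicals. The synthetic data, feature count and choice of a random forest below are assumptions for illustration, not the EPA’s actual methodology.

    # Toy sketch in the spirit of ToxCast's predictive models.
    # Data, features and model choice are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Rows = chemicals; columns = bioactivity scores (0..1) from
    # hypothetical high-throughput assays.
    X = rng.random((200, 5))
    # Synthetic labels: 1 = tumor formation seen in the two-year
    # rodent study, 0 = not seen.
    y = (0.7 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(0, 0.1, 200) > 0.5).astype(int)

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X[:150], y[:150])             # train on 150 chemicals

    # Rank held-out chemicals by predicted tumor risk so the riskiest
    # candidates can be prioritized for targeted testing.
    risk = model.predict_proba(X[150:])[:, 1]
    print(np.argsort(risk)[::-1][:10])      # ten highest-risk chemicals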

ToxCast™

A major part of EPA’s Computational Toxicology research is the Toxicity Forecaster (ToxCast™), a multi-year effort launched in 2007 that uses high-throughput screening assays to expose living cells or isolated proteins to chemicals. The cells or proteins are then screened for changes in biological activity that may suggest potential toxic effects and, eventually, potential adverse health effects. These innovative methods have the potential to limit the number of required laboratory animal-based toxicity tests while quickly and efficiently screening large numbers of chemicals.
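
To give a sense of what “screened for changes in biological activity” means in practice, the sketch below fits a standard Hill (log-logistic) curve to a hypothetical dose-response series and recovers the half-maximal activity concentration (AC50), a common summary used to flag bioactive chemicals. The data points and the 30%-efficacy cutoff are assumptions for illustration.

    # Hedged sketch: fitting a Hill dose-response curve to
    # hypothetical screening data; all numbers are illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(conc, top, ac50, slope):
        # Hill equation: response as a function of concentration.
        return top / (1.0 + (ac50 / conc) ** slope)

    conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # uM
    resp = np.array([2.0, 4.0, 9.0, 22.0, 48.0, 71.0, 82.0])  # % of control

    (top, ac50, slope), _ = curve_fit(hill, conc, resp, p0=[100.0, 1.0, 1.0])
    print(f"AC50 ~ {ac50:.2f} uM, max response ~ {top:.0f}%")

    # A chemical might be flagged "active" if its fitted efficacy
    # exceeds an assumed 30%-of-control threshold.
    print("active" if top > 30.0 else "inactive")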

Figure 1: EPA ToxCast forecasting tool