Toxicity Testing Is On The Verge Of A Paradigm Shift
What does the discovery of penicillin have to do with the invention of the computer? Both were pivotal breakthroughs that triggered paradigm shifts in how we understand biology and in what we are capable of computationally, respectively. Many scientists point to toxicology as one of the next fields approaching such a pivotal point. Recent revolutions in biology and biotechnology could finally transform toxicity testing from a system based on whole-animal testing to one built on advanced in vitro methods that can predict the toxic effects of chemical compounds on the human body effectively and, most importantly, accurately. Anticipating such a critical moment, the U.S. Environmental Protection Agency (EPA) asked the National Research Council (NRC) to develop a vision for toxicity testing and a strategic plan for implementing it.
Toxicological testing today is still largely a system that formed incrementally through decades of routine protocol expansion, and it has never been seriously assessed against modern risk-assessment and risk-management needs. As a result, the methods still employed worldwide are costly, use millions of laboratory animals, and take years to generate and analyze data. Such a patchwork approach, where tests are simply added on top of legacy procedures, has not provided a satisfactory solution. We are still unable to deliver all of the following at once:
- depth: accurate, relevant information for hazard identification and dose-response assessment;
- breadth: data on a broad range of chemicals, endpoints, and life stages;
- animal welfare: minimizing animal suffering and using as few animals as possible;
- conservation: minimizing costs and time for testing and regulatory review.
Current approaches rely primarily on observing adverse biologic responses in highly specific and uniform groups of animals exposed to high doses of a given substance, and the relevance of such animal studies is questionable: risk assessments based on these data often fail to translate to varied human populations, which are typically exposed to much lower concentrations. Add the high costs, long study durations, and large numbers of animals used, and it becomes clear why only a limited range of chemical compounds has ever been evaluated with these methods. We need to step up our game: while there were about 62,000 chemicals on the market in 1979, we were up to 82,000 in 2010, with roughly 700 new ones introduced each year.
Beyond being time- and resource-intensive, such testing also yields only limited mechanistic information. It is therefore a struggle to understand the mechanisms behind the adverse health effects observed in humans, and these deficiencies in predictive power further limit the usefulness of animal-based toxicity testing.
The vision and strategy, proposed by a committee of 22 experts in 2007 and spanning over 100 pages, describe many specific components of modern toxicology in great detail: chemical characterization, toxicity testing, dose-response and extrapolation modelling, population-based and human exposure data, and risk contexts. The report also highlights various potential tools and technologies; many are already available, and the committee expected them to keep evolving and maturing over time. Such advances should widen the coverage of chemicals of concern, reduce the cost of acquiring robust toxicity-test data, and greatly reduce the use of animals.
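To make one of these components concrete, consider dose-response modelling: in its simplest form, it means fitting a mathematical curve to responses measured at different doses and reading off quantities such as the EC50, the dose producing a half-maximal effect. The sketch below is a hypothetical illustration, not an example taken from the NRC report: it fits a standard four-parameter Hill model to made-up in vitro viability numbers using Python and SciPy, and every value in it (doses, readouts, starting guesses) is an assumption chosen for demonstration only.

```python
# A toy dose-response fit: the four-parameter Hill model,
#   response = bottom + (top - bottom) * dose^n / (EC50^n + dose^n),
# is the kind of curve routinely fitted to in vitro assay readouts.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, n):
    """Predicted response at a given dose under the Hill model."""
    return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

# Hypothetical assay readout: cell viability (%) at rising concentrations (uM).
doses = np.array([0.01, 0.1, 1.0, 10.0, 100.0])
viability = np.array([98.0, 95.0, 70.0, 30.0, 8.0])

# Fit the model; p0 supplies rough starting guesses for the optimizer.
params, _ = curve_fit(hill, doses, viability, p0=[5.0, 100.0, 1.0, 1.0])
bottom, top, ec50, n = params
print(f"Estimated EC50: {ec50:.2f} uM, Hill slope: {n:.2f}")
```

Extrapolation modelling then asks what a curve like this implies at the far lower exposures typical of human populations, which is exactly where the pathway-based models discussed below are expected to help.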
However, much still needs to come together for the move toward a mechanism-oriented testing paradigm to succeed. Specifically, the researchers indicate that implementation must include:
- wide availability of in vitro tests: preferably ones based on human-derived cells, cell lines, or components, ones that could enable a comprehensive evaluation of toxicity pathways;
- the availability of targeted tests to complement the in vitro tests: so that adequate data can be gathered overall to enable evidence-based decision making;
- developed models of toxicity pathways: a thorough understanding is key to predicting which general-population exposures could cause adverse effects;
- infrastructure changes to support the basic and applied research needed to develop the tests and the pathway models;
- validation of tests and test strategies: necessary for incorporating the methods into chemical-assessment guidelines;
- acceptance: stakeholders must be confident that such results are adequately predictive and can be used in decision making.
Given the inherent complexities of revolutionising a scientific field and the many challenges highlighted above, the committee believed that the development of new assays and the related basic science still required a substantial amount of research. With concerted efforts by many research groups, high-throughput testing technology was expected to mature within the 10 years following the vision's publication, substantially improving our capacity to identify toxicity hazards. Implementing the full vision, in which science can rapidly and inexpensively assess large numbers of substances with adequate coverage of possible endpoints, was foreseen to take a further decade or more. Today we can see that we are slowly getting closer to toxicity testing that's firmly based on human biology.
With all of the recent advances in biotechnology, animal advocates can breathe a sigh of relief: more than 10 years ago, experts already expected the envisioned transition away from animal testing to generate more robust data on potential toxicity risks and to expand our capacity to test chemicals efficiently. To accelerate this shift, however, scientists and non-scientists must work together to advance the field quickly and achieve the full impact of the committee's recommendations. The transformative paradigm shift the committee calls for can only be realized with widespread trust in modern methods and support from the scientific community, governing bodies, and society at large.
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4410863/