James S. MacDonald, Richard T. Robertson, Toxicity Testing in the 21st Century: A View from the Pharmaceutical Industry, Toxicological Sciences, Volume 110, Issue 1, July 2009, Pages 40–46, https://doi.org/10.1093/toxsci/kfp088
Extract
The report by the U.S. National Research Council entitled Toxicity Testing in the 21st Century: A Vision and a Strategy (National Research Council, 2007) lays out a bold vision for the future of toxicity testing of chemicals, based on the explosive changes that have been occurring in the basic biological sciences. Our understanding of basic cellular biology has grown remarkably in recent years, to the point where many cellular processes are well characterized at the molecular level.
In contrast to this dramatic change, the approaches used to assess human risk of adverse effects from chemical exposure have changed little over the last several decades. This is reflected in the global regulatory requirements for registration of new agricultural, veterinary, and human pharmaceutical chemicals; the data requirements for these classes of chemicals have changed little since their establishment three and sometimes four decades ago, despite the dramatic advances in the sciences underlying this activity. This fact was a central point raised in the Food and Drug Administration's assessment of the processes used specifically for drug development in its Critical Path document (FDA, 2004). The Agency saw improved prediction of important human adverse effects as central to accelerating the new drug discovery and development process and to bringing more effective medicines forward safely and quickly.