Since the nineteenth century, industry worldwide has made extensive use of chemistry, generating tens of millions of new chemicals whose properties and applications have been studied. Today, over 100,000 substances are used to produce everyday objects. Thanks to an extraordinary effort imposed by the REACH Regulation, over the last 10 years European companies have studied in depth the toxicity of about 21,000 of these substances.
A step back is needed to understand the reasons for this delay, the methods used to assess human and environmental safety, and how toxicological experiments have been conducted to date.
It is well known that toxicology has drawn its conclusions from the extensive use of animals. As in many other fields, this happened because researchers resorted to models, i.e. similar but more practical, ethical, and economically sustainable systems. The paradigm was as simple as it was primitive: if a substance harms a mouse, a dog, or a monkey, it will also harm a human. On this principle, tens of millions of laboratory animals have been sacrificed in recent decades, yielding a substantial amount of data. "Classic" toxicology was built on these data, and its cornerstone is the LD50.
The LD50 is the dose of a substance capable of killing half of the subjects undergoing the experiment: the lower its value, the more toxic the tested substance. The classical methods, still widely used all over the world, are based on exposing one or more animal models to given quantities of a substance or mixture and studying the models' reaction. To calculate the LD50, a statistically significant number of animals is treated with increasing doses of the substance or mixture and the effects are recorded. In this way the LD50 is calculated for each route of exposure: oral, dermal, inhalation, intravenous, etc. The operation is repeated on different animal models: mouse, rat, gerbil, rabbit, pig, monkey, etc.
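As a minimal sketch of how an LD50 can be derived from grouped mortality data, one classic approach fits a line to logit-transformed mortality fractions against log dose and solves for the 50% point. All dose and mortality figures below are invented for illustration; real studies use formal probit analysis and regulatory test guidelines.

```python
import numpy as np

# Invented dose groups (mg/kg) and observed mortality fractions.
# Fractions must be strictly between 0 and 1 for the logit transform.
doses = np.array([25.0, 50.0, 100.0, 200.0, 400.0])
mortality = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

# Fit a straight line to logit(mortality) versus log(dose) ...
x = np.log(doses)
y = np.log(mortality / (1.0 - mortality))  # logit transform
b, a = np.polyfit(x, y, 1)                 # slope, intercept

# ... then solve for the dose at 50% mortality:
# logit = 0  =>  log(dose) = -a/b.
ld50 = np.exp(-a / b)
print(f"Estimated LD50 = {ld50:.0f} mg/kg")  # ~100 mg/kg for this data
```

With these symmetric invented data the fitted line crosses zero exactly at log(100), so the estimate lands on 100 mg/kg.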
Remarkably, even today the methods used to evaluate the hazard of substances are the same ones used 50-60 years ago by the first classic toxicologists, namely the calculation of the LD50 in animal models. There is no other scientific discipline where technological progress has stood still for so long! Beyond the ethical issue, increasingly felt by companies and consumers, this resounding technological obsolescence generates delays due to very long lead times and high costs.
Since the early 2000s, biotechnologies (a very wide range of bio-molecular tools) and the related bioinformatics (information technology applied to the study of biological data) have given rise to a technological revolution that is transforming entire sectors such as pharmacology, medicine, diagnostics and, last but not least, toxicology.
In 2007, the scientific community witnessed a long-awaited change in the toxicological field, when the US National Academies of Sciences, Engineering, and Medicine (NASEM) published a milestone report: Toxicity Testing in the 21st Century: A Vision and a Strategy. This work laid the foundations for a paradigm shift, and projects such as Tox21 and the 3Rs were born in its wake.
The new projects propose to abandon some of the methods and principles of classical toxicology and to adopt new ones, such as:
- determining the mechanism of action of the toxic agent;
- investigating effects at different levels (tissue, organ, organism, population, environment) and at different stages of development (embryo, juvenile, developing, adult);
- avoiding the use of animal models;
- minimizing time and costs compared to previous methods.
The paradigm shift drew heavily on the technological revolutions of biotechnology and bioinformatics. Today's frontier research combines in vitro models (such as 3D models that reconstruct the dermis or lung tissue, and the most modern organ-on-a-chip devices, evolving towards human-on-a-chip or body-on-a-chip) with omics technologies (such as metabolomics and transcriptomics) in a highly promising new science called Systems Toxicology.
The same foundations of the new toxicological paradigm have been adopted by translational medicine, a new branch of medicine that combines different disciplines in order to improve current medical approaches and health policies.
How the modern methods work
Modern methods are based on the combination of new converging technologies such as biotechnology, molecular biology, robotics, big-data management and artificial intelligence, to name a few.
Modern methods are also based on new concepts, including:
- chemical characterization: reactivity, stability, bioaccumulation, etc.;
- characterization of toxicological properties: target tissues, toxicity pathways, ability to perturb metabolic pathways, etc.;
- dose-response: studying the response with empirical models at increasing doses of the substance;
- exposure to the substance: determining the route of exposure, amount of substance, timing and duration, categories of exposed subjects, etc.;
- evaluation of the context: health status of the exposed categories, level of environmental pollution, etc.
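The dose-response concept above can be illustrated with an empirical Hill curve fitted to in vitro readings by brute-force least squares. The doses, response values, and grid ranges here are all invented; validated work would use dedicated statistical tooling.

```python
import numpy as np

# Invented in vitro readings: fraction of maximal response at each dose (µM).
doses = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
resp = np.array([0.05, 0.15, 0.42, 0.71, 0.90, 0.97])

def hill(dose, ec50, n):
    """Empirical Hill model: response rises from 0 to 1 around EC50."""
    return dose**n / (ec50**n + dose**n)

# Brute-force least-squares fit over a grid of (EC50, Hill slope) values.
ec50_grid = np.logspace(0, 2.5, 200)   # candidate EC50 from 1 to ~316 µM
n_grid = np.linspace(0.5, 3.0, 100)    # candidate Hill slopes
best = min(
    ((e, n) for e in ec50_grid for n in n_grid),
    key=lambda p: np.sum((hill(doses, *p) - resp) ** 2),
)
print(f"EC50 ~ {best[0]:.1f} µM, Hill slope ~ {best[1]:.2f}")
```

For these invented readings the fit lands around an EC50 in the low teens of µM with a slope slightly above 1; a real analysis would also report confidence intervals on both parameters.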
In the early years of the 21st century, Europe and the United States laid down the regulatory groundwork requiring companies to determine the toxicity of the chemicals they place on the market. REACH is probably the most complex regulation ever produced, designed to improve the protection of human health and the environment from chemicals.
The European Commission also founded and financed a centre for the study and validation of alternative models (those not based on the use of animals): ECVAM.
An essential step for the success of these technologies and their diffusion at different levels of society is the recognition of their value by regulatory authorities. The interest shown by major regulatory bodies such as the FDA and the OECD international standardization body is therefore particularly significant.
The process by which the authorities recognize a new alternative model passes through a step called validation. Validation requires the model to respond consistently when used in different laboratories under standard conditions, and normally demands investments of about 1 million euros and 10 years of work. Time and investment are therefore needed to replace the old classic models with the most modern ones.
Read-Across and Quantitative Structure-Activity Relationship (QSAR) are bioinformatics methods that make it possible to predict the hazard of substances lacking toxicological information, based on chemical characteristics shared with other substances of known toxicity. Examples of the chemical characteristics considered are the presence or absence of certain chemical groups, or more complex features such as molecular volume or the energy of molecular orbitals.
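A toy read-across might look like the following: predict the LD50 of an untested substance from the most similar substances with known toxicity. The compound names, descriptor values, LD50 figures, and similarity rule are all invented for illustration; real read-across relies on curated databases and expert-defined similarity categories.

```python
import numpy as np

# Hypothetical descriptors: [molecular weight, logP, number of halogen atoms],
# paired with invented LD50 values (mg/kg).
known = {
    "cmpd_A": (np.array([120.0, 1.2, 0.0]), 450.0),
    "cmpd_B": (np.array([130.0, 1.5, 0.0]), 400.0),
    "cmpd_C": (np.array([250.0, 4.0, 2.0]), 60.0),
    "cmpd_D": (np.array([240.0, 3.8, 2.0]), 75.0),
}
query = np.array([128.0, 1.4, 0.0])  # untested substance

# Normalize each descriptor so all dimensions contribute comparably.
mat = np.array([d for d, _ in known.values()])
mean, std = mat.mean(axis=0), mat.std(axis=0)

def dist(a, b):
    """Euclidean distance in normalized descriptor space."""
    return np.linalg.norm((a - b) / std)

# Read-across: average the LD50 of the k most similar known substances.
k = 2
nearest = sorted(known.items(), key=lambda kv: dist(kv[1][0], query))[:k]
estimate = sum(ld for _, (_, ld) in nearest) / k
print([name for name, _ in nearest], estimate)  # ['cmpd_B', 'cmpd_A'] 425.0
```

Here the query resembles the two non-halogenated compounds, so it inherits their averaged LD50; a real QSAR model would instead fit a regression over many descriptors and report an applicability domain.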
FlavourArt supports research into alternative investigation methods,
contributing to overcoming the use of animal models in the toxicological field and to developing new science-based technologies.
TRUSTiCERT adopts the new paradigms of modern toxicology,
adhering to several programs recognized by universities and regulatory authorities in Europe and the United States.