Nynke Kramer, toxicologist at the Toxicology Department of Wageningen University and Research, on computer models for toxicological research
Computer models for toxicological research into chemicals
Dizziness, itching, emaciation and even unconsciousness: chemical exposure symptoms can be diverse, and the damage caused to the brain and organs can be serious. How can we accurately predict each new chemical’s critical and safe dose for humans? Toxicologist Nynke Kramer, together with her colleagues at the Toxicology Department of Wageningen University and Research, has developed promising computer models that simulate the movement of chemicals through the body. ‘The challenge is not only to predict the effect of a substance on the body, but also the opposite: what the body does with that substance.’
Before studying environmental science in England in 2004, Nynke had already been interested for some time in how a chemical travels through an organism after exposure. ‘For my graduation research there, I investigated how trees absorb toxins from contaminated soil’, she recalled. ‘At that point I was hopeful that plants and trees would be able to store all these substances for a long time, as this would avoid environmental damage. Unfortunately, in practice this wasn’t so straightforward, as the trees store such substances in their leaves, which of course fall back to the ground. My fascination for the subject has only increased since then.’
Determine critical dose
Back in the Netherlands, Nynke decided to use the knowledge she had gained at Utrecht University to experiment on living tissue (sometimes human, sometimes animal) in petri dishes (in vitro). ‘This enables you to determine very specifically, at the microscopic level, how a particular substance affects our organs and from what dose the substance is toxic. This dose determines the permitted level of such substances in our everyday products, for example in medicines. Sometimes less than a millionth of a gram is already hazardous; sometimes it is more. But every substance is toxic from a certain level of exposure. Did you know that even water is toxic for humans from a certain dose?’
Toxicological in vitro research instead of animal testing
Nynke explained that in vitro studies have only recently become a recognised way of determining substance toxicity. ‘When public authorities introduced stricter toxicological testing requirements following several chemical scandals in the 1950s and 1960s, involving medicines and the pesticide DDT, animal testing was the first method to emerge for toxicity testing. Although there was little proper validation that these tests were predictive of human responses, researchers continued to use them as there was no alternative. Movements opposing animal testing grew in the 1990s, and in vitro toxicity testing emerged partly under pressure from this.’
From culture to ‘human’ predictions
However, according to Nynke, this method also has its flaws. ‘First of all, it became apparent that this method simply can’t measure the toxicity of some substances: you sometimes observe no effect on a tissue sample while we’re certain that the substance in question is toxic to humans (a false negative). On the other hand, you sometimes observe a reaction to a substance that is in itself harmless (a false positive). There’s also the issue of how to translate (‘extrapolate’ in scientific language) a certain amount of chemical in a culture into a safe dose for humans. This proves difficult. More generally, in vitro testing results also don’t always match those of the animal tests we want to replace.’
The real culprits: metabolites
Nynke explained how incorrect results from in vitro testing usually occur. ‘The symptoms and damage caused by chemicals in humans are rarely due to the chemicals themselves but are usually caused by metabolites: the metabolic products into which the liver converts a substance. For example, with methanol, also known as wood alcohol, formic acid is the toxic metabolic product that affects organs and results in headaches and dizziness. In an important group of pesticides, so-called oxons cause neurotoxic effects in the brain. This means you can’t determine a substance’s toxicity in humans by testing that substance alone. You also need to focus on the extent to which the body absorbs, distributes, metabolises and excretes a chemical (so-called pharmacokinetic processes). In vitro studies don’t take those pharmacokinetic processes into account. These may, for example, demonstrate that while a substance is very toxic to brain cells, in practice its toxicity in animals or humans is low because the intestines absorb the substance poorly and the body excretes it quickly.’
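The interplay of absorption, metabolism and excretion described above can be sketched as a toy kinetic model. The Python sketch below is purely illustrative and is not one of the department's models; all rate constants are invented example values. It shows why the peak level of a toxic metabolite depends as much on absorption and excretion rates as on the parent chemical itself.

```python
# Purely illustrative toy model: a parent chemical is absorbed from the
# gut, the liver converts part of it into a (toxic) metabolite, and both
# are excreted. All rate constants are made-up example values.

def simulate(dose=1.0, k_abs=0.5, k_met=0.3, k_exc=0.4, dt=0.01, t_end=24.0):
    """Forward-Euler integration; returns the peak metabolite amount."""
    gut, parent, metabolite = dose, 0.0, 0.0
    peak_metabolite = 0.0
    t = 0.0
    while t < t_end:
        absorbed = k_abs * gut * dt           # gut -> bloodstream
        metabolised = k_met * parent * dt     # liver: parent -> metabolite
        excreted_p = k_exc * parent * dt      # clearance of parent
        excreted_m = k_exc * metabolite * dt  # clearance of metabolite
        gut -= absorbed
        parent += absorbed - metabolised - excreted_p
        metabolite += metabolised - excreted_m
        peak_metabolite = max(peak_metabolite, metabolite)
        t += dt
    return peak_metabolite

print(simulate())
```

With these toy rates, only part of the dose ever appears as the metabolite, and slowing absorption or speeding up excretion lowers the peak, mirroring the point that a cell-level poison can be weakly toxic in a whole organism.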
Molecular structures as ‘predictors’
While searching for a method that does provide a good prediction of how the human body handles toxic substances, so that symptoms and damage can be explained, Nynke and her colleagues turned to computer models, specifically physiologically based pharmacokinetic models (PBPK models). ‘Our models use computing power, and in part AI, to assess a substance’s molecular structure and then predict, for instance, how quickly the substance will be absorbed in the intestines and the extent to which it will enter the organs from the bloodstream. Basically, our models predict how much of a given substance the human body can handle before toxicity problems occur.’
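A common use of such pharmacokinetic predictions is "reverse dosimetry": translating a concentration that causes an effect in vitro back into an external dose. The sketch below is a deliberately simplified illustration of that arithmetic under a one-compartment, steady-state assumption (steady-state concentration = daily dose × oral bioavailability ÷ clearance, a standard textbook relation); the function name and all numbers are invented examples, not results from the article.

```python
# Illustrative reverse dosimetry under a one-compartment, steady-state
# assumption: C_ss = (daily dose * F) / CL, rearranged to give the dose
# that would reproduce an in vitro effect concentration in blood.

def equivalent_oral_dose(c_tox_mg_per_l, clearance_l_per_day, bioavailability):
    """Daily oral dose (mg/day) predicted to produce the given
    in vitro effect concentration at steady state."""
    return c_tox_mg_per_l * clearance_l_per_day / bioavailability

# e.g. an in vitro effect at 0.2 mg/L, clearance 50 L/day, 80% absorbed:
dose = equivalent_oral_dose(0.2, 50.0, 0.8)
print(round(dose, 1))  # 12.5 (mg/day)
```

Real PBPK models replace this single compartment with organ-level compartments and substance-specific parameters, but the direction of the calculation is the same: from internal concentration back to external dose.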
Application in everyday practice: reliable and recognised methods
Nynke explained that the models are already in use in various ways and places worldwide. ‘In the US, the Food and Drug Administration (FDA) accepts their use in drug interaction studies, while the European Food Safety Authority (EFSA) – the body that determines whether a substance in our food poses a health risk – now wants to include the models’ data as additional evidence in risk assessment dossiers. The OECD has also issued guidelines on building the models to promote their use in toxicology. This will enable other researchers to develop such models, in theory for every conceivable toxin and animal species.’
Pioneering innovations
‘Feel free to call it old-fashioned academic nerdiness, but I always get a thrill when a new model works’, Nynke responded wittily when asked what she likes best about her innovation. ‘Our section was the first worldwide to predict almost perfectly the toxic dose of certain substances on embryonic development. We still hold that distinction today’, Nynke continued proudly. ‘For my current professor, who was there at the time and is retiring in September this year, the legacy could become even greater. I’m feeling really positive about this. This innovation has everything it takes to be truly pioneering in the long term.’
Safety first: preventing health issues and damage to the environment
For Nynke, the fact that computer modelling means fewer laboratory animals are likely to be needed for toxicological studies in the long term is not her primary motivation but is a big plus. According to her, animal testing still has its uses, but she finds the unnecessary use of animals for research to be unethical. ‘What I’m particularly hoping for with this innovation is that we’ll eventually be able to assess chemicals – for whatever purpose – for toxicity purely by their molecular structure. This will allow you to conclude in time that you simply shouldn’t want to use certain substances, enabling you to prevent health problems.’
More information
Interview: Bard Borger
Photo: WUR