Of course no one thought that it was quite this simple when it came to the ‘higher’ human functions such as behaviour, character formation, and intelligence. It was clear that learning had to play a major role, and this really was a developmental process. The way we lived our lives and the environment were acknowledged to play a part, although how much was a matter of opinion. This became structured—fossilized might be a better word—around the nature/nurture debate, one we now know to be very artificial. Are the majority of human attributes largely determined by our genes (i.e. nature), that is, by directly heritable processes which could be bred for, in the way that Mendel had been able to breed for characteristics in his peas? Or are we born with a ‘blank slate’, as the 17th-century philosopher John Locke had suggested, upon which early experience and environment (i.e. nurture) could write the story of how we would turn out?
This nature/nurture debate rumbled on, and has had many reincarnations up to the present day, although now most of us realize that it is rather futile. Both processes matter and interact—indeed, as epigenetic biology shows, they are not really separable—and even together they do not fully explain what makes us what we are. But as with Lamarckism and Lysenkoism, the nature/nurture concept has left its traces in cults and abandoned projects, some of which were influential, costly, and damaging. We need only think of the legacy of eugenic movements, compulsory sterilization, and breeding programmes, spurred on in the mid-20th century by the idealized science of genetics; or of the mass IQ testing of young children advocated by the once famous, and now infamous, psychologist Cyril Burt, based on twin-study data on inheritance which were found (again!) to be fraudulent after his death. Then, on the nurture side, think of the many nursery and child-rearing fads based on simplistic ideas about environmental stimulation. Despite all our successes, we have not been very smart as a species when it comes to rearing our children.
Apart from the effects on our brain, developmental influences have traditionally been seen as much less important in making us what we are. It came as a surprise to the world of medical science in 1941 when the Australian ophthalmologist Norman Gregg recognized that cataracts and deafness in children could result from mothers having been infected with rubella (German measles) during pregnancy. The finding conflicted with the prevailing dogma that the fetus was isolated from external influences, and it took some years for the significance of this great discovery to be fully appreciated. If the fetus was not immune to outside influences after all, did this mean that fetal development was not as autonomous as previously thought? But these were gross effects which disrupted the developmental programme and caused birth defects such as holes in the heart and disordered bone and brain development, and so they were not considered important to the health of the majority of us.
Then in 1961 another Australian scientist, William McBride, showed that the anti-sickness drug thalidomide, taken by some women in pregnancy, had terrible effects on limb development in the fetus. This was a critical discovery, although McBride’s hubris later led him to make unsubstantiated, and indeed falsified, claims about another drug, Debendox, and he ended his career in disgrace—a pattern seen in too many great men of science.
But beyond medicine and within the world of botany and zoology the understanding of the role of the environment in development was growing. Many examples were found in plants, insects, and amphibians where early life events influenced development. The honey bee is a good example. A female larva, hatching from its waxy cell in the hive, may develop into a worker or into a queen. Their destinies will be quite different, like the rags or riches of a fairy story. The worker will slave her life away, collecting pollen and never having any offspring of her own. Meanwhile, the queen languishes in the hive, occupying her time by fighting other queens and preparing for her nuptial flight, when the male drones of the hive will pursue her; successful males will mate with her in mid-air as high as a kilometre above the ground, only to die after this acrobatic feat.
The worker and queen bees share the same genes, yet they could not be more different: the worker bee’s mouthparts develop to collect pollen, her metabolism is suited to frequent short flights, and her
reproductive organs are shrunken; the queen’s mouthparts are suited to fighting, her metabolism is designed for a single long flight, and her ovaries are prepared to lay thousands of eggs. Amazingly, these two very different bees can emerge from identical genetic information. The developmental path that a larva will take depends solely on what it eats in the first few days after hatching: fed on protein-rich royal jelly for only a few days and then on sugary nectar, the larva becomes a worker bee, while prolonged feeding on royal jelly alone turns it into a queen. Interestingly, their destiny is not absolutely cast in stone, for if the queen bee dies, some previously sterile female worker bees may start to become reproductively active—their social environment changes their fate. But they never become complete queens; that die is cast soon after the larvae hatch.
We now know a good deal about how two identical sets of genes can be induced to develop into such very different bees as the worker and the queen. If we use agents to block DNA methylation in the larva, it can only develop as a queen. This is the science of epigenetics, which we introduced in Chapter 8. We have seen that epigenetic mechanisms can explain how exposures in early life can have lasting consequences. And we saw that measuring epigenetic changes at birth can reveal a far greater effect of early life on later biology than we would have imagined even three years ago. These data provide the basis for explaining the links that epidemiologists have found between a poor start to life and a much higher risk of developing heart disease or diabetes later in life.
Development can no longer be ignored: it is a crucial part of the story of what makes us what we are.
The new insights from epigenetics are compelling. They force those in adult medicine who have ignored the importance of developmental science to think again. Epigenetic markers measurable at
or soon after birth could well provide a very good record of the conditions which the fetus experienced in prenatal life. They may be able to tell us how well each baby has responded to the challenges which it met during the nine months when it was effectively hidden from view. Was its nutrition adequate for the growth it was attempting? Was it exposed to levels of hormones which changed the pattern of its development? We think that epigenetic marks will be able to pick up evidence of this with far greater accuracy than just simple measures of weight or shape at birth.
Because these marks are present in every baby, across the entire spectrum of normal growth and development, they can tell us much more. Every baby will have received information from its mother about her life and the world in which she lives—how old she is, whether this is her first baby, what she eats, how much physical activity she undertakes, as well as whether she is fat, putting on too much or too little weight during pregnancy, or developing gestational diabetes. She has taught her baby a very great deal about her world already, preparing it as best she can for the environment in which her baby is likely to live. Epigenetics gives us a way of measuring the impact of all these prenatal lessons at birth.
What can we do with this information? The science is at an early stage, and we can only speculate. But potentially it gives us a way of deciding what might be the best diet, lifestyle, exercise level, and so on for each child. Studies are already starting to explore whether we can change mothers’ diets in such a way as to produce a better epigenetic profile at birth or before weaning. Indeed, we might finally have a tool with which to address a question that still lacks a complete answer: what is the best diet to recommend to a pregnant woman or a prospective father?
Can we use epigenetic information to ask whether a baby might be born with a propensity to lay down fat, or to develop high blood pressure or diabetes? If so, the sooner we help that baby to lead a life that will reduce this risk, the better. This information may help us
address the challenge of finding a better strategy to fight the war against obesity and chronic disease. There is enormous variation in individual risk of the diseases associated with increased body fat. Some of us may experience a postnatal environment for which our development has not prepared us—perhaps those of us to whom this applies will be at greatest risk of the diseases associated with obesity. On the other hand, we know that many people who are equally obese nevertheless have a low risk—perhaps because their development prepared them precisely for the world in which they live, allowing them to capitalize on it and to optimize their fat deposition. Epigenetic changes measured early in life provide a way to distinguish between these two extremes, recognizing of course that most of us lie on a spectrum between them.
In a nutshell, epigenetic processes are involved in making us all different, even if we have very similar inherited genes. And because the risk of disease differs for each of us, establishing our individual risk is likely to involve reading our epigenetic profile.
Several questions immediately arise from these discoveries. The first is whether the epigenetic change actually lies on the causal pathway between early life environment and later risk of disease, or whether it is a signpost along this path, something which indicates a particular state of affairs but is not causally involved. The answer to this important question is not known but, from research in animals and from what we know about the control of the genes involved in conditions such as obesity, it appears increasingly probable that the epigenetic changes are indeed on the causal pathway.
If this turns out to be true, then the next question must be: how permanent are these epigenetic changes? Is it possible that they can be reversed, say in early childhood, so reducing risk in an individual? Once again, the data from the animal experiments we described in Chapter 8 suggest that this is indeed the case: administration of some micronutrients such as folic acid, or indeed of the satiety hormone leptin, can reverse the epigenetic changes in young animals and prevent them from developing obesity even if they consume an unbalanced diet as they grow up.
But all this might lead us down a path which we do not wish to follow. Are we really saying that we would propose giving a hormone to young children in order to reduce their risk of later obesity? Well, there are precedents in paediatric endocrinology, such as the administration of synthetic growth hormone to children who would otherwise be extremely stunted because of medical conditions. But even that practice was misused: too many healthy children received growth hormone simply because their parents wanted them to be taller.
We have been at great pains to stress that the relative fatness manifest in many of our young children is an aspect of their normal biological responses, set up in their development. They do not have a disease which we can argue must be treated. Perhaps some generalized intervention might be possible—maybe some nutritional supplement that hypothetically has the same effect as leptin? Perhaps, but we simply do not yet know whether this is possible or safe.
The animal experiments demonstrate that interventions in early life not only reverse the potentially detrimental effects of mismatch or a poor start to life but also reset the epigenetic marks. This gives us another potential use for epigenetic measurements: as an indicator of the effectiveness of interventions and a monitor of whether they are proceeding according to plan. One convenient source of DNA is the cells collected by gently swabbing the inside of the cheek—we have been able to obtain ample DNA for our measurements in this way even from newborn babies. The procedure is non-invasive and painless, and studies using sequential cheek swabs to examine how babies’ epigenetic profiles change according to how they are fed are under way in Singapore.
Studying epigenetics at birth and in infants may be the basis for developing a new wave of healthier foods. As epigenetic marks are sensitive to nutrition, we should over time be able to work out which components of what foods lead to better outcomes at different stages of life. While many claims are made about foods promoting health for pregnant mothers and babies, in general these are not supported by robust evidence. Beyond the critical micronutrients such as folic acid, the science is surprisingly hazy. We should be able to do much better using our new epigenetic toolkit, and it is exciting to see that the food industry is starting to take this science on board.
The story seems compelling, but despite this there are many reasons why the importance of development continues to be underplayed, both in clinical medicine and in public health. Indeed, we feel like the little boy in Hans Christian Andersen’s tale who asks, ‘Why is the emperor wearing no clothes?’ It seems so obvious, yet we still find ourselves asking the question again and again: ‘Why is development being ignored?’ Some of the reasons are practical, some reflect vested interests, and some are due to plain ignorance. We have already discussed the history of the basic science and how development came to be largely left out of the modern genomic revolution. Let us now turn to the realpolitik of medicine, public health, and the private sector.
Much of clinical medicine tends to ignore the specialities of obstetrics and paediatrics. After all, these are largely concerned with the normal processes of having babies and helping children through the inevitable problems of early life, aren’t they? So in many ways these disciplines are seen as somewhat irrelevant to adult medicine and surgery. Most practitioners in adult medicine are understandably focused on either treating the acute situation or ameliorating the chronic problem and they see no need to look backwards into
development. Most problems in adult medicine are dealt with by lifestyle change, medication, or surgery to treat the symptoms or by various forms of palliative care. It is widely recognized in many developed countries that health services are better equipped to deal with acute situations than with prevention and amelioration of chronic conditions.