Sunday, November 11, 2007

Mismatch: Why Our World No Longer Fits Our Bodies

As you may have guessed, it was this book's subtitle that caught my eye. Why Our World No Longer Fits Our Bodies. You can't get more direct than that. Does the book live up to this audacious title? Are today's lifestyles more than our bodies can handle? I'll leave it to the reader to draw their own conclusions, as this question is difficult to answer, but the evidence is compelling enough to suggest that changes need to be made.

The authors, Peter Gluckman and Mark Hanson, do a very thorough job of exploring the premise that the human body is mismatched to the modern urban lifestyle. The evidence is clear, argue the authors, and can be found in the increasing cases of diabetes, heart disease and obesity in the developed world. There are also more subtle consequences of our mismatch, such as those brought about by the falling age of puberty. Their solutions come right at the end of the book, and are a logical follow-on from their supporting evidence, but are perhaps a bit too brief. The aim of these solutions seems to be to raise people's awareness of the issues, rather than present final answers. That seems reasonable, given that the authors note that the application of evolutionary developmental biology to human medicine is a new field of study.

The relationship between our evolved biology and the nature of the environments in which we live is the real focus of this book. Generalist species such as humans have a broad capacity to adapt or cope over a range of environments but may not be so well-equipped to live in a particular environment as a specialist species. But it is important, the authors note, to distinguish between thriving in an environment and surviving or coping in that environment. Trade-offs which can affect our health and reproduction may have to be made once we move away from the centre of our comfort zone.

Living successfully for humans means being well-matched to the environment. The greater the shift from the environment at the centre of the comfort zone, the greater are the changes in physiology and behaviour needed to cope, until at some point significant costs appear.

When humans migrated progressively northwards from their ancestral home in the African savannah, they moved into colder climates where the average daily amount of sunlight fell. Occupying these new habitats in Europe and even further north gave advantages in terms of opportunities to hunt, and later to cultivate some simple crops and domesticate animals. But it also brought new threats. For example, the low exposure to sunlight, especially in the winter months, reduced the production of vitamin D.

Vitamin D is made in the skin by the action of sunlight, which converts a precursor molecule found in our diet into the active form of vitamin D. It is vital for many body processes, notably the deposition of bone during development. People who have chronically low levels of vitamin D during development suffer from rickets, which is associated with skeletal deformity. Older adults who are vitamin D deficient are more likely to suffer brittle bone disease (osteoporosis), in which even minor accidents produce fractures. This had not been a problem in Africa, as sunlight levels were high throughout the year, and our ancestors had evolved to have dark skins, as the melanin protected against the other harmful effects of sunlight.

In moving north we needed to have paler skins, filtering out less of the sun's rays and optimising our production of vitamin D. The cost of this strategy is that there is a higher risk of skin cancer in people with paler skins, triggered even in Europe during the summer when the sunlight exposure can still be high.

Before reading Mismatch, I had a limited understanding of evolutionary biology. So I found the authors' clear and concise introduction to its principles and processes quite useful. All the basics are covered early on, such as genes, variation, selection, adaptation, and inheritance. At the risk of veering off-track, I'll now provide a short overview of evolution for anyone who's interested.

The creation of a new species is the result of an accumulation of changes in the gene pool of an ancestral species. This usually occurs when some members of a given species become separated from the whole. This restriction of the gene pool means that over time new traits gradually become dominant, until the new group no longer resembles the old and interbreeding between the two is no longer possible. This is known as macroevolution, or speciation, and takes thousands of generations. The genetic changes that occur in a population with the passage of each generation are known as microevolution. The accumulation of many microevolutionary changes leads to macroevolution.

Microevolution goes something like this: (1) Two individuals of the same species get together and mate. (2) Two sex cells or gametes (one from each parent) unite and form the first cell of a new individual, known as the zygote. (3) The zygote divides multiple times, producing identical copies of itself. Errors in this copying process result in mutation. (4) Mutation alters the genetic code of the offspring, resulting in the introduction of new traits not found in the parental generation. These traits are termed variations. (5) If a certain variation helps this new individual to better adapt to its environment, it will be more likely to survive and reproduce, thus passing on the variation to its progeny. This last step is the process of natural selection.
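To make steps (3) and (4) concrete, here is a minimal Python sketch of my own (not from the book) in which a short genetic sequence is copied with a small per-site error rate, so that any miscopied position becomes a new variation:

```python
import random

BASES = "ACGT"

def copy_with_errors(genome, error_rate=0.05):
    """Copy a genome string, occasionally miscopying a base (a mutation)."""
    copy = []
    for base in genome:
        if random.random() < error_rate:
            # A copying error substitutes a different, randomly chosen base.
            copy.append(random.choice([b for b in BASES if b != base]))
        else:
            copy.append(base)
    return "".join(copy)

random.seed(1)
parent = "ACGTACGTACGT"
offspring = copy_with_errors(parent)
print(parent)     # ACGTACGTACGT
print(offspring)  # any position that differs is a variation new to this generation
```

Real mutation is of course far more varied (insertions, deletions, duplications), but simple substitution errors are enough to show where variation comes from.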

In short, natural selection acts to select characteristics or traits that confer greater fitness within a given environment. It involves four straightforward principles: (1) there are more members of a species born in each generation than will survive; (2) there is variation in physical and behavioural characteristics among individuals within species; (3) this variation is heritable; and (4) characteristics that result in an individual surviving and reproducing tend to increase in frequency in the population, whereas characteristics of non-survivors decrease.

So natural selection acts on the most advantageous heritable variations. Those individuals whose characteristics best match them to their environment are said to have the most advantageous phenotype, which is defined as the observable characteristics of an organism. The phenotype is determined by both genetic and environmental influences. The genetic basis of phenotypic traits is known as the genotype of an organism. The genotype is the specific set of genes an organism possesses - in other words, its genetic constitution.

For example, my genotype for earlobes describes which version, or allele, of the earlobe gene I have inherited. My father has free earlobes (the dominant allele, which we'll call E), while my mother has attached earlobes (the recessive allele, which we'll call e). Free earlobes hang below the point where they attach to the head, while attached ones do not. My father's genotype for earlobes is Ee, and my mother's is ee. Since I have attached earlobes, my genotype for earlobes is also ee. If I possessed the dominant E allele, the instructions from this would have overpowered those from the recessive e allele and I would have free earlobes.
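Here is a small Punnett-square sketch of my own (the single-gene earlobe model is, of course, a simplification) that enumerates the equally likely children of an Ee father and an ee mother:

```python
from itertools import product

def punnett(father, mother):
    """Enumerate the equally likely offspring genotypes of two parents."""
    # Each parent passes exactly one of their two alleles to each child.
    return ["".join(pair) for pair in product(father, mother)]

def phenotype(genotype):
    """E (free earlobes) is dominant; e (attached) is recessive."""
    return "free" if "E" in genotype else "attached"

for genotype in punnett("Ee", "ee"):
    print(genotype, "->", phenotype(genotype))
# Ee -> free, Ee -> free, ee -> attached, ee -> attached:
# a child of these parents has a 50% chance of attached earlobes.
```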

As a rule, any trait that reduces an individual's struggles in life could be considered beneficial, and anything that brings struggle into an individual's life could be considered a threat, predators being the classic example. Those individuals that have the necessary traits to overcome such threats will survive long enough to reproduce. As an example, consider two different varieties of ladybird living in a lush green forest. One is coloured green, the other orange. Green ladybirds blend into their environment, so are much safer from hungry birds than the orange ones, which stand out like a sore thumb. Thus the green ladybirds will survive and increase their numbers, while the orange ones will slowly disappear from the species.

Without these two variations in ladybird colour, there would be no selection. The green colouring trait happens to be beneficial and over time becomes more common in successive generations of the population. The proliferation of green ladybirds means that as a whole, the species has become better adapted to its specific ecological niche, and has thus evolved.
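The ladybird story translates neatly into a toy simulation. This sketch is my own, with invented survival rates; it shows the green variant spreading through the population generation by generation:

```python
import random

SURVIVAL = {"green": 0.9, "orange": 0.4}  # invented predation-survival odds

def next_generation(population, size=1000):
    """One round of predation, then repopulation from the survivors."""
    survivors = [b for b in population if random.random() < SURVIVAL[b]]
    # Offspring inherit their parent's colour: the variation is heritable.
    return [random.choice(survivors) for _ in range(size)]

random.seed(42)
population = ["green"] * 500 + ["orange"] * 500
for generation in range(1, 11):
    population = next_generation(population)
    share = population.count("green") / len(population)
    print(f"generation {generation}: {share:.0%} green")
# Green rapidly comes to dominate; orange drifts towards extinction.
```

Note that the toy model needs all four of the principles listed earlier: more offspring are produced than survive predation, the colours vary, the colour is inherited, and survival differs between the variants.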

The genotype provides the crude settings for the organism's development of its mature phenotype. But the environment also plays a large part in an organism's development. Epigenetic changes (changes in gene expression that occur without altering the underlying DNA sequence) during development drive the phenotype towards a better match with its environment. The authors show that there are circumstances in which the resulting phenotype can turn out to be inappropriate, and these are mainly a consequence of modern environmental influences.

After all, humans evolved to inhabit a world very different from that which we have constructed for ourselves. It is well-known that Homo sapiens originated in the grasslands of Africa some 160,000 years ago. We lived as hunter-gatherers, relying on food from two major sources - (1) the collection of seeds, tubers, nuts, and fruits, and (2) hunting. There have been various estimates of the content of this Palaeolithic diet and it is clear that it differed significantly in a number of respects from the modern diet.

The authors explain: "The Palaeolithic diet was higher in fibre content and had a much lower glycaemic index (ability to raise blood sugar rapidly) because the foods were less refined. Wild honey would be the only source of concentrated sugars and it would have been a minor component of the diet. The diet had a very different mix of fatty acids, a higher protein content, and a much lower salt but higher potassium content. There was no milk, butter or cheese, and the meat was generally much leaner than today. Hunter-gatherers dispersed in accordance with available food supplies. They could choose their environments, within limits. There is no fossil evidence to suggest that they suffered chronic malnutrition - quite the reverse, because the data available from skeletons suggest that they achieved modern, or close to modern, heights."

Our Palaeolithic ancestors may have had a good diet, but nothing lasts forever, and after the end of the last Ice Age the changes in climate and vegetation made hunting and foraging more difficult. A significant turning point in the history of our species was about to take place - the development of agriculture from about 11,000 years ago. Agriculture first appeared in the Fertile Crescent, extending from the Levant to the Tigris and Euphrates (through modern-day Syria and Iraq), and with it came a progressive change in diet. Herding allowed the collection of milk as a food source, and there was access to fatter meat on a more consistent basis.

This transition to agriculturalist and settlement-dweller eventually led to the development of urbanisation, complex power hierarchies, and the growth of cities, states, and empires. Some people now lived in contact with much wider networks of others than they would have done had they maintained the hunter-gatherer or pastoralist way of life. Prior to the development of settlement, humans lived in social groups of fewer than 150 people, perhaps as small as 20 to 50. These would have been extended family groups, and this has been an important component of our species' success.

With settlement, much larger numbers of people came into direct contact with each other. Those who lived in cities came into contact with many hundreds of people. From living in a small clan where individual roles and relationships were clearly evident and the power hierarchy simple, humans came to occupy complex networked social structures where roles were subdivided and separated and intricate power and control hierarchies emerged.

Clearly our social structures are far more complex today than they were when we evolved on the African savannah. This evolutionary discrepancy has been further magnified by a phenomenon the authors term "maturational mismatch". They observe that throughout most of our history, there was synchronous maturation of our bodies and our brains at puberty. But over the last hundred years they have diverged; while psychosocial requirements have become more demanding and full maturation appears to have been delayed, physical maturation is getting earlier.

Two centuries or more ago, many teenagers could, and did, take on roles as mature adults. Junior officers in the Royal Navy during the Napoleonic Wars were in their early teens. Has today's society become so much more complex that adolescents need to know so much more to become adults?

Gluckman and Hanson surmise that external influences like the media and the loss of tight societal pressures may have reinforced exploratory behaviour in adolescents, thereby delaying the development of attributes such as responsibility and self-control. The way adolescents are treated by society may also explain their delayed psychosocial maturity - as a society we confuse physical maturation with psychosocial maturation, so we have a tendency to assume that a person who is biologically mature is a full adult. These mistaken assumptions, the authors suggest, contribute to rebellious adolescent behaviour.

This goes some way towards explaining maturational mismatch in relation to brain development. But it doesn't explain why the age of puberty is falling. The answer to this puzzle lies with diet and nutrition. The development of agriculture and settlement led to population growth, which brought humans into greater proximity with each other - static settlements dependent on agriculture allowed more people to live in one place. This led to fundamental changes in our nutrition. While hunter-gatherers had multiple ways to obtain food, populations dependent on a fixed location for their herds and crops inevitably became more at the mercy of climate and war. Malnutrition affected children first, reducing their growth. When childhood nutrition is poor, puberty is delayed, and so with changing patterns of settlement came a delay in puberty.

This allowed physical maturation to remain synchronised with the concurrent delay in psychosocial maturation, as by this time social structures had become more complex and the skills needed to thrive in society took longer to learn. In today's society, complexity has increased still further, but this time puberty has not kept up. In fact, it has moved in the opposite direction. Gluckman and Hanson suggest that the reason for this is the health and wealth of modern life. The age of puberty is returning to where it was during our nutritionally-balanced Stone Age days, when life expectancy was short (around 35 years) and puberty in the female needed to begin between the ages of 7 and 8 to allow her a reproductive span of 16-18 years - reproducing from roughly age 8 to her mid-twenties left the remainder of a 35-year life to support the youngest of her progeny to a fully independent existence.

The effect of nutrition on puberty differs before and after birth. While poor childhood nutrition delays puberty, poor fetal growth may lead to earlier puberty. Poor fetal growth due to deficient nutrient supplies is actually an adaptive response by the fetus. When the supply of nutrients from mother to fetus is poor, the fetus must reduce its growth rate just to survive. If the nutrient levels are too low, babies can be born prematurely: their biological processes sense that the environment within the womb is so threatening that it is wise to get out early in order to maximise the chances of survival. Accelerated maturation and premature birth mean that the individual is less likely to live a long life, so sexual maturation is also accelerated to ensure gene transmission to the next generation.

When a fetus develops in a constrained environment and then lives in a richer one after birth, there are several outcomes. Advanced sexual maturation is one - this is most dramatically observed in children who were born in very poor societies but then adopted and brought up in the richest countries. Their rapid switch from poor early nutrition to good childhood nutrition is associated with much earlier puberty - with some girls having their first period as early as 6 to 8 years of age.

A further outcome is a greater likelihood of developing obesity. Maternal constraint is a natural process in the mother that places an upper limit on the nutrition reaching the fetus, and hence on the richness of the environment the fetus can sense. It may have given our species an adaptive edge during our evolution, because the predictive responses always made us expect to live in a slightly harsher environment than existed during our gestation. Thus as a species we are pre-adapted to expect worse than we may experience, and this would have given us an inbuilt safety margin. In fact, this is why the human neonate is born with a layer of high-energy fat reserves. If nutrition is compromised, these fat reserves provide an energy buffer so that brain growth can continue.

But as our nutritional environments have got richer, the discordance between predictions created by these constraint mechanisms and reality has become greater. Rather than being evolutionarily advantageous, these prenatal forecasts have now become disadvantageous, and are the major cause of what the authors call a "metabolic mismatch". The consequences of metabolic mismatch become manifest as heart disease and diabetes. The preference for high-calorie and high-fat foods that developed during gestation (in order to build the layer of fat) will lead to weight gain and eventually obesity in a rich environment.

This problem is particularly important among sections of the population with inadequate education or in lower socio-economic groups, in both the developed and developing world. Poor education and the inability, or lack of opportunity, to act can worsen their situation: the range of foods they can afford often has a higher fat and carbohydrate content, which compounds their degree of mismatch. The poor are more at risk, and poverty is a major contributor to chronic disease.

Further research into maternal constraint is very important, the authors argue, as it is clear that the increasing incidence of heart disease and diabetes has its origin in the mismatch that arises from the interplay between developmental plasticity and the postnatal environment. One might think that evolutionary processes would have worked to exclude such nasty fates for our species. They have not, because for the most part these issues do not interfere with our reproductive fitness. Until recently, these diseases appeared only in middle age, well after reproduction was complete. Evolution cannot select against traits that appear after reproduction has ceased.

Advances in medicine have allowed humans to live longer than ever before. Right up until the last century, survival to middle age and beyond was rare. We have always known the certainty of death, but it was not within our power to delay it; we instead used religion to deny it, through concepts of an afterlife. These days, we no longer tolerate the evolutionary imperative of decline and demise after reproduction. But there is a cost to living longer.

One cost has been the rapid rise in the occurrence of diseases of degeneration. There is a theory (the disposable soma model developed by Tom Kirkwood) stating that there is a trade-off between lifetime investments in growth, reproduction, and repair. According to this theory, individuals that anticipate a short life invest less in repair and more in early reproduction.
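As a rough illustration of the disposable soma idea, here is a toy model of my own - the numbers and linear scalings are invented, and this is not Kirkwood's actual mathematics - in which a fixed energy budget is split between repair and early reproduction:

```python
def life_history(repair_share, budget=100.0):
    """Split a fixed energy budget between somatic repair and early reproduction.

    Toy assumptions: lifespan grows with investment in repair, while
    early reproductive output grows with whatever energy is left over.
    """
    repair = budget * repair_share
    reproduction = budget - repair
    lifespan = 20 + 0.6 * repair           # invented, illustrative scaling
    early_offspring = 0.1 * reproduction   # invented, illustrative scaling
    return lifespan, early_offspring

for share in (0.2, 0.5, 0.8):
    lifespan, offspring = life_history(share)
    print(f"repair share {share:.0%}: lifespan ~{lifespan:.0f} years, "
          f"early offspring ~{offspring:.1f}")
# An organism anticipating a short life does best shifting the budget
# towards early reproduction; one anticipating a long life gains more
# from repair - the trade-off the following paragraphs describe.
```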

It then follows that women who are able to reproduce later in life are also likely to live longer. A study in Boston found that women who were able to conceive children naturally after the age of 40 had a four times greater chance of living to 100. There are several other studies showing the relationship between a late menopause and longevity and others showing the reverse, namely that early menopause is a marker for a shorter lifespan.

Experiments on mice have shown that longevity has genetic determinants: the genes involved are those associated with growth and metabolism. If a mouse predicts a threatening environment during its prenatal development, it will invest less in growth, metabolism, repair, and longevity, and try to hasten its reproduction. Conversely, if it predicts a benign environment, it will invest in greater longevity. Thus in mice, prenatal undernutrition leads to reduced longevity, whereas postnatal undernutrition leads to a marked prolongation of the lifespan.

There is evidence to show that these developmental trade-offs between components of the life-course strategy also exist in humans. So what can be done? The authors suggest that the route to reducing the impact of our many mismatches lies in technological, environmental, and cultural development.

One example of technological development involves modifying the epigenetic component of our evolutionary inheritance. An experiment is described in which undernourished newborn rats can be tricked into thinking that they are fatter than they really are by giving them injections of a hormone that is normally made by fat. These injections stopped the development of obesity even when the rats were fed a high-fat diet.

Other useful recommendations are given, such as social intervention programmes and further research into the causes and consequences of mismatch. The most wide-ranging and perhaps radical recommendation involves optimising the diet and body composition of all women of reproductive age.

Any reader with an interest in biologically-based approaches to human health would find this book appealing, as would science buffs who enjoy interesting facts and trivia. This is a very informative and well-written book, and health policy-makers should be made aware of the recommendations it proposes.