In On the Origin of Species(Reference Darwin1), Darwin recognised two forces of evolution, namely natural selection and the conditions of existence, and considered the latter the more powerful(Reference Crawford, Cunnane and Stewart2). Important steps in evolution were the origin of eukaryotic life approximately 1·6–2·7 billion years ago(Reference Knoll, Javaux and Hewitt3, Reference Brocks, Logan and Buick4) and the appearance of photosynthetic cyanobacteria that began to oxygenate the atmosphere about 2400 million years ago (Mya)(Reference Buick5). However, there was relatively little alteration in the design of life forms before the Cambrian explosion about 600 Mya. Only when the oxygen tension in the atmosphere rose above the Pasteur point did aerobic metabolism become thermodynamically possible(Reference Holland6), resulting in an explosion from simple prokaryotes to a diversity of eukaryotic life forms(Reference Crawford and Marsh7).
During the past millions of years of evolution, with relatively little alteration in life forms and environmental circumstances, the human genome became optimally adapted to its local environment(Reference Eaton and Konner8–Reference Eaton and Cordain11). In other words, our genome may have reached a state of homeostasis, defined as the ‘optimal interaction between environment and genome’ or ‘nature in balance with nurture’, to support optimal survival for reproductive success. The aetiologies of many typically Western diseases, also known as diseases of affluence or civilisation, have been attributed to the disturbance of this delicate balance by the rapid changes in the conditions of existence, while our genome has remained basically unchanged since the beginning of the Palaeolithic era. These changed conditions include physical activity, stress, sleep duration, environmental pollution and others(Reference Egger and Dixon12, Reference Egger and Dixon13), but one of the most rapidly changing conditions of existence has been the human diet.
Since the onset of the Agricultural Revolution, some 10 thousand years ago (Kya), and notably in the last 200 years following the start of the Industrial Revolution, humans have markedly changed their dietary habits. Consequently, it has been advocated that the current pandemic of diseases of civilisation results in part from the mismatch between the current diet and our Palaeolithic genome. In other words, ‘we are what we eat, but we should be what we ate’(Reference Wood and Brooks14, Reference Muskiet15). The ensuing poorly adapted phenotype may find its origin as early as the fetal period(Reference Barker16, Reference Godfrey and Barker17) and possibly as far back as the maternal grandmother's womb(Reference Drake and Walker18). This phenotype might be laid down in inherently labile epigenetic marks that serve the short- and intermediate-term adaptation of a phenotype to the conditions of existence. When they confer clear evolutionary advantages, such marks may be transmitted to the next generations as a memory of the environmental conditions that can be expected after birth(Reference Gluckman, Hanson and Morton19). They thereby give rise to a seemingly high contribution of genetics in some of the associated ‘typically Western’ degenerative diseases, which are in fact complex diseases that by definition are not inherited according to Mendelian laws, illustrating that epigenetic marks can also become erased.
From a pathophysiological point of view, the poorly adapted phenotype in Western countries, ensuing from the conflict between the changing lifestyle and our Palaeolithic genome, centres on chronic low-grade inflammation and the metabolic syndrome (also named the insulin resistance syndrome), which are risk factors for many of the diseases and conditions typical of affluent countries, such as CVD, type 2 diabetes mellitus, osteoporosis, certain types of cancer (notably colon, breast, prostate), fertility problems (polycystic ovary syndrome), pregnancy complications (gestational diabetes, pre-eclampsia), some psychiatric diseases (major and postpartum depression, schizophrenia, autism) and neurodegenerative diseases (Alzheimer's disease, Parkinson's disease)(Reference Cordain, Eades and Eades20–Reference Pasinetti and Eberstein22). The genetically determined flexibility to adapt to a changing environment appears to have been exceeded; the genetically most vulnerable have become sick first, but ultimately all individuals will become sick with increasing dose and exposure time.
Environment, nutrients and their interaction with the genome
Adjustment of the DNA base sequence is a slow process that in an individual cannot support adaptation to environmental changes occurring at intermediate or rapid pace. Flexibility for rapid adaptation is provided by genetically encoded mechanisms that allow adjustment of phenotype by epigenetics and by the interaction of the environment with sensors, such as those of the sensory organs, but also by the many sensors that remain unnoticed(Reference Muskiet and Kemperman23–Reference Muskiet, Kuipers, Cunnane and Stewart25). The role of nutrients in (epi)genetics and their direct interaction with the genome have become increasingly acknowledged(Reference Feige, Gelman and Michalik26). Examples of such nutrients are iodine, Se, vitamins A and D, and n-3 fatty acids, which are direct or indirect ligands of the thyroid hormone receptor (TR), retinoid X receptor (RXR), retinoic acid receptor (RAR), vitamin D receptor (VDR) and PPAR. Homodimerisation and heterodimerisation of these receptors facilitate gene transcription and thereby keep our phenotype optimally adapted to the prevailing conditions of existence. The roles of these nutrients, their respective receptors and the interaction between their receptors are indicative of the importance of their dietary presence and of a certain balance between their dietary intakes to arrive at optimal interaction with the genome. Lessons for this optimal interaction, and hence for the development of randomised controlled trials aiming at the study of diets or lifestyles, rather than single nutrients, might derive from knowledge on human evolution and the conditions of existence to which our ancestors have been exposed. These lessons might provide us with valuable information on what we should genuinely define as a ‘healthy diet’.
The concept that a thorough understanding of evolution is important in the prevention and treatment of (human) diseases has long been recognised. For example, in the early 1960s it was stated that ‘nothing in biology makes sense except in the light of evolution’(Reference Dobzhansky27), while in ethology, a distinction was made between proximate and ultimate (also named evolutionary) causes(Reference Tinbergen28). Proximate explanations provide a direct mechanism for certain behaviour in an individual organism. They explain how biomolecules induce certain behaviour or, for example, an allergic reaction. Proximate explanations, however, provide insufficient information to answer the question why this behaviour or this allergic reaction occurred. Ultimate explanations provide answers explaining why things happen from an evolutionary point of view. Many, if not all, diseases can be explained by both proximate and ultimate explanations. The science searching for the latter explanations has become known as ‘evolutionary medicine’. Unfortunately, modern medicine deals mostly with proximate explanations(Reference Harris and Malyango29, Reference Purushotham and Sullivan30), while ultimate explanations seem more prudent targets for long-term disease prevention(Reference Harris and Malyango29).
The term ‘evolutionary medicine’ (also named Darwinian medicine) was launched by Randolph M. Nesse and George C. Williams(Reference Williams and Nesse31, Reference Nesse and Williams32). They provided evolutionary answers for the understanding of human diseases. Many diseases do not result from a single biological, anatomical or physiological abnormality, but rather from a complex web of interactions. They often reflect the collateral damage of the survival and reproduction strategies of our genes and the genes of other organisms in our environment. The resulting disease manifestations include the outcomes of human defence mechanisms to clear foreign pathogens and the collateral damage of conflicts and trade-offs between humans and foreign invaders. Examples often overlooked are coincidence, in which diseases may result from imperfections of human evolution, and exaptation, in which a feature is not acquired in the context of any function to which it might eventually be put(Reference Gould and Vrba33). For example, the equilibrium between the not yet fully grown, but already relatively large, brain of a newborn and the small birth canal, which in its turn is constrained by an upright posture, provides an example of a trade-off in human evolution. The location of the birth canal provides an example of an evolutionary coincidence that forces us to deal with an, in retrospect, imperfect evolutionary design. These examples illustrate that evolution builds on the past: it is not possible to start a completely new design from scratch, which argues against ‘intelligent design’. The most important example of an evolutionary explanation for human disease, however, comes from the mismatch between our slowly adapting genome and the rapidly changing environment, notably our diet.
Evolutionary medicine argues that the chronic degenerative diseases causing most morbidity and mortality in affluent countries occur because of the current mismatch between the rapidly changing conditions of existence and our Palaeolithic genome(Reference Eaton, Eaton and Sinclair34). These mismatches will persist, notably in the light of our long generation time. The genetic adjustments needed to adapt to the new environment are also unlikely to occur, since the mismatch exerts little selection pressure. That is, the resulting diseases do not cause death before reproductive age, but rather reduce the number of years in health at the end of the life cycle(Reference Eaton, Cordain and Lindeberg35). Consequently, evolutionary medicine advocates a return to the lifestyle before the onset of the Agricultural Revolution as translated to the culture of the 21st century and as popularised by the expression: ‘how to become a 21st century hunter–gatherer’(Reference O'Keefe and Cordain36). Sceptics of evolutionary medicine often raise the intuitive criticism that the human ancestor had a very short life expectancy compared with contemporary individuals(Reference Eaton, Cordain and Lindeberg35). Consequently, they argue, there was no selection pressure on longevity or ‘healthy ageing’, since there were virtually no old people, while the few individuals reaching old (for example, postmenopausal) age provided no evolutionary benefit to younger individuals who were still able to reproduce. The counterargument is multilevelled.
Arguments and counterarguments in evolutionary health promotion
It needs to be emphasised that evolutionary medicine predicts no further increase in life expectancy, but rather a decrease in the number of years spent in deteriorating health at the end of the life cycle. It has been estimated that the complete elimination of nine leading risk factors in chronic degenerative diseases would increase life expectancy at birth by only 4 years, since these diseases only affect late-life mortality(Reference Hahn, Teutsch and Rothenberg37). Second, the increased life expectancy at present originates mostly from the greatly diminished influence of some unfavourable conditions of existence, including (childhood) infections, famine, homicide and tribal wars(Reference Eaton, Eaton and Sinclair34, Reference Hill, Hurtado and Walker38), secondary to the high level of the medical sciences and continuing civilisation. Thus, to achieve the average life expectancy of 40 years in a present-day hunter–gatherer society, for every child that does not survive beyond 1 year of age, another should reach the age of 80 years. In fact, about 20 % of modern hunter–gatherers reach at least the age of 60 years(Reference Howell39–Reference Marlowe41). In other words, the popular argument that very few individuals in these societies live past 50 years(Reference Eaton, Cordain and Lindeberg35) is unsupported by ethnographic data. The third, often raised, argument is that due to the higher life expectancy in present-day humans, it is invalid to compare the mortality figures for cancer and degenerative disease of present-day hunter–gatherers (with low life expectancies) with those of Western populations (with a life expectancy of 80 years).
However, early biomarkers of degenerative diseases such as obesity, high blood pressure, atherosclerosis and insulin resistance are also less common in younger, age-matched, members of present-day hunter–gatherer societies compared with members of affluent societies(Reference Eaton, Konner and Shostak9, Reference Eaton, Eaton and Stearns42), while measurements indicative of ‘good health’ such as muscular strength and aerobic power are more favourable in the former(Reference Shephard and Roy43). Moreover, even the oldest individuals in hunter–gatherer societies appear virtually free from chronic degenerative diseases(Reference Lindeberg and Lundh44–Reference Trowell and Burkitt46). A fourth counterargument against the assumption that our human ancestors before the Agricultural Revolution died at a young age derives from archaeological records. After the transition from hunting and gathering to farming about 10 Kya, life expectancy dropped from about 40 years (as it is in recently studied hunter–gatherers, but also was among students of the Harvard College Class born in 1880(Reference Blacklow47)) to about 20 years(Reference Angel, Cohen and Armelagos48–Reference Larsen50). This seemingly evolutionary disadvantage, secondary to a decrease in nutritional quality, is substantiated by a decrease in general health that has become noticeable from a decrease in final height, while skeletal markers of infection and nutritional stress became more common in archaeological finds(Reference Larsen49–Reference Cohen, Cohen and Armelagos52). These setbacks were offset by a net increase in population growth, secondary to an increased productivity per land area that resulted in more energy intake per capita. Life expectancy remained stable throughout the Neolithic until the late 18th century, seldom exceeding 25 years in ‘civilised’ nations(Reference Eaton, Cordain and Lindeberg35).
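The averaging argument made above (for every infant death, another individual reaching 80 years yields a mean of roughly 40 years) can be made explicit. The following is a minimal sketch; the two-point age-at-death distribution is a deliberate illustration, not ethnographic data:

```python
# Minimal sketch: a mean life expectancy of ~40 years is compatible with
# high infant mortality plus a substantial group of long-lived adults.
# The two-point distribution below is an illustrative assumption, not data.

def life_expectancy(ages_at_death):
    """Mean age at death across a cohort."""
    return sum(ages_at_death) / len(ages_at_death)

# For every child dying in its first year, another member reaches 80:
cohort = [1, 80]
e0 = life_expectancy(cohort)  # (1 + 80) / 2 = 40.5, i.e. roughly 40 years
```

The point of the sketch is that a low mean age at death says little about how many individuals actually reach old age, which is why the ethnographic finding that about 20 % of modern hunter–gatherers reach at least 60 years is not in conflict with a life expectancy of 40 years.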
From this time, improvements in hygiene, food production and manufacturing, energy generation, per capita income, shelter, transportation, clothing and energy intakes drove life expectancy to, and beyond, the level that prevailed before the onset of the Agricultural Revolution. Greater energy availability helped meet, for example, the energy requirements of the immune system and of reproduction, both improving longevity(Reference Eaton, Cordain and Lindeberg35, Reference McKeown, Brown and Record53). Importantly, it was concluded that medical treatments had little impact on mortality reduction, while public health achievements (sanitation, food and water hygiene, quarantine and immunisations) have critically improved life expectancy. The fifth counterargument is that old people do provide an evolutionary benefit to the younger generations. Male fertility remains largely intact and male provisioning might help in the problem of high female reproductive costs, although the latter is contested(Reference Blurton-Jones, Marlowe, Hawkes, Cronk, Chagnon and Irons54, Reference Hawkes, O'Connell and Blurton-Jones55). The benefits of older females have been put forward in the grandmother hypothesis. This hypothesis, in which the presence of older females within a certain group benefits the reproductive success of their offspring, is supported by studies in human hunter–gatherer(Reference Hawkes, O'Connell and Jones56–Reference Kachel, Premo and Hublin62) and primate societies(Reference Hawkes, O'Connell and Jones56, Reference Hawkes60, Reference Strier, Chaves and Mendes63). Interestingly, the fitness benefits of grandmothering proved insufficient to fully explain the evolution of increased longevity(Reference Kachel, Premo and Hublin62), suggesting that other evolutionary benefits, such as grandfathering, might also be involved in the long reproductive and non-reproductive lifespan of Homo sapiens.
A recent analysis supports such benefits for both older males and females, since the presence of post-reproductive women increased the numbers of newborns by 2·7 %, while 18·4 % of the infants in a polygamous society in rural Africa were sired by males aged 50 years and above(Reference van Bodegom64). In support of the statement that ‘nothing in biology makes sense except in the light of evolution’ we therefore conclude that, unless proven otherwise, the presence of a substantial proportion of older males and postmenopausal females in hunter–gatherer, in contrast to primate, societies should be considered as proof of the evolutionary benefit that these individuals confer on their progeny. Finally, we propose that this assumption is only convincing if these individuals were reasonably fit, thereby supporting the concept of healthy ageing. Hence, healthy ageing seems supported both by ethnographic data and by its benefit to hunter–gatherer societies. Other commonly raised arguments against the genome–environment mismatch hypothesis are the potential genetic changes since the Agricultural Revolution, the heterogeneity of ancestral environments and innate human adaptability(Reference Eaton, Cordain and Lindeberg35). Counterarguments to these criticisms have been discussed in great detail elsewhere(Reference Eaton, Cordain and Lindeberg35).
In the present review, a multidisciplinary approach is used, including palaeo-environmental reconstruction, comparative anatomy, biogeochemistry, archaeology, anthropology, (patho)physiology and epidemiology, to assess the characteristics of the ecosystem that supported human evolution. Based on this assessment, an approximation is made of the dietary composition that derives from this ecosystem. Finally, the potential benefit of a return to this ‘Palaeolithic diet’ is discussed and an update is provided on the evidence for the positive health effects of these diets.
Hominins are defined as members of the taxon Hominini, which comprises modern Homo sapiens and its extinct relatives over the past about 7 million years. The oldest-known hominins (Fig. 1) are Sahelanthropus tchadensis from Chad (about 7 Mya(Reference Brunet, Guy and Pilbeam65)) and Orrorin tugenensis from Kenya (about 6–5·7 Mya(Reference Senut, Pickford and Gommery66)). The next oldest are Ardipithecus kadabba (Ethiopia, about 5·8 Mya(Reference Haile-Selassie, Suwa and White67)) and A. ramidus (Ethiopia, about 4·4 Mya(Reference White, Asfaw and Beyene68)), Australopithecus anamensis (Kenya, about 4·1–3·9 Mya(Reference Leakey, Feibel and McDougall69)), Au. afarensis (Ethiopia, Tanzania and maybe Kenya, 3·6–3·0 Mya(Reference Kimbel and Delezene70, Reference Harrison and Harrison71)), Au. bahrelghazali (Chad, about 3·5 Mya(Reference Brunet, Beauvilain and Coppens72)), Kenyanthropus platyops (Kenya, about 3·5 Mya(Reference Leakey, Spoor and Brown73)), Au. garhi (Ethiopia, about 2·5 Mya(Reference Asfaw, White and Lovejoy74)) and Au. africanus (South Africa, about 2·9–2·0 Mya(Reference Herries, Hopley and Adams75)). From these earliest hominins evolved the genera Paranthropus (three known subspecies) and Homo. The earliest species that have been designated Homo are Homo rudolfensis, Homo habilis and Homo erectus sensu lato – including H. ergaster (Eastern Africa, about 2–1·8 Mya): these in turn are the presumed ancestors of Asian H. erectus, H. heidelbergensis (Africa, Eurasia, 0·6–0·3 Mya), H. neanderthalensis (Eurasia, 0·4–0·03 Mya) and H. sapiens (from about 0·2 Mya onwards)(Reference Tattersall, Cunnane and Stewart76–Reference White, Asfaw and DeGusta78). The recently discovered H. floresiensis (0·095–0·013 Mya(Reference Brown, Sutikna and Morwood79)) and the previously unknown hominins from Denisova Cave (about 0·05–0·03 Mya(Reference Reich, Patterson and Kircher80)) show that in the recent past several different hominin lines co-existed with modern humans.
Africa is now generally accepted as the ancestral homeland of Homo sapiens(Reference Stringer77, Reference Stringer81, Reference Templeton82). In several subsequent out-of-Africa waves(Reference Oppenheimer83), hominins of the genus Homo colonised Asia, Australia, Europe and finally the Americas (Fig. 2). Archaic Homo species reached as far as the island of Flores in South-East Asia, East China and Southern Europe (Spain). Homo heidelbergensis remains were found in Africa, Europe and Eastern Asia, while Homo neanderthalensis was restricted to Europe, Western Asia and the Levant. Finally, in the later out-of-Africa diaspora starting about 100 Kya, Homo sapiens reached Australia and the Americas, probably replacing archaic hominins in Africa, Europe and Asia, including those that had left Africa during the earlier out-of-Africa waves. However, there remains some debate(Reference Templeton82, Reference Templeton84–Reference Templeton86) whether or not the gene pool of archaic hominins contributed to that of modern humans. In the replacement theory, archaic hominins make no contribution to the gene pool of modern man, whereas in the hybridisation theories (either through assimilation or gene flow), newly arriving hominins from the later out-of-Africa wave mixed with archaic predecessors. Current evidence from DNA analyses supports the concept that the gene pool of archaic hominins, notably Neanderthals(Reference Green, Krause and Ptak87), but also Denisovans(Reference Reich, Patterson and Kircher80), contributed to the gene pool of Homo sapiens.
The African cradle of humankind is supported by micro-satellite studies(Reference Zhivotovsky, Rosenberg and Feldman88) that reveal that within populations the genetic variation decreases in the following order: sub-Saharan Africa>Eurasia>East Asia>Oceania>America, with the hunter–gatherer Hadzabe of Tanzania separated from the Ju|'hoansi (previously called !Kung) from Botswana by a genetic distance greater than between any other pair of populations(Reference Knight, Underhill and Mortensen89), which indicates the chronology of continent inhabitation and points to South or East Africa as the cradle of humankind(Reference Knight, Underhill and Mortensen89, Reference Henn, Gignoux and Jobin90). Human evolution was characterised by several large-scale decimations, and it has been estimated that the current world population derives from only 1000 surviving individuals at a certain time point(Reference Behar, Villems and Soodyall91). Such bottlenecks(Reference Ambrose92), characterised by strong population decreases, and episodes in which groups of hominins were separated by global climate changes, volcanic winters or geographic boundaries such as mountain ridges or seas, limited gene flow and promoted genetic drift. As a result, different phenotypic races emerged in different geographic regions(Reference Zhivotovsky, Rosenberg and Feldman88, Reference Ambrose92). However, differences among these populations contribute only 3–5 % to genetic diversity, while within-population differences among individuals account for 93–95 % of genetic variation(Reference Rosenberg, Pritchard and Weber93). In other words, genetically we belong to one species that originally evolved in Africa and that for the great majority genetically still resides in the Palaeolithic era. Most of the current inter-individual genetic differences were already existent when Homo sapiens emerged, some 200 Kya(Reference White, Asfaw and DeGusta78).
Bipedalism, hairlessness, speech and the ability to store fat differentiate humans from our closest relatives, the non-human primates, but it is the uniquely large brain, which allowed for symbolic consciousness and the ability to pose ‘what-if’ questions, that finally made humanity(Reference Tattersall, Cunnane and Stewart76).
Changing habitat and increasing brain size
It is assumed that during the early stages of human evolution early hominins introduced more animal food into their diets, at the expense of plant foods(Reference Washburn, Lancaster, Lee and DeVore94, Reference Sailer, Gaulin and Voster95). Subsequent hominins further increased the amount of animal food and consequently the energy density and (micro)nutrient content of their diet, i.e. the dietary quality. While increasing their dietary intake from animal food, early hominins grew taller and increased their brain mass relative to body mass (encephalisation quotient; EQ). Brain mass in primates relates to the number of neurons(Reference Herculano-Houzel96) and global cognition(Reference Deaner, Isler and Burkart97), while the human cortex also has more cycles of cell division compared with other primates(Reference Hill and Walsh98). During hominin evolution the first significant increase in EQ occurred about 2 Mya (Table 1). From about 2 Mya to 200 Kya, human ancestors tripled their brain size, from Australopithecus species with an EQ of 1·23–1·92 to an EQ of 1·41–4·26 for the genus Homo (Reference Broadhurst, Cunnane and Crawford99, Reference Cunnane, Cunnane and Stewart100). The increase in brain size and the number of neurons differentiates Homo from their closest primate relatives. However, a large brain requires an adaptation or an exaptation to accommodate it, and notably sufficient intake of so-called ‘brain-selective nutrients’(Reference Cunnane, Cunnane and Stewart100, Reference Cunnane101) to build and conserve it.
EQ, encephalisation quotient.
* Adapted from Templeton(Reference Templeton85).
† Relative to modern Homo sapiens.
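Note that the EQ values in Table 1 are, per the footnote, normalised relative to modern Homo sapiens. A common absolute formulation is Jerison's allometric EQ, which normalises observed brain mass to the brain mass expected for a generic mammal of the same body mass. The sketch below assumes Jerison's constants (0·12 and a 2/3 allometric exponent); the example masses are round illustrative figures, not values taken from Table 1:

```python
def encephalisation_quotient(brain_g, body_g):
    """Jerison's EQ for mammals: observed brain mass divided by the
    brain mass expected at the given body mass (both in grams).
    The constants 0.12 and 2/3 are Jerison's allometric parameters."""
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

# Round illustrative masses (assumptions, not data from Table 1):
eq_human = encephalisation_quotient(1350, 65_000)  # roughly 7
eq_chimp = encephalisation_quotient(400, 45_000)   # roughly 2.6
```

Dividing such absolute EQ values by the human value would yield a human-relative scale of the kind used in Table 1.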
Building a big brain
Compared with other primates, humans have an extraordinarily large brain(Reference Navarrete, van Schaik and Isler102, Reference Potts103). To understand the expansion of the human brain during evolution, it is important to comprehend its composition and its biochemistry. Brain tissue has a unique profile of long-chain PUFA (LCP)(Reference Broadhurst, Cunnane and Crawford99). Comparison of the brain ethanolamine phosphoglycerols of forty-two studied animal species shows an almost identical LCP pattern, independent of the grade of encephalisation, containing approximately equal proportions of arachidonic acid (AA) and DHA. Consequently, for normal neuronal function, mammalian brain tissue appears to have an invariant structural requirement for both AA and DHA. Both these fatty acids are thus essential building blocks for a big brain and for encephalisation. The weight of a newborn human brain is about 340 g(Reference Blinkov and Glezer104) and it contains about 9 g lipid(Reference White, Widdowson and Woodard105); the brain of a 10-month-old infant is 850 g and contains 52 g lipid. At 3 years, the brain is 1100 g and contains 130 g lipid. Thus, the major part of the human brain growth spurt occurs postnatally(Reference Dobbing and Sands106), implying that especially the newborn infant has high demands for AA and DHA.
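The figures quoted above imply that brain lipid accrues disproportionately fast relative to bulk brain mass. A small sketch, using only the numbers given in the text:

```python
# Brain mass and lipid content as quoted in the text
# (stage, brain mass in g, total lipid in g):
brain_growth = [
    ("birth", 340, 9),
    ("10 months", 850, 52),
    ("3 years", 1100, 130),
]

# Lipid as a fraction of total brain mass at each stage:
lipid_fraction = {stage: lipid / mass for stage, mass, lipid in brain_growth}
# The fraction rises from ~2.6 % at birth to ~11.8 % at 3 years:
# lipid (containing structural AA and DHA) accrues faster than bulk
# brain mass, underscoring the high postnatal demand for these LCP.
```

Between birth and 3 years the brain roughly triples in mass, but its lipid content increases more than fourteen-fold, which is the quantitative core of the postnatal AA and DHA requirement argued here.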
Toothed whales (brain weight 9000 g) and African elephants (4200 g) have brains much larger than humans, but they have lower cognitive abilities and a lower EQ(Reference Roth and Dicke107). These observations substantiate an EQ-centred approach to explain variation in cognition between species. Recent analyses, however, have shown remarkable differences between primate and non-primate brains; a primate brain contains many more neurons than a non-primate brain of similar size(Reference Herculano-Houzel96, Reference Herculano-Houzel108, Reference Herculano-Houzel109) and the absolute number of neurons, rather than the brain relative to body mass ratio (EQ), best predicts cognitive ability(Reference Deaner, Isler and Burkart97), although it still needs to be determined whether humans have the largest number of brain neurons among all mammals. From this new neuron-centred view, there seems to be nothing special about the human compared with the primate brain, except for its size(Reference Herculano-Houzel96), which basically determines both the number of neurons and non-neurons(Reference Azevedo, Carvalho and Grinberg110, Reference Herculano-Houzel111). Detailed comparisons of human and primate brains have revealed other differences, such as different levels of gene expression(Reference King and Wilson112–Reference Caceres, Lachuer and Zapala114), secondary to chromosomal rearrangements(Reference Marques-Bonet, Caceres and Bertranpetit115), differences in the relative extent of the neocortical areas(Reference Herculano-Houzel96, Reference Finlay and Darlington116), the distribution of cell types(Reference Stimpson, Tetreault and Allman117) and the decrease of brain structure volumes with increasing age in man in contrast to chimpanzees(Reference Sherwood, Subiaul and Zawidzki118, Reference Sherwood, Gordon and Allen119).
The best predictor of cognitive ability across species, including comparisons of humans with non-primates, still needs to be established, but rather than EQ or brain size, the absolute number of neurons seems a prudent candidate(Reference Deaner, Isler and Burkart97, Reference Herculano-Houzel108), since there is no clear relationship between neuron number and absolute brain size among different animal species(Reference Herculano-Houzel96, Reference Herculano-Houzel108, Reference Herculano-Houzel109).
Contrary to intuitive belief, growing a large brain and a large skull to accommodate it is less difficult to achieve than it seems at first glance. It was recently shown that different levels of expression of a single gene might have resulted in the markedly different beak shapes and lengths of Darwin's finches. Experimental overexpression of the calmodulin gene in chicken embryos resulted in a significant increase in the length of their beaks(Reference Abzhanov, Kuo and Hartmann120, Reference Patel121). These experiments suggest that small and seemingly insignificant changes can have profound implications for the evolution of anatomical size and shape and thereby provide great potential for explaining the origins of phenotypic variation(Reference Schneider122), including increases in brain and skull size. Analogously, many mutations in humans are associated with either microcephaly(Reference Kaindl, Passemard and Kumar123) or macrocephaly(Reference Williams, Dagli and Battaglia124), while the growth of the skull in hydrocephaly shows that the increased skull size is secondary to the increase of its contents, suggesting that brain rather than skull size is the limiting factor here. The evolution of certain genetic variants associated with brain size has accelerated significantly since the divergence from the chimpanzee some 5–6 Mya. One variant that arose about 37 Kya has spread more rapidly through the human population than could be explained by genetic drift(Reference Evans, Anderson and Vallender125–Reference Evans, Vallender and Lahn128), suggesting that it conferred an evolutionary advantage.
The anatomical and metabolic changes encoded in the genome (see ‘Comparative anatomy’) might have provided hominins with the anatomical and energetic opportunity to, over a period of several million years, steadily increase their brain size, but these mutations per se did not fulfil the nutrient requirements for brain expansion(Reference Cunnane101, Reference Speth129–Reference Gibbons131). The underlying small number of mutations should rather have been accompanied, and most probably have been preceded, by increased availability of ‘brain-specific nutrients’ such as LCP for their ultimate conservation through the process of mutation and selection, which basically underlines both Darwin's concept of the crucial importance of ‘the conditions of existence’ and the secondary role of mutation. An example may come from current knowledge on the sources of AA and DHA. In humans, AA and DHA can be synthesised from their respective precursor essential fatty acids, linoleic acid (LA) and α-linolenic acid (ALA) (Fig. 3). ALA and LA are present in various natural food resources. ALA is predominantly found in plant foods, while LA is mainly found in vegetable oils such as sunflower-seed oil. Both AA and DHA may derive from synthesis from the abundantly consumed precursor fatty acids ALA and LA, but in humans, and especially in neonates, these synthetic activities are insufficient to cope with metabolic demands(Reference Crawford132). Consequently, both these LCP, but especially those of the n-3 series, need to be present in sufficient quantities in our diet. It is still under debate what dietary resource(s) provided the LCP that enabled us to grow a large brain(Reference Cunnane101, Reference Cordain, Eaton and Sebastian133–Reference Langdon137).
The probability of hunting on the savanna
It has been a longstanding paradigm in palaeoanthropology that early human evolution occurred in a dry and open savanna environment(Reference Dart138–Reference Tobias, Cunnane and Stewart140). Recent studies from the Afar basin(Reference White, Asfaw and Beyene68, Reference White, Ambrose and Suwa141), although recently contested(Reference Cerling, Levin and Quade142–Reference Cerling, Wynn and Andanje144), indicated that the habitat of Ardipithecus ramidus at about 4·4 Mya was characterised not by savanna but by woodland to grassy woodland conditions. Human characteristics such as poor water-drinking capacity, excessive urination and transpiration, and poor water retention support the argument that we would be poorly adapted savanna dwellers(Reference Tobias, Cunnane and Stewart140).
A second long-reigning paradigm was ‘man the hunter’, the standard version of human origins advocated for many years. Washburn & Lancaster(Reference Washburn, Lancaster, Lee and DeVore94) referred at most to our most recent ancestors, Homo sapiens and possibly H. neanderthalensis, when they claimed that our intellect, interests, emotions and basic social life are evolutionary products of the hunting adaptation. The strongest argument against this hunting paradigm comes from combined studies of past and present-day hunter–gatherer societies, which indicate that the role of hunting is exaggerated (notably in stories told around the campfire), since the majority of the dietary protein is in reality obtained by women gathering nuts, tubers and small animals(Reference Woodburn, Lee and DeVore145–Reference Stanford147). Cordain et al. (Reference Cordain, Miller and Eaton148) showed that only 25–35 % of the energy (en%) of subsistence in worldwide hunter–gatherer communities is derived from hunting, while the remainder derives from gathered plant foods and fished foods. Thus, while meat from large game may have been the most valued food, it is highly unlikely that it was the most valuable (nutritionally important) food resource from a dietary perspective(Reference Marlowe41, Reference Stanford149). At present, the niche of early hominins, and thus the environment of human evolution and, most importantly for the present review, the nutritional composition of the early human diet, are still heavily debated(Reference Ungar and Sponheimer150).
Reconstruction of our ancient diet
In the next sections we will discuss various views on (changes in) the hominin ecological niche that over time shaped the human genome into what it currently is.
Sahelanthropus, Orrorin and Ardipithecus
In the late Miocene (up to 5·3 Mya), the African continent became more arid, which resulted in fragmentation of the (sub)tropical forests and the appearance of more open environments(Reference Bernor151). The widespread dispersal of some of the earliest hominins, such as Sahelanthropus (Reference Brunet, Guy and Pilbeam65, Reference Lebatard, Bourles and Duringer152) and Australopithecus bahrelghazali from Chad, might be explained by the presence of the relatively low-lying humid East–West corridor constituted by the remnants of the Cretaceous Central African and Sudan Rifts between Western and Eastern Africa(Reference Bosworth and Morley153, Reference Joordens154). The reconstructed environment of Sahelanthropus (about 7 Mya) suggests a mosaic of gallery forest at the edge of a deep, well-oxygenated lake, swampy and vegetated areas, and extensive grasslands(Reference Vignaud, Duringer and Mackaye155). Since there is no indication of carnivore modification or fluvial transport of its bones, Sahelanthropus chadensis probably lived in this area(Reference Stewart, Cunnane and Stewart156). The palaeo-environment of Orrorin (about 6 Mya) was probably characterised by open woodland, with dense stands of trees in the vicinity and possibly fringing the lake margin and/or streams that drained into the lake(Reference Pickford and Senut157). Ardipithecus kadabba (5·6 Mya) remains are associated with wet and closed, grassy woodland and forest habitats around lake or river margins(Reference WoldeGabriel, Haile-Selassie and Renne158). Ardipithecus ramidus (4·4 Mya) lived in or near a groundwater-supported grassy woodland to forest(Reference WoldeGabriel, Ambrose and Barboni159). The abundance of fossilised shallow-water aquatic species such as catfish, barbus, cichlids and crocodiles additionally suggests an episodically present flood-plain environment(Reference WoldeGabriel, Ambrose and Barboni159).
Early Australopithecus species
Australopithecus anamensis appeared at about 4·2 Mya and its environment was characterised by a mix of wetland and terrestrial environments, such as lacustrine and fluvial floodplains, woodland and gallery forest(Reference Stewart, Cunnane and Stewart156, Reference Feibel, Harris, Brown and Harris160–Reference Schoeninger, Reeser and Hallin163). The later Australopithecus afarensis survived in a variety of habitats(Reference Reed164), but apparently thrived better in the more wooded and humid conditions of the Afar basin than in the relatively dry Laetoli area(Reference Su and Harrison165). Stewart(Reference Stewart, Cunnane and Stewart156) pointed out that in Africa the only environmental constant at hominin sites throughout the period from 3·4 to 2·9 Mya was a wetlands habitat, characterised by aquatic herbaceous vegetation around lakes and rivers, with large populations of wetland fauna such as reduncines and hippopotami. Hence, these wetlands could have served as a refuge for early hominins throughout an extensive period of human evolution.
Paranthropus, late Australopithecus and Homo species
About 2·9–2·5 Mya, tectonic and global climatic changes made Africa cooler and drier(Reference Veldkamp, Buis and Wijbrands166–Reference Bartoli, Sarnthein and Weinelt169). The great wet forests of middle Africa retreated and gave way to more savanna grasslands. It is around this time, from about 2·6 Mya onwards, that the first traces of the new hominin genus Homo appear in the archaeological record(Reference Tobias, Cunnane and Stewart140). It has been suggested that alternating wet and dry periods after 2·7 Mya could have isolated hominin populations around sources of potable water, while forcing them to the extremes of their conditions of existence(Reference Stewart, Cunnane and Stewart156) and thus facilitating specialisation(Reference Potts170), either by adaptation or exaptation. Compared with Australopithecus, Paranthropus existed in slightly more open habitats, including wetlands and grasslands, but also in woodland and bushland areas. The habitats of Homo species seem similar to those utilised by Paranthropus species(Reference Reed161), but Homo remains at Olduvai Gorge and Koobi Fora are associated with well-vegetated swamps, lakes and river margins, and (semi-)aquatic fauna(Reference Pobiner, Rogers and Monahan171, Reference Ashley, Tactikos and Owen172). Only the later Homo species are also found in assemblages that indicate extremely arid and open landscapes such as savanna(Reference Reed161). It has also been suggested that hominins and other (aquatic) species dispersed throughout Africa along water systems, while even the last out-of-Africa migration might have occurred via the ‘green Sahara’ that existed during the last interglacial (125 Kya)(Reference Stewart, Cunnane and Stewart156, Reference Drake, Blench and Armitage173).
In conclusion, the palaeo-environmental evidence suggests that early hominins lived in the proximity of water. However, it is frequently argued that bones are preferentially preserved in lake, river or fluvial sediments, making their recovery in any other than an aquatic setting unlikely(Reference Sikes174). Alternatively, hominin remains may have been relocated to the water by carnivores(Reference Ward, Leakey and Walker162), including crocodiles. Nevertheless, the combined evidence strongly suggests that early hominins frequented the land–water ecosystem and thus lived there. Joordens et al. (Reference Joordens154) proposed, based on comparison with other terrestrial omnivores, that the default assumption should be that hominins living in freshwater or marine coastal ecosystems with catchable aquatic resources could have consumed these aquatic resources(Reference Joordens154, Reference Joordens, Wesselingh and de Vos175).
The diet of our closest relatives
Field studies on our closest relatives, the extant apes, show that their preferred food items are primarily fruits and/or leaves and stems from terrestrial forests. Lowland gorillas, for example, derive 57 % of their metabolisable energy (en%) from SCFA derived from colonic fermentation of fibre, 2·5 en% from fat, 24 en% from protein and 16 en% from carbohydrate(Reference Popovich, Jenkins and Kendall176). ‘Fallback foods’ are consumed when preferred foods are unavailable(Reference Marshall and Wrangham177) and are generally composed of herbaceous plants and high-fibre fruits from aquatic and terrestrial environments(Reference Stewart, Cunnane and Stewart156). Like our closest relatives(Reference Stewart, Cunnane and Stewart156, Reference Nishida178–Reference Wrangham, Cheney and Seyfarth181), hominins might have used foods from the aquatic environment as fallback foods; although speculative, this niche might eventually have proven favourable with regard to subsequent encephalisation.
Teeth morphology and dental microwear
Comparative anatomy (Fig. 4) of the hominins may provide some information about the fallback and preferred foods of our ancestors. Dental studies of Sahelanthropus (Fig. 1) show that its teeth had thick enamel(Reference Brunet, Guy and Pilbeam65), similar to orangutans, suggesting that it could eat hard and tough foods(Reference Vogel, van Woerden and Lucas182), such as those available from the lakeshore vegetation(Reference Stewart, Cunnane and Stewart156). Ardipithecus ramidus, however, had thin molar enamel and smaller teeth compared with later hominins. This dental morphology is consistent with a partially terrestrial, omnivorous/frugivorous niche(Reference Suwa, Kono and Simpson183). Studies on cranio-dental changes such as tooth size, tooth shape, enamel structure and jaw biomechanics indicate that Australopithecus and Paranthropus had prominent jaws, relatively flat molar teeth, small incisors and thick enamel, suitable for breaking and crushing small, hard, brittle foods such as fruits, nuts and underground storage organs(Reference Ungar and Sponheimer150), but unsuitable for breaking down tough plant foods or tearing meat. Together, this would allow early hominins to eat both hard and soft, abrasive and non-abrasive foods, well suited to life in a variety of habitats(Reference Teaford and Ungar184).
In addition to studies on tooth morphology, microwear studies are essential. Dental microwear studies analyse tooth wear, showing what teeth were actually used for and thus what an animal actually ate(Reference Walker, Hoeck and Perez185). While adaptive morphology gives important clues about what a species was capable of eating, microwear reflects what an animal ate during some period of its lifetime. In these studies, ‘complexity’ is used as an indicator for hard and brittle items, while ‘anisotropy’ is an indicator for tough foods(Reference Ungar, Grine and Teaford186).
Most primates show either low complexity combined with high anisotropy, indicative of the consumption of tough foods such as leaves, stems and meat, or high complexity with low anisotropy, associated with hard, brittle foods such as nuts and seeds(Reference Ungar and Sponheimer150, Reference Ungar, Scott and Grine187). Ardipithecus ramidus's preference for an omnivorous/frugivorous diet was confirmed by microwear studies(Reference Suwa, Kono and Simpson183), suggesting a diet of fleshy fruits and soft young leaves(Reference Ungar and Sponheimer150). Conversely, microwear textures of Australopithecus afarensis and A. anamensis show striations rather than pits (low complexity and low anisotropy), i.e. patterns similar to those of grass-eating and folivorous monkeys instead of the predicted diets predominated by hard and brittle foods(Reference Ungar, Scott and Grine187, Reference Grine, Ungar and Teaford188). Australopithecus africanus showed microwear patterns that were more anisotropic, suggestive of the consumption of tough leaves, grasses and stems(Reference Scott, Ungar and Bergstrom189). Paranthropus robustus, also known as the ‘Nutcracker Man’, had enormous, flat, thickly enamelled teeth combined with a robust cranium, mandible and powerful chewing muscles, suggestive of breaking hard and brittle foods(Reference Ungar, Grine and Teaford186). After microwear analysis of its teeth, however, Ungar et al. (Reference Ungar and Sponheimer150, Reference Ungar, Grine and Teaford186) showed that P. robustus had low complexity and anisotropy; thus Paranthropus might only have consumed mechanically challenging items as fallback foods when preferred foods were unavailable. Similarly, microwear studies support the notion that the diet of P. boisei contained large quantities of low-quality vegetation, rather than hard objects(Reference Cerling, Mbua and Kirera190).
Generally, microwear studies confirm earlier dental topography studies(Reference Ungar191), which revealed the incorporation of more fracture-resistant foods, i.e. tougher foods such as leaves, woody plants, underground storage organs and animal tissues, in the diet of Australopithecus africanus compared with Australopithecus afarensis, and of P. robustus compared with Australopithecus africanus (Reference Ungar, Scott and Grine187). Dental topographic analysis suggested that successive Homo species placed greater emphasis on tougher, more elastic foods, perhaps including meat(Reference Ungar191). The latter suggestion is in line with the optimal foraging theory, which states that humans prefer foods with high energy density over those with low energy density(Reference Ulijaszek192, Reference Goldstone, de Hernandez and Beaver193).
Microwear studies confirmed that early Homo, such as H. erectus and H. habilis, did not prefer fracture-resistant foods, although some H. erectus specimens showed more small pits than H. habilis, suggesting that none of the early Homo species specialised in very hard, brittle or tough foods, but rather could consume a varied diet(Reference Ungar, Grine and Teaford194). This does not imply that early Homo had very broad diets, but rather that early Homo was adapted to subsist in a range of different environments, providing an evolutionary advantage amid the climatic fluctuations and the mosaic of habitats in Africa during the late Pliocene(Reference Ungar, Grine and Teaford195). A study on dental microwear of 300 000-year-old H. heidelbergensis teeth from Sima de los Huesos in Spain showed striation patterns that indicated a highly abrasive diet, with substantial dependence on poorly processed plant foods such as roots, stems and seeds(Reference Perez-Perez, De Castro and Arsuaga196). Lalueza et al. (Reference Lalueza, PerezPerez and Turbon197) compared teeth of very recent hunter–gatherers (Inuit, Fueguians, Bushmen, Aborigines, Andamanese, Indians, Veddahs, Tasmanians, Lapps and Hindus) with Middle and Upper Pleistocene fossils. Their results indicate that some Neanderthals resemble carnivorous groups, while archaic H. sapiens show a more abrasive diet, partly dependent on vegetable materials.
Overall, remarkably few studies have related microwear patterns to hominin diet for the period between 1·5 Mya and 50 Kya (PS Ungar, personal communication). Studies in more recent hominins, such as those from an Upper Palaeolithic site in the Levant (22 500–23 500 years before present; BP), showed a high frequency of long narrow scratches and few small pits, suggesting a tough, abrasive diet of aquatic foods rather than a diet of hard foods that needed compressive force(Reference Mahoney198, Reference Mahoney199). A study of subsequent local hunter–gatherers (12 500–10 250 BP) and farmers (10 250–7500 BP) living in the Levant showed larger dental pits and wider scratches among the farmers compared with the hunter–gatherers, suggesting that the implementation of agriculture led to a more fracture-resistant diet(Reference Mahoney199).
Gut morphology, energy expenditure and muscularity
Gut morphology studies(Reference Milton200, Reference Milton201) support the introduction of animal foods, at the expense of vegetable foods, into the diet of early Homo. The dominance of the colon (>45 %) in apes indicates adaptation to a diet rich in bulky plant material, such as plant fibre and woody seeds. In contrast, the human gut is dominated by the small intestine (>56 %), suggesting adaptation to a highly digestible diet and indicating a closer structural analogy with carnivores than with folivorous or frugivorous mammals. Importantly, the shorter gut of Homo, as compared with primates, might have had another advantage. During the evolution from Ardipithecus and the Australopithecines to early Homo, the improvement in dietary quality coincided with increases in height, brain size and brain metabolic activity. However, an increase in body size coincides with increased daily energy demands, notably during gestation(Reference Gittleman and Thompson202) and lactation(Reference Oftedal203). It was, for example, calculated that the daily energy expenditure of a Homo erectus female was about 66 % higher than that of an Australopithecine female, and almost 100 % higher in a lactating Homo erectus female compared with a non-lactating, non-pregnant Australopithecine(Reference Aiello and Key204). These high energy demands might have been met by increased female fat reserves(Reference Aiello and Wells205, Reference Cunnane and Crawford206), as demonstrated by the presence of female steatopygia in some traditional human populations, such as the Khoisan of southern Africa.
Apart from the extra energy needed for reproduction and increasing height, the brain of a modern adult human uses 20–25 en% of the total RMR, while this value is 8–9 en% for a primate(Reference Leonard, Robertson and Snodgrass207). It has been postulated that the extra energy needs were not met by a general increase in RMR, but partly by a concomitant reduction of the gastrointestinal tract(Reference Aiello and Wheeler208) and a reduction in muscularity(Reference Leonard, Robertson and Snodgrass207). These features are combined in the ‘expensive tissue hypothesis’ of Aiello & Wheeler(Reference Aiello and Wheeler208), which points to the observation that the mass of the human gastrointestinal tract is only 60 % of that expected for a similar-sized primate and that humans are relatively under-muscled compared with other primates(Reference Leonard, Robertson and Snodgrass207). Interestingly, the negative relationship between gut and brain size across anthropoid primates(Reference Aiello and Roebroeks209) was confirmed in a study of highly encephalised fish(Reference Kaufman, Hladik and Pasquet210), whereas it was falsified in mammals(Reference Navarrete, van Schaik and Isler102). Unfortunately, however, the latter study did not include marine mammals(Reference Navarrete, van Schaik and Isler102), while it is questionable whether, with respect to brain development, humans adhere to general mammalian rules(Reference Potts103). Other adaptations might have reduced energy expenditure as well. The short human inter-birth interval(Reference Aiello and Key204), compared with inter-birth intervals of 4–8 years in the gorilla, chimpanzee and orangutan(Reference Galdikas and Wood211), reduces the most expensive part of reproduction, i.e. lactation(Reference Isler and van Schaik212, Reference Isler213), while the shift from quadrupedal to bipedal locomotion also reduced daily energy expenditure(Reference Pontzer, Raichlen and Sockol214).
Together, all of these adaptations allowed for an increased daily energy expenditure, including the reallocation of energy to the metabolically active brain, but they became possible only after Homo included more energy-dense foods in its diet. Brain mass in primates is positively related to dietary quality and inversely to body weight(Reference Leonard, Robertson and Snodgrass207). Generally, a shift towards more energy-dense foods implies a shift from primarily carbohydrate-rich vegetables to fat- and protein-rich animal foods. However, it has also been suggested that a shift from the complex carbohydrates in leafy vegetables towards underground storage organs, such as tubers, might have provided easy-to-digest carbohydrates(Reference Milton200, Reference Wrangham, Jones and Laden215) to support a larger hominin body mass.
A phenotypic specialisation on non-preferred resources, without compromising the ability to use preferred resources, is known as Liem's paradox(Reference Robinson and Wilson216). This paradox is important to keep in mind during attempts to reconstruct the preferred diet from the available evidence: not all physical characteristics can be taken as unambiguous proof of the preferred diet of early hominins. Interestingly, Stewart(Reference Stewart, Cunnane and Stewart156) recently noted that, with regard to Paranthropus's phenotype and its fallback foods, there is just as much evidence for a ‘Shellcracker Man’ as there is for a ‘Nutcracker Man’. Thus, a phenotypic characterisation needs support from other studies to confirm adaptation to preferred rather than fallback foods. The increasing absolute body size and brain size and the reduction in gut size, however, do indicate a shift from low- to high-dietary-quality foods for more recent hominins.
Evidence from the strontium:calcium ratio
Based upon the principle ‘you are what you eat’(Reference Harris217), several techniques have been developed to study early hominin diets(Reference Sponheimer, Dufour, Hublin and Richards218). Trace element studies first started with Sr:Ca ratios, which decrease as an animal moves up the food chain, secondary to the biological discrimination against strontium(Reference Sillen and Kavanagh219). A first study(Reference Sillen220) showed that Paranthropus robustus (Fig. 1) had lower Sr:Ca ratios compared with the contemporaneous Papio (baboon) and Procavia (hyrax), suggesting that Paranthropus was not an exclusive herbivore. However, a subsequent study showed that two Homo specimens had a higher Sr:Ca ratio than Paranthropus (Reference Sillen, Hall and Armstrong221). Since Homo had been assumed to consume more animal foods than Paranthropus, and thus to have a lower Sr:Ca ratio, these higher Sr:Ca ratios needed explanation. For example, consumption of specific foods with a high Sr:Ca ratio might have increased the ratio in Homo compared with Paranthropus. Foods in the area of research with elevated Sr:Ca ratios were mainly geophytes: perennial plants with underground food storage organs, such as roots, bulbs, tubers, corms and rhizomes. Consequently, this discrepancy has been attributed to the consumption of underground storage organs by early Homo (Reference Wrangham, Jones and Laden215, Reference Sillen, Hall and Armstrong221). The use of these underground storage organs may have become necessary from the start of a first period of aridity about 2·8 Mya, when forest was replaced by drier woodlands, forcing hominins to search for available resources around water margins(Reference Conklin-Brittain, Wrangham and Hunt222).
Although promising at first, the use of Sr:Ca ratios was found to suffer from several limitations. One problem is that hominin Sr:Ca ratios in fossilised bones alter with time(Reference Lee-Thorp and Sponheimer223), a process known as diagenesis. This problem can be circumvented by the use of tooth enamel, which is less susceptible to diagenesis than bone(Reference Lee-Thorp and Sponheimer223). Subsequent studies showed that Australopithecus africanus had a higher Sr:Ca ratio in its enamel compared with Paranthropus and contemporaneous browsers, grazers, baboons and carnivores(Reference Sponheimer, de Ruiter and Lee-Thorp224–Reference Balter and Simon226). The interpretation of these findings, however, remains a subject of debate. For example, a group of brown seaweeds and some aquatic plants discriminate against Ca, resulting in an increase in their Sr:Ca ratio, and likewise in the Sr:Ca ratio of the fish feeding on them(Reference Balter and Simon226, Reference Ophel and Fraser227). Similarly, leaves from trees have lower Sr:Ca ratios compared with grasses, which is subsequently reflected in the Sr:Ca ratios of browsers and grazers, respectively(Reference Ambrose and Deniro228). Thus, it might be argued that the relatively high Sr:Ca ratios in Australopithecines and Homo reflect their consumption of aquatic resources, or of animals such as insects or other small animals feeding on grasses. For now, the exploration of Sr:Ca ratios in particular as a dietary proxy has largely stalled(Reference Lee-Thorp, Sponheimer and Passey229).
Evidence from the barium:calcium ratio
Another trace element ratio that might provide information on the composition of the early hominin diet is the Ba:Ca ratio, particularly when used in a multiple-element analysis with Sr:Ca and Sr:Ba ratios(Reference Sponheimer, de Ruiter and Lee-Thorp224, Reference Sponheimer and Lee-Thorp230). Combined Ba:Ca and Sr:Ba ratios clearly differentiate grazers from browsers and carnivores in both modern and fossil mammals(Reference Sponheimer and Lee-Thorp230). Hominins have a lower Ba:Ca and a higher Sr:Ba ratio compared with grazers and browsers. Paranthropus shows considerable similarity for both ratios with both carnivores and Papionins (baboons), while Australopithecus shows an even higher Sr:Ba ratio. The unusual combination of a high Sr:Ca and low Ba:Ca ratio in hominins (and baboons) has also been observed in some animals, such as warthogs and mole rats, that make extensive use of underground resources(Reference Sponheimer, Lee-Thorp and de Ruiter231). Finally, the high Sr:Ba ratio observed in Australopithecus might derive from the consumption of grass seeds, which have Sr:Ba ratios three to four times higher than grass straw; consumption of these seeds is also consistent with stable-isotope evidence (see below) showing that Australopithecus derived a substantial part of its diet from C4 resources. Although not included in any study so far, Sr:Ba ratios in aquatic foods might add to the understanding of the unusual combination of low Ba:Ca and high Sr:Ba ratios.
Evidence from the 13C:12C ratio
Fractionation studies of carbon isotopes differentiate between different routes of photosynthesis. While most tropical African woody plants from forests (such as fruits, leaves, trees, roots, bushes, shrubs and forbs) use the C3 photosynthetic pathway, some South African(Reference Vogel232) and East African(Reference Cerling, Harris and MacFadden233) grasses and sedges use the C4 photosynthetic pathway. C4 plants such as sedges (for example, Cyperus papyrus) typically occur in a mosaic of extensive seasonal and perennial shallow freshwater wetlands that can also be found in savanna and ‘bushvelds’ receiving summer rainfall(Reference Peters and Vogel234). The real impact of C4 plants occurred with their spread into Eastern and Southern Africa during the Pliocene(Reference Segalen, Lee-Thorp and Cerling235). Tissues of plants that utilise the C4 pathway have a relatively high content of the stable carbon isotope 13C (about 1·1 % of carbon), since C3 plants discriminate more strongly against 13CO2 during photosynthesis. As a result, C3 and C4 plants have quite different 13C:12C ratios in their tissues, as have the herbivorous animals that feed on these plants(Reference Ambrose and Deniro228, Reference Vogel232) and the carnivores that prey on these herbivores(Reference Goldstone, de Hernandez and Beaver193, Reference Cunnane and Crawford206). Differences are expressed as δ13C values in parts per thousand (‰) relative to the 13C:12C ratio (R) of the reference standard: δ13C = ((Rsample − Rstandard)/Rstandard) × 1000. The standard (named Pee Dee Belemnite; PDB) is the carbonate obtained from the fossil of a marine Cretaceous cephalopod (Belemnitella americana), which is highly enriched in 13C (13C:12C ratio = 0·0112372). Consequently, most animals have negative δ13C values (Fig. 5). The δ13C ranges from − 35 to − 21 ‰ (mean − 26 ‰) in C3 plants and from − 14 to − 10 ‰ (mean − 12 ‰) in C4 plants(Reference Lee-Thorp and Sponheimer225).
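The δ13C definition above can be checked numerically. A minimal sketch, using the PDB standard ratio given in the text; the sample ratio below is a hypothetical value chosen to illustrate a typical C3 plant:

```python
# delta-13C expressed in parts per thousand (per mil) relative to the PDB standard
R_PDB = 0.0112372  # 13C:12C ratio of the Pee Dee Belemnite standard (from the text)

def delta13C(r_sample: float, r_standard: float = R_PDB) -> float:
    """delta13C = ((R_sample - R_standard) / R_standard) * 1000, in per mil."""
    return (r_sample - r_standard) / r_standard * 1000.0

# C3 plants discriminate strongly against 13CO2, so their 13C:12C ratio falls
# below the 13C-enriched standard and delta13C comes out negative.
# Hypothetical ratio chosen to land near the C3 plant mean of -26 per mil:
print(round(delta13C(0.010945), 1))  # -26.0
```

The sign convention follows directly from the enriched standard: any sample with a lower 13C:12C ratio than PDB, i.e. most biological material, yields a negative δ13C.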
Studies that attempted to reconstruct mammalian food webs indicated that carbon is slightly enriched (1–2 ‰) with each trophic step(Reference Goldstone, de Hernandez and Beaver193, Reference Cunnane and Crawford206). To facilitate comparison of current and historical animals, δ13C analyses are predominantly performed on hard tissues, such as bone collagen and enamel (notably apatite), since these constitute the majority of the fossil record. It was shown that enamel mineral is enriched by about 13 ‰ compared with dietary δ13C(Reference Sponheimer, Lee-Thorp, de Ruiter and Ungar238), while collagen is enriched by about 5 ‰ compared with dietary δ13C(Reference Lee-Thorp and Sponheimer225). Collagen from terrestrial mammalian C3 herbivores shows a value of − 21 ‰ (range − 22 to − 14 ‰), while in collagen of C4 herbivores this value is − 7 ‰ ( − 12 to − 6 ‰). Their respective carnivores show collagen values of − 19 ‰ ( − 21 to − 14 ‰) and − 5 ‰ ( − 8 to − 2 ‰)(Reference Lee-Thorp and Sponheimer225). Marine phytoplankton, which uses the C3 pathway, shows an average value of − 22 ‰(Reference Kelly239). Collagen of marine fish shows a range from − 15 to − 10 ‰, while the values in collagen of reef fish range from − 8 to − 4 ‰(Reference Schoeninger and Deniro240). Marine carnivores have, similar to terrestrial carnivores, intermediate values around − 14 ‰, ranging from − 10 ‰ in collagen of sea otters to − 15 ‰ in collagen of the common dolphin(Reference Kelly239, Reference Schoeninger and Deniro240). Thus, the δ13C values of marine carnivores also compare well with those of their prey. Finally, δ13C values for the muscle of freshwater fish species range from − 24 to − 13 ‰, but considerable variation may exist between different lakes(Reference Mbabazi, Makanga and Orach-Meza241). Unfortunately, no δ13C data are available for terrestrial piscivorous carnivores, but the consistency of the other data suggests that these might be between − 24 and − 13 ‰, i.e. comparable with their prey (Fig. 5).
Consistent with the data above, Schoeninger et al. (Reference Schoeninger, Deniro and Tauber242) showed that European agriculturalists consuming C3 grasses had much lower δ13C values in bone collagen ( − 21 to − 19 ‰) compared with Mesoamerican agriculturalists consuming C4 maize ( − 7 to − 5 ‰), while North American and European fisher–gatherers had intermediate values ( − 15 to − 11 ‰). However, bone collagen proved less reliable for studying early hominin diets. For that purpose, the 13C:12C ratio is preferably measured in tooth enamel. To compare collagen δ13C values with enamel δ13C values, an additional (8 ‰) correction has to be made. Similar to the clear distinctions between the bone collagen of modern grazers, browsers and their carnivores, tooth enamel data from fossilised fauna from South Africa showed similar differences between Plio-Pleistocene C3 feeders ( − 11·5 ‰) and C4 feeders ( − 0·5 ‰), with Australopithecus, Paranthropus and Homo taking intermediate positions ( − 10 to − 4 ‰)(Reference Sponheimer, Lee-Thorp and de Ruiter231, Reference Peters and Vogel234, Reference Lee-Thorp, Thackeray and van der Merwe237, Reference Sponheimer and Lee-Thorp243), which compared well with the values for contemporaneous felids ( − 10 to − 0·5 ‰)(Reference Sponheimer, Lee-Thorp and de Ruiter231, Reference Lee-Thorp, Thackeray and van der Merwe237, Reference Lee-Thorp, Sponheimer and Luyt244) (Fig. 5). These results clearly demonstrate that a significant proportion of the diets of the early hominins from Swartkrans, Makapansgat and Sterkfontein derived from C4 resources. Using these data, it was calculated that South African Paranthropus derived 14–47 % of its diet from C4 sources, compared with 5–64 % in Australopithecus and 20–35 % in Homo (Reference van der Merwe, Masao and Bamford245).
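How an enamel δ13C value translates into a dietary C4 fraction can be illustrated with a two-end-member linear mixing model, a standard isotope approach; the original authors' exact calculation may differ. The end-member enamel values ( − 11·5 ‰ for pure C3 feeders, − 0·5 ‰ for pure C4 feeders) are taken from the text:

```python
# Two-end-member linear mixing model (a standard isotope-ecology sketch,
# not necessarily the method used in the cited studies).
D13C_C3 = -11.5  # enamel delta13C of Plio-Pleistocene pure C3 feeders (per mil)
D13C_C4 = -0.5   # enamel delta13C of pure C4 feeders (per mil)

def c4_fraction(d13c_enamel: float) -> float:
    """Fraction of diet from C4 resources, linearly interpolated and clipped to [0, 1]."""
    f = (d13c_enamel - D13C_C3) / (D13C_C4 - D13C_C3)
    return min(max(f, 0.0), 1.0)

# A hominin enamel value of -10 per mil implies roughly 14 % C4 intake,
# matching the lower bound reported for South African Paranthropus:
print(round(c4_fraction(-10) * 100))  # 14
```

The model is only as good as its end members: shifting either reference value moves every inferred percentage, which is one reason the reported hominin C4 ranges are so wide.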
A second study in Olduvai showed that the Tanzanian Paranthropus boisei derived 77–81 % of its diet from C4 resources, compared with 23–49 % for Homo(Reference van der Merwe, Masao and Bamford245). The low nutritional value of grasses and the microwear studies (see above) render it unlikely that hominins were directly eating grass(Reference Sponheimer, Lee-Thorp and de Ruiter231, Reference Sponheimer and Lee-Thorp243). Analogously, sizeable carnivory of C4-consuming mammals (such as cane rats, hyraxes or juvenile bovids) was argued to be practically impossible and thus unable to leave a strong C4 signature(Reference Sponheimer and Lee-Thorp243). However, at about 1·8 Mya, there were extensive wetlands in the Olduvai area, where a river from the Ngorongoro mountains entered the area, while at 1·5 Mya the Peninj river produced wetlands near Lake Natron(Reference Stewart, Cunnane and Stewart156). Some researchers investigated the edible plants in a present-day wetland (Okavango Delta) and found that the rhizomes and culms of three species of C4 sedges were edible, the most common of which is Cyperus papyrus (Reference van der Merwe, Masao and Bamford245). However, it seems unlikely that the C4 signature in all early hominins derived from the consumption of papyrus. It was recently suggested that P. robustus and especially P. boisei had a diet of primarily C4 resources, most probably grasses or sedges, from savanna or wetland environments, respectively(Reference Cerling, Mbua and Kirera190). Theoretically, a good source of C4 foods would be a seasonal freshwater wetland with floodplains and perennial marshlands, with an abundance of easily accessible aquatic foods, large aggregations of nesting birds and calving ungulates(Reference Peters and Vogel234). Consumption of termites could have contributed to the high C4 signature observed in hominin fossils (Fig. 
5), but it seems unlikely that termites could explain values as high as 50 % of the diet from C4(Reference Sponheimer and Lee-Thorp243). Finally, an enamel C4 signature of − 10 to − 4 ‰ in hominins, which translates into a soft tissue signature of − 23 to − 17 ‰ and a collagen signature of − 18 to − 12 ‰ (see above), might also derive from the consumption of small freshwater aquatic animals or fish, since these values compare well with the δ13C values of − 24 to − 13 ‰ for freshwater fish(Reference Mbabazi, Makanga and Orach-Meza241) and − 18 to − 9 ‰ in collagen of crustaceans and arthropods(Reference Schoeninger and Deniro240), respectively. Moreover, δ13C values for hominins are similar to those reported for marine mammal hunters, freshwater fish, crustaceans, fisher–gatherers, marine and freshwater carnivores and marine fish (Fig. 5).
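The tissue-to-tissue translation used here follows directly from the diet-to-enamel (13 ‰) and diet-to-collagen (5 ‰) offsets given earlier; a small sketch under those assumptions (the helper names are ours, not from the source):

```python
def enamel_to_collagen(enamel_d13c):
    """Collagen = diet + 5 and enamel = diet + 13, hence collagen = enamel - 8."""
    return enamel_d13c - 8.0

def enamel_to_soft_tissue(enamel_d13c):
    """Soft tissue is taken here to track dietary delta-13C, i.e. enamel - 13."""
    return enamel_d13c - 13.0

# The hominin enamel range of -10 to -4 per mil maps onto collagen
# -18 to -12 per mil and soft tissue -23 to -17 per mil, as in the text.
print(enamel_to_collagen(-10.0), enamel_to_soft_tissue(-10.0))  # -18.0 -23.0
```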
In agreement with the variability selection hypothesis of Potts(Reference Potts170), which states that large disparities in environmental conditions were responsible for important episodes of adaptive evolution, the wide range in δ13C values, particularly in Australopithecus, suggests that early hominins utilised a wide range of dietary sources, including C4 resources. This contrasts with chimpanzees, which, even in the most arid and open areas of their range, are known to consume negligible amounts of C4 resources, despite their local abundance. Consequently, chimpanzees show very little variability in their δ13C carbon signature(Reference Schoeninger, Moore and Sept246, Reference Sponheimer, Loudon and Codron247). This underscores that even if contemporaneous chimpanzees and early hominins inhabited similar habitats, hominins had broadened their dietary range sufficiently to survive in habitats uninhabitable by chimpanzees. The latter assumption provides an interesting perspective on the recent data, which suggest that C4 foods were absent from the diet of Ardipithecus ramidus at 4·4 Mya. Consequently, it has been proposed that C4 foods were introduced into the hominin diet in the period between 3 and 4 Mya(Reference Lee-Thorp, Sponheimer and Passey229).
Limited evidence from the 15N:14N ratio
Another stable-isotope ratio that has received considerable attention is the N isotope (15N:14N) ratio. A number of food web studies have shown that each step in the food chain is accompanied by 3–4 ‰ enrichment in δ15N(Reference Ambrose and Deniro228, Reference Kelly239) and that δ15N can therefore be useful as a trophic level indicator. Additionally, animals feeding in marine ecosystems have higher values compared with animals feeding on terrestrial resources(Reference Schoeninger, Deniro and Tauber242). For example, North American and European fisher–gatherers and North American marine mammal hunters and salmon fishers had much higher δ15N values (+13 to +20 ‰) compared with agriculturalists (+6 to +12 ‰). Analyses of phyto- and zooplankton suggest that freshwater organisms have δ15N values intermediate between those of terrestrial and marine organisms(Reference Schoeninger, Deniro and Tauber242). δ15N values are routinely measured in bone collagen, but it has been shown that good-quality collagen (preserving the original δ15N value) can survive, and then only under favourable conditions, up to a maximum of 200 000 years(Reference Lee-Thorp and Sponheimer225). This limits δ15N isotopic studies to Late Pleistocene hominins (see below), but with improved technology, future studies using collagen extracted from tooth enamel may expand their application to early hominins(Reference Sponheimer, Lee-Thorp, de Ruiter and Ungar238).
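The 3–4 ‰ per-step enrichment implies a standard trophic-level estimate, sketched below with an assumed mid-range step of 3·5 ‰ (the function name and baseline convention are illustrative, not from the source):

```python
def trophic_level(d15n_consumer, d15n_base, step=3.5, base_level=1.0):
    """Estimate trophic level from delta-15N, assuming ~3-4 per mil
    enrichment per step (3.5 per mil used here) above a producer baseline."""
    return base_level + (d15n_consumer - d15n_base) / step

# A consumer 7 per mil above the local plant baseline sits ~2 steps higher,
# i.e. around trophic level 3 (a carnivore).
print(trophic_level(10.0, 3.0))  # 3.0
```

Because the baseline δ15N differs between terrestrial, freshwater and marine ecosystems, absolute values can only be compared within a food web, which is why the marine–terrestrial contrast above is expressed as separate value ranges.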
Limited evidence from the 18O:16O ratio
A final isotope that might provide information about an animal's diet and thermophysiological adaptations is the oxygen isotope ratio (18O:16O). More energy is needed to vaporise H218O than H216O. When ocean water evaporates and during evapotranspiration, i.e. the sum of evaporation and plant transpiration from the earth's land surface to the atmosphere, more of the lighter isotope evaporates as H216O. The ensuing 18O enrichment of transpiring leaves results in 18O enrichment in typical browsers such as kudu and giraffe, which rely less on free drinking water and derive most of their water from the consumption of 18O-enriched plant water. As the 16O-enriched water vapour in clouds moves inland, some of it condenses as rain, during which more of the heavier isotope (as H218O) rains out, making the δ18O of coastal rain only slightly less enriched than the original evaporated ocean water, while the δ18O of the remaining water vapour, and of the rain that eventually falls from it, is highly negative (i.e. more 18O depleted). Consequently, river water from rain and melting ice is more δ18O negative than seawater. Roots derive their water from meteoric or underground water that is thus relatively depleted in 18O, as are the animals consuming these roots(Reference Sponheimer and Lee-Thorp248, Reference Sponheimer and Lee-Thorp249). Browsers of leaves undergoing evapotranspiration and consumers of roots may thus be expected to have high and low δ18O values, respectively.
Australopiths showed lower δ18O values compared with Paranthropus, but the meaning of this difference remains uncertain. However, one might argue that Australopithecus preferred less arid conditions compared with Paranthropus or was more dependent on seasonal drinking water(Reference Sponheimer, Lee-Thorp and de Ruiter231). Low δ18O was additionally found in primates and suids, which might be linked to frugivory, although this is not supported by the higher 18O values found in Ardipithecus ramidus compared with australopithecines (Reference White, Asfaw and Beyene68, Reference Lee-Thorp, Sponheimer and Passey229). Taken together, the use of δ18O for exploration of ancient human diets is still in its infancy, but might, especially in combination with other isotope ratios, become more appreciated in the future.
Isotopic data for more recent hominins
It would be of high interest to explore the hominin diet during the last spurt of encephalisation between 1·9 Mya and 100 Kya, when brain size tripled to volumes between 1200 and 1490 cm3 for Homo erectus, H. heidelbergensis, H. neanderthalensis and modern H. sapiens (Reference Cunnane, Cunnane and Stewart100). Isotopic data for this period are, however, absent. Owing to the limited preservation of collagen beyond 200 000 years and the near absence of C4 plants in Europe, answers will have to come from further studies of tooth enamel in Africa and Asia. So far, there is no isotope evidence for the diet of Homo between 1·5 Mya and 50 Kya (MP Richards, personal communication).
Dietary information from more recent humans comes from data on δ13C, supplemented with data on δ15N and the 13C:15N ratio. The 15N-isotope values of bone collagen(Reference Schoeninger, Deniro and Tauber242) for differentiation between aquatic and agricultural diets were additionally verified by the study of the sulfur isotope ratios (34S:32S), since high intakes of marine organisms also result in higher δ34S values(Reference Hu, Shang and Tong250). Combined isotope studies reveal high intakes of animal protein, with substantial proportions derived from freshwater fish by Upper and Middle Palaeolithic (40–12 Kya) humans in Eurasia, indicating that in some populations about 30 % of dietary protein came from marine sources(Reference Hu, Shang and Tong250–Reference Richards and Trinkaus256). In contrast, isotopic evidence indicates that Neanderthals were top-level carnivores that obtained most of their dietary protein from large terrestrial herbivores, although even Neanderthals certainly exploited shellfish such as clams, oysters, mussels and fish on occasion(Reference Hu, Shang and Tong250–Reference Richards, Pettitt and Stiner252). At the onset of the Neolithic period (5200 years ago), there was a rapid and complete change from aquatic- to terrestrial-derived proteins among both coastal and inland Britons compared with Mesolithic (9000–5200 years ago) British humans(Reference Richards, Schulting and Hedges254), which coincides precisely with the local onset of the Agricultural Revolution in Europe.
Conclusions from isotope studies
The isotope systems that have been studied thus far in hominin bone and teeth provide evidence that early hominins were opportunistic feeders(Reference van der Merwe, Thackeray and Lee-Thorp257). The spread of C4 foods in East Africa, and subsequently in the hominin food chain between 3 and 4 Mya, is in agreement with a niche of early hominins located close to the water. This conception also agrees with the palaeo-environmental evidence. However, many questions remain unanswered. With regard to the possible niche in the water–land interface, it seems worthwhile to include aquatic as well as terrestrial piscivorous animals in future studies. The combined data from studies of early and more recent hominins suggest a gradual increase in dietary animal protein, a part of which may derive from aquatic resources. In the more recent human ancestors, a substantial part of the dietary protein was irrefutably derived from marine resources, and this habit was abandoned in some cases only after the introduction of agriculture at the onset of the Neolithic(Reference Richards, Schulting and Hedges254).
The oldest stone tools found so far are dated to 2·6 Mya(Reference Semaw, Rogers and Quade258, Reference McPherron, Alemseged and Marean259) and it has been suggested that these were used for flesh removal and percussion on long bones for marrow access. From this time onward stone tools were apparently used for defleshing and butchering of large animals. However, again there is a pitfall in putting too much emphasis on the association between stone tools and the hunting and butchering of large animals as the sole food source of the human ancestors, especially with regard to brain foods such as LCP. As stated by Liem's paradox, the apparently overwhelming evidence for the consumption of bone marrow, or even brain from cracked skulls, provided by the findings of cut marks on animal bones may not reflect the primary food resources of human ancestors, but only their fallback foods. Bones, especially long bones, are also better preserved than vegetable material. Moreover, cut marks on bone are more easily ascribed to human utilisation than nearby fossilised fish bones or molluscan shells, which only seldom bear cut marks(Reference Willis, Eren and Rick260, Reference Braun, Harris and Levin261) and are often not even examined. Hence, while human remains are nearly always found in the vicinity of water and the fossil record of nearby fish is extensive(Reference Asfaw, White and Lovejoy74, Reference de Heinzelin, Clark and White262), the exploitation of aquatic resources is difficult to relate to early man(Reference Leonard, Robertson, Snodgrass and Roebroeks263).
The present review is about the diet that allowed early humans to increase their brain size and thereby become intelligent enough to develop, for example, symbolic thinking and the controlled use of fire. Hunting and/or scavenging is often invoked as an important source of LCP, but, as pointed out by Crawford(Reference Crawford, Cunnane and Stewart2), even in the more recent certainly ‘hunting’ ancestors ‘a [scavenged] small brain was not going to go far among the ladies [and children] even if it was still in an edible condition when they [the male hunters] got it back [from the savanna]’(Reference Crawford, Cunnane and Stewart2), not even in the scenario(Reference Simpson, Quade and Levin264) that we were specialised, as suggested(Reference Bramble and Lieberman265), in endurance running. Apart from organ tissue (liver and brain) and bone marrow (whether scavenged or hunted), fish, shellfish and other aquatic foods are also mentioned as rich sources of the nutrients involved in brain expansion(Reference Broadhurst, Cunnane and Crawford99, Reference Crawford, Bloom and Broadhurst130, Reference Broadhurst, Wang and Crawford266). Therefore, the question arises whether the archeological evidence for human habitation in the land–water ecosystem only represents facilitated fossilisation or indicates the true ecological niche. The following section will focus on comparable evidence for the concurrent exploitation of aquatic resources.
Living in the water–land ecosystem
Because sea levels have risen up to 150 m in the past 17 000 years, a substantial part of the evidence for the exploitation of aquatic resources is hidden below sea level, if not permanently destroyed by the water(Reference Erlandson267, Reference Marean, Bicho, Haws and Davis268). However, in Kenya, a site in East Turkana provides solid evidence that at about 1·95 Mya hominins exploited carcasses of both terrestrial and aquatic animals including turtles, crocodiles and fish, which were associated with Oldowan artifacts(Reference Braun, Harris and Levin261). More ambiguous evidence for the exploitation of freshwater fish, crocodiles, turtles, amphibians and molluscs by Homo habilis in the Olduvai Gorge in Tanzania goes back as far as 1·8–1·1 Mya(Reference Erlandson267, Reference Stewart269). Subsequent tentative evidence from the Olduvai Gorge dates the use of similar aquatic resources by Homo erectus to 1·1–0·8 Mya(Reference Erlandson267, Reference Stewart269). The out-of-Africa diaspora also probably took place largely via the coastlines(Reference Stringer81), even after the crossing of the Bering Strait into North America(Reference Wang, Lewis and Jakobsson270) (Fig. 2). In Koa Pah Nam, Thailand, 700 000-year-old piles of freshwater oyster shells were associated with Homo erectus (Reference Pope271, Reference Fagan272). In Holon, Israel, freshwater turtles, shells and hippopotamus bones were associated with Homo erectus and dated to 500–400 Kya(Reference Bar-Yosef273). Homo erectus fossils associated with seal remains in Mas del Caves (Lunel-Viel, France) were dated to 400 Kya(Reference Cleyet-Merle, Madelaine and Fischer274).
The archeological evidence for aquatic resource use increases with the appearance of archaic Homo sapiens (Reference Erlandson267). Although dominated by land mammal bones, 400 000–200 000-year-old remains of penguins and cormorants in Duinefontein, South Africa were associated with early Homo (Reference Klein, Avery and Cruz-Uribe275). Shellfish and possibly fish remains, dated 300–230 Kya, were associated with the French coastal campsite at Terra Amata(Reference de Lumley276, Reference Villa277), while marine shellfish and associated early human remains, dated 186–127 Kya, were found in Lazaret, France(Reference Cleyet-Merle, Madelaine and Fischer274). Marean et al. (Reference Marean, Bar-Matthews and Bernatchez278) found evidence for the inclusion of marine resources, at 164 Kya, in the diet of anatomically modern humans from the Pinnacle Point Caves (South Africa). At the Eritrea Red Sea coast, Middle Stone Age artifacts on a fossil reef support the view that early humans exploited near-shore marine food resources by at least 125 Kya(Reference Walter, Buffler and Bruggemann279). In several North African sites, dated to 40–150 Kya(Reference Erlandson267), human remains were associated with shell middens and aquatic resources such as aquatic snails, monk seals, mussels and crabs. Several European sites, dated to 30–125 Kya, reveal comparable evidence, ranging from thick layers of mussels and large heaps of marine shells in Gibraltar(Reference Erlandson267) to diverse marine shells in Italy(Reference Stiner280), and to a casual description of the presence of marine shells of unknown density in Gruta da Figueira in Portugal(Reference Erlandson267).
Further evidence for the use of shellfish, sea mammals and flightless birds comes from: Klasies River Mouth (South Africa) dated between 130 and 55 Kya(Reference Erlandson267, Reference von den Driesch281, Reference Erlandson, Cunnane and Stewart282); from Boegoeberg, where 130 000–40 000-year-old shell middens and cormorant bones were associated with Homo sapiens (Reference Erlandson267, Reference Klein, Cruz-Uribe and Halkett283); from Herolds Bay Cave, where 120 000–80 000-year-old shell middens, shellfish, mussels and otter remains were associated with human hearths(Reference Erlandson267); from Die Kelders (75–55 Kya), where abundant remains of sea mammals, birds and shellfish were found in cave deposits(Reference Erlandson267, Reference Marean, Goldberg and Avery284); and from Hoedjies Punt (70–60 Kya), Sea Harvest (70–60 Kya) and Blombos Cave (60–50 Kya) for the use of shellfish, sea mammals and fish(Reference Erlandson267, Reference Henshilwood and Sealy285). From this period onwards, human settlements are strongly associated with the exploitation of aquatic resources(Reference Erlandson267, Reference Marean, Bicho, Haws and Davis268, Reference Erlandson, Cunnane and Stewart282, Reference Klein, Avery and Cruz-Uribe286–Reference Henshilwood, d'Errico and Vanhaeren288). Evidence for more sophisticated fishing by use of barbed bone harpoon points dates back to 90–75 Kya in Katanda, Semlike River, Zaire(Reference Harris, Williamson, Morris and Boaz289, Reference Meylan and Boaz290) and to 70 Kya in South Africa(Reference Henshilwood, Sealy and Yates291). Finally, indications for seafaring are dated to 42–15 Kya(Reference Wickler and Spriggs292–Reference O'Connor, Ono and Clarkson294). 
Possibly, seafaring dates as far back as 800 Kya, as indicated by the finding of Homo erectus stone tools on the Indonesian island of Flores, which is located on the other side of a deep sea strait(Reference Brown, Sutikna and Morwood79, Reference Morwood, O'Sullivan and Aziz295–Reference Brumm, Jensen and van den Bergh299). In general, many archeological sites are found along channels, lake- and seashores(Reference Broadhurst, Cunnane and Crawford99, Reference Broadhurst, Wang and Crawford266, Reference Erlandson267, Reference Erlandson, Cunnane and Stewart282) and reveal aquatic fauna, such as catfish, crocodile and hippopotamus(Reference Sept300), but it proves difficult to relate their possible utilisation to our early ancestors.
The increase in brain size and intelligence has been attributed to several events within approximately the past 2 million years. The introduction of meat into the hominin diet, which resulted in a higher dietary quality, has been discussed above. Claims for controlled fire in the Olduvai Gorge (Tanzania) and Koobi Fora (Kenya) go as far back as 1·5 Mya(Reference Wrangham, Jones and Laden215, Reference Gibbons301, Reference Wobber, Hare and Wrangham302). Evidence for cooking is as old as 250 Kya(Reference Gibbons301), but possibly dates back to 800 Kya(Reference Goren-Inbar, Alperson and Kislev303), when indications of controlled fire were found to be present. However, it has recently been concluded that solid evidence for systematic use of fire is only found from 400 to 300 Kya onwards(Reference Roebroeks and Villa304). Evidence that cooking increased dietary quality was recently presented by Wrangham et al. (Reference Wrangham, Jones and Laden215). According to archeological evidence, this could only have played an important role since the appearance of Homo sapiens (Reference Wrangham, Jones and Laden215, Reference Gibbons301) and Neanderthals(Reference Henry, Brooks and Piperno305). The inclusion of aquatic resources has also been suggested as a contributor to human brain evolution(Reference Crawford, Cunnane and Stewart2, Reference Broadhurst, Cunnane and Crawford99–Reference Cunnane101, Reference Crawford, Bloom and Broadhurst130, Reference Broadhurst, Wang and Crawford266, Reference Crawford, Bloom and Cunnane306–Reference Parkington, Roggenpoel, Halkett and Conard308), but this remains a matter of debate(Reference Carlson and Kingston134, Reference Joordens, Kuipers and Muskiet135, Reference Marean, Bicho, Haws and Davis268, Reference Cordain, Eaton and Sebastian309).
From hunting–gathering to agriculture
The hunter–gatherer lifestyle continued worldwide for several million years and ended quite abruptly with the introduction of agriculture. The first indications of the abandonment of the hunter–gatherer lifestyle in favour of settlement come from a 23 000-year-old fisher–hunter–gatherer camp on the shore of the Sea of Galilee(Reference Nadel, Weiss and Simchoni310, Reference Weiss, Wetterstrom and Nadel311). The associated return from diets containing substantial amounts of protein (from hunting and gathering) to diets containing substantial amounts of carbohydrates is supported by indications of the ground collecting of wild cereals(Reference Kislev, Weiss, Hartmann and Hartmann312). This was slowly followed by the large-scale utilisation of cereals starting with the onset of the Agricultural Revolution some 10 Kya.
As indicated above (see ‘Biogeochemistry’), there is much controversy about the diet of the earliest humans and it is still often stated that fishing was only introduced more recently. From an anthropological perspective this might be true, since certain types (for example, deepwater) of fishing require advanced techniques(Reference O'Connor, Ono and Clarkson294). However, from a nutritional point of view, ‘fishing’ might include anything from collecting sessile shellfish to the seasonal hand capture or clubbing of migrating or spawning fish in very shallow water. Since fresh drinking water is the single most important aquatic resource for humans, hominins probably observed predators and scavengers feeding on aquatic animals. This makes it unlikely that they would not have participated in opportunistic harvesting of the shallow-water flora and fauna, such as molluscs, crabs, sea urchins, barnacles, shrimp, fish, fish roe or spawn, amphibians, reptiles, small mammals, birds or weeds(Reference Joordens, Wesselingh and de Vos175, Reference Erlandson267). There are many indications suggesting that the evolution of early Homo and its development to Homo sapiens did not take place in the ‘classical’ hot, arid and waterless savanna, but occurred in African ecosystems that were notably located in places where the land meets the water (with the land ecosystem possibly consisting, depending on rainfall, of wooded grasslands). Compared with terrestrial hunting and/or scavenging in the savanna, food from this land–water ecosystem is relatively easy to obtain and is rich in the aforementioned combination of haem-Fe, iodine, Se, vitamins A and D, and long-chain n-3-fatty acids(Reference Cunnane, Cunnane and Stewart100, Reference Cunnane101, Reference Broadhurst, Wang and Crawford266).
In conclusion, there is ample archeological evidence for a shift from the consumption of plant foods towards animal foods. Furthermore, although there is an extensive archeological record for aquatic fossils (representing possible food) in the vicinity of human remains, their co-occurrence is usually attributed to the preferential conservation of human remains in the vicinity of water. The present review provides support for the notion that the exploitation of these aquatic resources by hominins in coastal areas should be the default assumption, unless proven otherwise(Reference Joordens154). For a long period in hominin evolution, hominins derived large amounts of energy from (terrestrial and aquatic) animal fat and protein. This habit was reversed only at the onset of the Neolithic Revolution in the Middle East, starting about 10 Kya.
The hunter–gatherer diet
The Homo genus has been on earth for at least 2·4 million years(Reference Kimbel, Walter and Johanson313) and for over 99 % of this period has lived as hunter–gatherers(Reference Lee, Lee and DeVore314). Surprisingly, very little information is available on the macro- and micronutrient compositions of their diet in this extended and important period of human evolution(Reference Eaton, Eaton and Sinclair34, Reference Cordain, Miller and Eaton148). Since the onset of agriculture, about 10 Kya, agriculturalists and nomadic pastoralists have been expanding at the expense of hunter–gatherers(Reference Lee, Lee and DeVore314), with agricultural population densities exceeding the highest hunter–gatherer densities by a factor of 10–1000. For this reason, present-day hunter–gatherers are often found in marginal environments, unattractive for crop cultivation or animal husbandry.
In order to study the original hunter–gatherer way of life, it is appropriate to aim at the few hunter–gatherer communities living in the richer environments that bear closer resemblance to those in which the evolution of the genus Homo probably took place. Most studies on hunter–gatherers and their diets are, however, performed by anthropologists(Reference Murdock315), whose primary interests are different from those of nutritionists. Anthropologists would, for example, conclude that ‘fishing was so unimportant as to be a type of food collection’(Reference Stewart, Lee and DeVore316), or consider collecting both small land fauna and shellfish(Reference Lee, Lee and DeVore314) as part of ‘gathering’, whereas from a nutritional point of view considerable differences exist in energy density, macro- and micronutrient composition between plants, terrestrial and aquatic animal foods.
Hunting v. gathering
Studies on food procurement of present-day hunter–gatherer societies show, in terms of energy gain v. expenditure, the advantage of hunting compared with plant foraging(Reference Ulijaszek192). Nevertheless, three distinct studies(Reference Marlowe41, Reference Cordain, Miller and Eaton148, Reference Lee, Lee and DeVore314) showed that hunting makes up only about 35 % of the subsistence base for worldwide hunter–gatherers, independent of latitude or environment. However, collection of small land fauna and shellfish was included as gathering in these studies. While gathering evidently played an important role over the whole of human evolution, hunting, although introduced later, coincided with ‘a major leap for mankind’ and has ever since played the most dominating cultural role. While hunting may have overtaken gathering in cultural importance, gathering continued to play a very important nutritional role, because: (i) gathering still contributes about 65 % to the subsistence base; (ii) many micronutrients derive only from plant sources; (iii) gathering of, for example, shellfish provides a substantial amount of LCP and other nutrients essential for brain development; and (iv) gathering plays an important cultural role since women, children and grandparents can participate(Reference Hawkes, O'Connell and Jones56, Reference O'Connell, Hawkes and Blurton Jones57, Reference Meehan317).
Contrary to common belief, hunting in present-day hunter–gatherers is still not very successful: the probability for a kill in !Kung bushmen is only 23 %(Reference Lee, Lee and DeVore314) and the subsistence of Hadza, as described by Marlowe(Reference Marlowe41) and Woodburn(Reference Woodburn, Lee and DeVore145), is composed of 75–80 % of plant foods. Conversely, studies of North American hunting–gathering societies describe the dietary role of shellfish as similar to ‘bread and butter’, being the staple food(Reference Moss318) in these societies. The anthropological remark(Reference Lee, Lee and DeVore314) that for many studied hunter–gatherer tribes ‘fishing was only a type of food collection’ also adds to the notion that the collection of aquatic foods might have preceded scavenging and hunting. Collecting aquatic foods is still daily practice in Eastern Africa and picking up, clubbing or spearing stranded aquatic animals seems much easier and safer than either scavenging or hunting game on the Serengeti plains.
We conclude that gathering plays, and most likely always played, the major role in the food procurement of humans. Although hunting doubtlessly leaves the most prominent signature in the archaeological record, gathering of vegetables and the collection of animal, notably aquatic, resources (regardless of whether their collection is considered hunting or gathering) seems much easier than hunting on the hot and arid savanna. We suggest that it is fair to consider these types of foods an important part of the human diet, unless proven otherwise(Reference Joordens154). Conversely, while hunting might have played a much more important role at higher latitudes, the dietary resources in these ecosystems are rich in n-3-fatty acids (for example, fatty fish and large aquatic mammals), and the hominin invasion of these biomes occurred only after the development of more advanced hunting skills.
Nutrients and other environmental factors are increasingly recognised to influence epigenetic marks(Reference Godfrey, Lillycrop and Burdge319–Reference Godfrey, Sheppard and Gluckman323), either directly or indirectly via many bodily sensors. Food from the diverse East African aquatic ecosystems is rich in haem-Fe, iodine, Se, vitamins A and D, and n-3 fatty acids from both vegetable and fish origin(Reference Cunnane101). All of these nutrients seem to act at the crossroads of metabolism and inflammation(Reference Muskiet, Montmayeur and Coutre24). For example, PPAR(Reference Bensinger and Tontonoz324, Reference Castrillo and Tontonoz325) are lipid-driven nuclear receptors with key cellular functions in metabolism and inflammation(Reference Feige, Gelman and Michalik26). TR(Reference Song, Yao and Ying326), VDR(Reference Bouillon, Bischoff-Ferrari and Willett327), RXR and RAR(Reference McGrane328) are other examples of nuclear transcription factors that serve functions as ligand-driven sensors. The iodine- and Se-dependent hormone triiodothyronine (T3)(Reference Venturi, Donati and Venturi329–Reference Gilbert, McLanahan and Hedge331) is a ligand of TR(Reference Song, Yao and Ying326), many fatty acids and their derivatives are ligands of PPAR(Reference Desvergne and Wahli332), the vitamin D-derived 1,25-dihydroxyvitamin D hormone is a ligand of the VDR(Reference Norman and Bouillon333), 9-cis-retinoic acid and the fish oil fatty acid DHA are ligands of RXR(Reference McGrane328), while RAR interacts with vitamin A (retinol) and many of its derivatives such as all-trans-retinoic acid, retinal and retinyl acetate(Reference McGrane328). The ligand-bound nuclear transcription factors usually do not support transcription by themselves, but need to homodimerise or to heterodimerise, notably with RXR, to facilitate gene transcription. Examples of the latter are TR/RXR, PPAR/RXR, VDR/RXR and RAR/RXR.
It has become clear that their modes of action illustrate the need for balance between, for example, iodine, Se, fish oil fatty acids and vitamins A and D, a balance that is notably found in the land–water ecosystem.
Deficiencies of the above ‘brain-selective nutrients’ are among the most widely encountered in the current world population(Reference Cunnane101, Reference Holick and Chen334). While iodide is added to table salt in many countries, margarines and milk have become popular food products for fortification with vitamins A and D. After discussing some general health differences between traditionally living individuals and those living in Westernised countries, we focus on the importance of LCP and notably those of the n-3-series, as examples of the above-mentioned nutrients that are especially abundant in the land–water ecosystem.
Hunter–gatherer v. ‘Western’ physiology
There are many differences in health indicators between traditionally living individuals and those living in Western societies. For instance, primary and secondary intervention trials with statins indicate the lowest CHD risk at an LDL-cholesterol of 500–700 mg/l (1·3–1·8 mmol/l), which is consistent with levels encountered in primates in the wild and in hunter–gatherer populations with few deaths from CVD(Reference Mann, Roels and Price335–Reference O'Keefe, Cordain and Harris339). Another example of the healthy lifestyle of present-day hunter–gatherers comes from the observed ‘insulinopenia’ or ‘impaired insulin secretion’ following an oral glucose tolerance test (Fig. 6) in Central African Pygmies and Kalahari Bushmen(Reference Joffe, Jackson and Thomas340, Reference Merimee, Rimoin and Cavalli-Sforza341), respectively. As opposed to the ‘impairments’ noted by these authors, it may also be argued that the researchers were actually witnessing an insulin sensitivity that has become rare in Western countries as a consequence of decreased physical activity and fitness, increased fat mass, and the quantity and quality of the foods consumed(Reference O'Keefe and Cordain36, Reference Cordain, Eaton and Sebastian133, Reference Ramsden, Faurot and Carrera-Bastos342). The current consensus is that ‘fat is bad’, and saturated fats especially have become associated with CVD(Reference Keys343–Reference Kuipers, de Graaf and Luxwolda345). However, traditional Maasai consumed diets high in protein and fat (milk and meat) and low in carbohydrates(Reference Ho, Biss and Mikkelson346, Reference Biss, Ho and Mikkelson347). Despite high intakes of saturated fat and cholesterol, they showed extensive atherosclerosis with lipid infiltration and fibrous changes but very few complicated lesions, and were virtually devoid of CVD(Reference Mann, Spoerry and Gray348).
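The paired mg/l and mmol/l LDL-cholesterol figures quoted above are linked by the molar mass of cholesterol. A minimal sketch of the conversion, assuming a molar mass of about 386·65 g/mol (the function name is ours):

```python
# Converting LDL-cholesterol from mg/l to mmol/l.
# Assumption: molar mass of cholesterol ~386.65 g/mol.
CHOLESTEROL_G_PER_MOL = 386.65

def ldl_mg_per_l_to_mmol_per_l(mg_per_l: float) -> float:
    """(mg/l) divided by (g/mol) yields mmol/l directly, since mg/g = mmol/mol."""
    return mg_per_l / CHOLESTEROL_G_PER_MOL

low = ldl_mg_per_l_to_mmol_per_l(500)
high = ldl_mg_per_l_to_mmol_per_l(700)
print(f"{low:.1f}-{high:.1f} mmol/l")  # 500-700 mg/l corresponds to ~1.3-1.8 mmol/l
```

The same division works for any analyte reported per litre, provided its molar mass is known.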
The average total and LDL-cholesterol in these societies was low and did not increase with age(Reference Mann, Spoerry and Gray348). Finally, the physical fitness of individuals in such traditional societies, such as the Maasai, is often remarkable(Reference Mann, Shaffer and Rich337).
In Kalahari Desert Bushmen and Central African Pygmies, observers could not find a single case of high blood pressure, and blood pressure did not increase with age(Reference Mann, Roels and Price335, Reference Kaminer and Lutz349). Dental surveys of Kalahari Bushmen(Reference Clement, Fosdick and Plotkin350) and other hunter–gatherers(Reference Larsen49) showed a remarkable absence of caries, explained by the recurring annual absence of fermentable sugars from their diet and the consequent inability to build a cariogenic oral Lactobacillus flora(Reference Clement, Fosdick and Plotkin350). Inhabitants of Kitava (Trobriand Islands, Papua New Guinea) have high intakes (70 en%) of carbohydrates from yams, high intakes of SFA from coconuts and a high fish intake(Reference Lindeberg and Lundh44, Reference Lindeberg, Nilsson-Ehle and Terent45, Reference Lindeberg, Berntorp and Nilsson-Ehle351). Although high intakes of both carbohydrates and saturated fat have been related to the metabolic syndrome and CVD, these traditional Kitavians do not show symptoms of the metabolic syndrome and are virtually free from the Western diseases that ensue from it.
Evidence-based medicine as applied to long-chain PUFA in CVD and depression
Despite some compelling examples of the healthy lifestyles of traditional populations, current dietary recommendations derive preferentially from randomised clinical trials with single nutrients and, preferably, hard endpoints(Reference Sackett, Rosenberg and Gray352). This approach clearly oversimplifies the effects of dietary nutrients(Reference Blumberg, Heaney and Huncharek353), since neither macronutrients nor micronutrients are consumed in isolation: their effects may be the result of a complex web of interactions between all the nutrients present in the biological systems that we consume, such as a banana or a fish.
The current recommendations from many nutritional boards for a daily intake of 450 mg EPA + DHA in adults derive from epidemiological data demonstrating a negative association of fish consumption with CHD(Reference Dyerberg, Bang and Hjorne354–Reference Mozaffarian and Rimm357), subsequently supported by landmark trials with ALA(Reference de Lorgeril, Salen and Martin358) and fish oil(359–Reference Yokoyama, Origasa and Matsuzaki361) in CVD. However, not all trials in CVD have been positive(Reference Kromhout, Giltay and Geleijnse362). In addition, negative associations were observed between fish consumption and depression(Reference Hibbeln363–Reference Hibbeln365) and between fish consumption and homicide mortality(Reference Hibbeln366). The causality of these relationships was supported by some, but not all, trials with fish oil in depression(Reference Lin and Su367–Reference Freeman, Hibbeln and Wisner371), while a recent meta-analysis demonstrated a beneficial effect of EPA supplements containing ≥ 60 % EPA of total EPA + DHA, in a dose range of 200–2200 mg/d of EPA in excess of DHA(Reference Sublette, Ellis and Geant372).
The influence of polymorphisms in the genome is increasingly recognised, but seldom interpreted in an evolutionary context. As argued above, most polymorphisms were already amongst us when Homo sapiens emerged, some 200 Kya, and the same holds true for most, if not all, currently identified ‘disease susceptibility genes’, which are usually abundant but confer low risk(373). A loss-of-function mutation in a specific biosynthetic pathway might be an evolutionary advantage if the specific end product has been a consistent part of the diet, as is probably applicable to all vitamins, for example, vitamin C(Reference Challem374, Reference Nishikimi, Fukuyama and Minoshima375). Applied to our LCP status, it is nowadays well established that all humans synthesise DHA with difficulty(Reference Burdge and Wootton376, Reference Burdge377). Analogously, the recently discovered polymorphisms of fatty acid desaturase 1 (FADS1; also named Δ-5 desaturase) and FADS2 (Δ-6 desaturase), with lower activities in the conversion of the parent essential fatty acids to LCP, suggest that from at least the time of their appearance the dietary intakes of AA, EPA and DHA have been of sufficient magnitude to balance the LCP n-3:LCP n-6 ratio(Reference Kuipers, Luxwolda and Janneke Dijck-Brouwer378, Reference Luxwolda, Kuipers and Smit379) and maintain good health.
Long-chain PUFA benefits in pregnancy and early life
Another indication for the importance of LCP comes from the higher LCP contents in the fetal circulation compared with the maternal circulation, a process named biomagnification(Reference Crawford, Hassam and Williams380–Reference Crawford, Hassam and Rivers382), which occurs at the expense of the maternal LCP status(Reference Hornstra383, Reference Hornstra384). The decreasing maternal n-3 LCP status during pregnancy in Western countries is associated with postpartum depression(Reference Hibbeln363, Reference Hibbeln364), although intervention studies with LCP in postpartum depression have been negative so far(Reference Freeman370, Reference Freeman, Hibbeln and Wisner371, Reference Makrides, Gibson and McPhee385, Reference Doornbos, van Goor and Dijck-Brouwer386). However, a positive effect was seen for n-3 LCP supplementation on depression during pregnancy(Reference Su, Huang and Chiu387) and it has been advocated to start supplementation earlier in pregnancy and with higher dosages(Reference Wojcicki and Heyman388).
Maternal LCP intakes have also been related to infant health. AA and DHA in premature and low-birth-weight infants correlated positively with anthropometrics: AA with increased birth weight(Reference Koletzko and Braun389) and DHA with prolonged gestation(Reference Szajewska, Horvath and Koletzko390–Reference Olsen, Osterdal and Salvig392). Studies with supplementation of DHA during pregnancy yielded, for example, evidence for: (i) the maturation of the brain, visual system and retina of the newborn at 2·5 and 4 months, but not at 6 months(Reference Malcolm, McCulloch and Montgomery393–Reference Smithers, Gibson and McPhee397); (ii) increased problem solving at 9 months, but no difference in memory(Reference Judge, Harel and Lammi-Keefe398); and (iii) superior eye–hand coordination at 2·5 years(Reference Dunstan, Simmer and Dixon399) and a higher intelligence quotient at 4 years(Reference Helland, Smith and Saarem400), but not at 7 years of age(Reference Helland, Smith and Blomen401). In contrast to the inconclusive human studies, animal studies and combined human–animal studies showed abnormal behaviour together with disturbed cognition at lower brain DHA levels(Reference McCann and Ames402). The importance of dietary AA during pregnancy seems less pronounced, but a positive association between umbilical AA and neonatal neurological development(Reference Koletzko and Braun389), and a lower venous AA in those with slightly abnormal neurological development(Reference Dijck-Brouwer, Hadders-Algra and Bouwstra403), have been shown. A reduced DHA status in the brain is associated with a mildly increased AA status(Reference Moriguchi, Loewke and Garrison404), which in its turn is associated with low-grade inflammation(Reference Rao, Ertley and DeMar405).
Infant health starts with maternal health; thus, dietary recommendations issued for pregnant women indirectly also apply to their infants. The recommendation for adults to consume 450 mg DHA + EPA per d translates into a DHA composition in breast milk of about 0·79 %(Reference van Goor, Smit and Schaafsma406). However, current recommendations for the composition of infant formulae derive mainly from the range of human milk fatty acid compositions observed in Western countries, which in turn derive from women with recorded intakes below the recommended daily intake of 450 mg EPA + DHA(Reference Brenna, Varamini and Jensen407, Reference Koletzko, Lien and Agostoni408).
The same paradox holds for other fatty acids in breast milk. For instance, there are few recommendations for the medium-chain SFA (MCSFA) content of human milk. High MCSFA contents in some traditional societies derive from their high intakes of 12 : 0 and 14 : 0 from coconuts(Reference Kuipers, Smit and van der Meulen409). Conversely, the high MCSFA contents in Western populations are primarily influenced by maternal carbohydrate intakes(Reference Hachey, Silber and Wong410), since the mammary gland has the unique ability to convert glucose into MCSFA (6 : 0–14 : 0), mainly lauric (12 : 0) and myristic (14 : 0) acids. However, women with regular consumption of coconuts have a much higher 12 : 0:14 : 0 ratio compared with women with high carbohydrate intakes. Both MCSFA are readily absorbed in the gastrointestinal tract, while antiviral as well as antibacterial properties have been attributed to some MCSFA, but mainly to 12 : 0(Reference Kabara411, Reference Bergsson, Steingrimsson and Thormar412).
The PUFA content of Western human milk has increased over the last decades(Reference Widdowson, Dauncey and Gairdner413, Reference Ailhaud, Massiera and Weill414). While the human milk LA content in the USA increased by at least 250 %, its DHA content decreased by almost 50 %(Reference Ailhaud, Massiera and Weill414). This, from an evolutionary point of view, abnormally high LA intake is, despite a lack of evidence(Reference Ramsden, Hibbeln and Majchrzak415), advocated for cardiovascular health(Reference Harris, Mozaffarian and Rimm416). The resulting high LA status is likely to interfere with the incorporation of AA and DHA into phospholipids and to inhibit their synthesis from their parent essential fatty acids(Reference Gibson, Muhlhausler and Makrides417). Major differences are noted when the human milk fatty acid compositions of Western mothers are compared with those of some traditional African women(Reference Kuipers, Smit and van der Meulen409, Reference Koletzko, Thiel and Abiodun418), with unknown consequences for infant health or the occurrence of disease at adult age (i.e. the ‘Barker hypothesis’)(Reference Barker16, Reference Godfrey, Robinson and Barker419). It has been proposed that the high concentrations of EPA, DHA and AA in human milk, as described for many fish-consuming societies(Reference Kuipers, Smit and van der Meulen409, Reference Innis and Kuhnlein420, Reference Ruan, Liu and Man421), might be a more appropriate reflection of the Palaeolithic breast-milk composition and may therefore constitute a better reference for infant formulae than Western human milks(Reference Muskiet, Kuipers and Smit422).
The influence of environment
It is estimated that 70 % of all cases of stroke and colon cancer, 80 % of all CVD and 90 % of all cases of type 2 diabetes mellitus are caused by lifestyle and could be prevented by paying more attention to modifiable behavioural factors, including specific aspects of diet, overweight, inactivity and smoking(Reference Willett423). The mismatch between the human diet and the Palaeolithic genome might therefore be responsible for many typically Western diseases. In addition to the evidence from many other disciplines, evidence from (patho)physiology and epidemiology adds to the notion that a great deal of information on healthy diets might derive from the study of the diets of our early ancestors. The metabolic syndrome, characterised by impaired insulin sensitivity, is at the centre of many diseases of civilisation. High intakes of refined carbohydrates as well as low intakes of LCP have been implicated in the development of insulin resistance. As such, low carbohydrate intakes(Reference Yudkin424, Reference Yudkin425) and high LCP intakes(Reference Eaton and Konner8, Reference Eaton, Eaton and Sinclair34, Reference Eaton426, Reference Cordain, Watkins and Mann427) by early human ancestors might explain in part the low incidence of diseases of civilisation in current hunter–gatherer societies. The available evidence from pathophysiology and epidemiology supports the hypothesis that the land–water ecosystem contributed important and indispensable nutrients to evolving hominins.
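Attributable percentages such as those quoted above are usually derived with the population attributable fraction (PAF). A minimal sketch of that calculation; the prevalence and relative-risk values below are purely illustrative and not taken from the cited study:

```python
# Population attributable fraction: the share of cases that would not occur
# if an exposure (e.g. a lifestyle factor) were removed from the population.
# PAF = p*(RR - 1) / (1 + p*(RR - 1)),
# where p is the prevalence of exposure and RR the relative risk in the exposed.
def paf(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: 60 % of a population exposed at a relative risk of 5
print(f"{paf(0.60, 5.0):.0%}")  # ~71 % of cases attributable to the exposure
```

With several interacting exposures, as in the lifestyle factors listed above, individual PAFs can sum to more than 100 %, which is why combined estimates require joint models rather than simple addition.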
Dietary reconstruction of the nutrients available in Eastern Africa
The debate on the ecological niche of human ancestors is unlikely to reach a consensus soon. The millions of years of human evolution coincided with marked and abrupt climatic changes, which renders a single ecological niche of human ancestry unlikely. At the same time, however, it is clear that in a short period humans have made tremendous changes to their lifestyle, their diet included, that lie at the basis of the diseases of Western civilisation. This prompted various investigators to reconstruct the possible compositions of the diets that our Palaeolithic ancestors could have consumed. Their studies are based, for example, on the plausible assumptions that before the Agricultural Revolution, when humans lived as hunter–gatherers, cereals were not an appreciable part of the diet, and that wild animals living in the Eastern African savanna and in Eastern African aquatic ecosystems have different fatty acid compositions from the domesticated animals that have now become staple foods. For example, the lean savanna animals that inhabit the Eastern African plains have much lower fat contents, and the available fat is much more enriched in PUFA(Reference Crawford428). Similarly, high-latitude (fatty) fish have much higher EPA and DHA contents, but lower AA contents, compared with low-latitude (lean) fish from tropical waters(Reference O'Dea and Sinclair429–Reference Naughton, O'Dea and Sinclair432).
Eaton & Konner(Reference Eaton and Konner8) were the first to use this approach in reconstructing a Palaeolithic diet; their pioneering study was published in the New England Journal of Medicine in 1985. The authors estimated that late Palaeolithic humans consumed diets containing 35 % meat and 65 % vegetable foods, providing 34 en% from protein, 45 en% from carbohydrate and 21 en% from fat, while the ratio between polyunsaturated and saturated fat equalled 1·41 and fibre intake amounted to 46 g/d(Reference Eaton and Konner8). These outcomes contrasted with the average American diet at the time, which consisted of 12 en% protein, 46 en% carbohydrate and 42 en% fat, with a polyunsaturated:saturated fat ratio of 0·44 and a fibre intake of 20 g/d. After 25 years of additional study, Konner & Eaton confirmed their previous findings by estimating that the Palaeolithic diet provided 25–30 en% protein, 35–40 en% carbohydrate and 20–35 en% fat(Reference Konner and Eaton433), with a polyunsaturated:saturated fat ratio of 1·40(Reference Eaton, Eaton and Konner434). Moreover, they concluded that ‘it has become clear since our initial publications that marine, lacustrine, and riverine species were important sources of animal flesh during the evolution of modern Homo sapiens, and may have played a role in the evolution of brain ontogeny’(Reference Konner and Eaton433). In addition to the earlier studies, they also estimated the vitamin and mineral composition of a Palaeolithic diet, showing higher contents of folate, riboflavin, thiamin, vitamins A and E, Ca, Mg, P, Zn, and notably ascorbate, vitamin D (sunlight), Cu, Fe, Mn and K, while the Palaeolithic diet contained much less Na compared with contemporary US intakes and recommendations(Reference Konner and Eaton433–Reference Eaton and Eaton435).
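The en% figures above can be reproduced from gram intakes with the classical Atwater factors. A sketch, assuming the usual rounded factors of 4, 4 and 9 kcal/g for protein, carbohydrate and fat; the gram intakes below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Energy-percent (en%) of each macronutrient from daily gram intakes,
# using the rounded general Atwater factors (kcal/g).
ATWATER_KCAL_PER_G = {"protein": 4.0, "carbohydrate": 4.0, "fat": 9.0}

def energy_percent(grams: dict) -> dict:
    """Convert gram intakes to each macronutrient's share of total energy."""
    kcal = {m: g * ATWATER_KCAL_PER_G[m] for m, g in grams.items()}
    total = sum(kcal.values())
    return {m: round(100.0 * e / total, 1) for m, e in kcal.items()}

# Hypothetical intakes (g/d), not a reconstruction of any cited diet:
print(energy_percent({"protein": 100, "carbohydrate": 150, "fat": 80}))
# {'protein': 23.3, 'carbohydrate': 34.9, 'fat': 41.9}
```

Because fat carries more than twice the energy per gram, a diet can be low in fat by weight yet still derive a third or more of its energy from fat, which is why dietary comparisons in this literature are made in en% rather than grams.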
In a subsequent study they estimated that in different ancient hunting and gathering populations, fatty acid intakes would have ranged from 5·19 to 20·6 g LA/d, 0·26 to 4·8 g AA/d, 3·45 to 25·2 g ALA/d and 0·03 to 1·52 g DHA/d, which contrasted with the much higher LA (22·5 g/d) and lower ALA (1·2 g/d), AA (0·6 g/d) and DHA (0·08 g/d) intakes as observed in current Western populations(Reference Eaton, Eaton and Sinclair34).
In a meticulous analysis of worldwide hunter–gatherer diets, Cordain et al. (Reference Cordain, Miller and Eaton148, Reference Cordain, Miller and Eaton436) estimated that the most plausible percentages of total energy from dietary macronutrients would be 19–35 en% from protein, 22–40 en% from carbohydrate and 28–58 en% from fat, which reflects a markedly higher contribution of dietary fat, a similar amount of protein, but a lower contribution of carbohydrates, compared with the earlier estimates of Eaton & Konner(Reference Eaton and Konner8, Reference Eaton, Eaton and Konner434). The main differences were explained by the assumption that, wherever ecologically possible, hunter–gatherers would have consumed 45–65 % of total energy from animal foods(Reference Cordain, Miller and Eaton148), while in the earlier estimations(Reference Eaton and Konner8, Reference Eaton, Eaton and Konner434) only 35 % derived from animal foods. These higher animal food intakes were explained by the inclusion of both hunting and fishing hunter–gatherer societies worldwide in the new calculation models(Reference Cordain, Miller and Eaton148), including mounted and arctic hunters. The latter, however, seem insignificant with regard to early human evolution, which suggests that these models overestimate the proportion of the diet derived from animal foods. For example, Marlowe(Reference Marlowe41) estimated that in a warm-climate sample about 53 % of the diet derives from gathering, 26 % from hunting and 21 % from fishing (i.e. about 47 % from hunting and fishing combined).
To investigate the nutrient compositions of such diets further, fish consumption was incorporated into the earlier models as a variable separate from plant and meat consumption, since aquatic and terrestrial animals have markedly different fatty acid compositions. In this most recent analysis(Reference Kuipers, Luxwolda and Dijck-Brouwer437), 12 500 kJ (3000 kcal) Palaeolithic diets were investigated with plant:animal food intake ratios ranging from 70:30 to 30:70 en%/en%, under four different foraging strategies in which the animal part ranged from exclusive meat consumption, including the selective consumption of energy- and LCP-rich fat from bone marrow and brain, respectively(Reference Cordain, Watkins and Mann427), to the consumption of an entirely aquatic diet in an Eastern African water–land ecosystem(Reference Morgan438, Reference Horrobin439). It was found that the energy intakes from the macronutrients were: 25–29 en% (range 8–35) from protein, 39–40 en% (range 19–48) from carbohydrate and 30–39 en% (range 20–72) from fat. Dietary LA ranged from 1·7 to 6·2 en%/d, AA from 1·15 to 10·7 g/d, ALA from 2·1 to 5·8 en%/d and EPA + DHA intakes from 0·87 to 28·3 g/d(Reference Kuipers, Luxwolda and Dijck-Brouwer437). From these data, despite their wide range of outcomes, it can again be concluded that the current Western diet differs substantially in average composition, notably in its higher proportions of carbohydrates and LA, and its much lower protein, ALA and LCP contents. It also became conceivable that ancestors living in the East African water–land ecosystem had daily intakes of gram amounts of EPA + DHA.
As such, these n-3 LCP intakes were comparable with those of the traditionally living Eskimos in Greenland, who because of their low CVD risk(Reference Dyerberg, Bang and Hjorne354, Reference Bang, Dyerberg and Sinclair355) initiated the current interest in the role of n-3 LCP in both primary and secondary prevention of CVD. In addition to these n-3 fatty acids, the water–land ecosystem is also a rich source of haem-Fe, iodine, Se and the vitamins A and D(Reference Cunnane101), which have important functions and interactions in gene transcription and metabolism(Reference Muskiet, Montmayeur and Coutre24, Reference Feige, Gelman and Michalik26, Reference Hotamisligil and Erbay440).
Dietary changes since the Agricultural Revolution
Whatever the specific composition and wide range of early hunter–gatherer diets, the current consensus is that our diet has changed markedly from the time of large-scale utilisation of cereals and animal domestication (i.e. the Agricultural Revolution) starting some 10 Kya. Contrary to earlier belief, the advent of agriculture coincided with an overall decline in nutrition and general health, but at the same time provided an evolutionary advantage since it increased birth rates and thereby promoted net population growth(Reference Larsen49, Reference Larsen50).
While the decline in nutritional quality and general health started with the onset of the Agricultural Revolution, these processes became even more pronounced with the advent of the Industrial Revolution some 100–200 years ago(Reference Eaton, Konner and Shostak9, Reference Eaton and Cordain11, Reference Cordain, Eaton and Sebastian133). Among the many dietary and lifestyle changes (Fig. 7) are: a grossly decreased n-3:n-6 fatty acid ratio; the combined high intakes of SFA and carbohydrates(Reference Forsythe, Phinney and Fernandez441–Reference Forsythe, Phinney and Feinman443); the introduction of industrially produced trans-fatty acids; reduced intakes of n-3 and n-6 LCP; reduced exposure to sunlight; low intakes of vitamins D and K; an imbalanced antioxidant status; and high intakes of carbohydrates with high glycaemic indices and loads, such as sucrose and industrially produced high-fructose maize syrup(Reference O'Keefe and Cordain36, Reference Cordain, Eaton and Sebastian133, Reference Simopoulos444, Reference Simopoulos445). Many of these changes act in concert, which points to the serious limitations of conclusions from contemporary investigations that study the many nutrients in isolation and yet form the basis of modern nutritional guidelines. An example is the interaction of dietary carbohydrates with SFA(Reference Forsythe, Phinney and Fernandez441–Reference Forsythe, Phinney and Feinman443, Reference Feinman and Volek446).
Potential benefits of a Palaeolithic diet
Evidence for the beneficial effects of Palaeolithic diets may derive from their influence on weight reduction and classical coronary artery disease risk factors. In an uncontrolled study with healthy adults, Osterdahl et al. (Reference Osterdahl, Kocturk and Koochek447) showed a decrease in weight, BMI and waist circumference after 3 weeks ad libitum consumption of a Palaeolithic-like diet (i.e. 6627 kJ/d (1584 kcal/d); carbohydrate 40, protein 24, fat 36 en%), compared with their baseline usual diet (10 368 kJ/d (2478 kcal/d); carbohydrate 54, protein 14, fat 30 en%). Additionally, they showed favourable effects on systolic blood pressure and plasminogen activator inhibitor-1. Jönsson et al. (Reference Jönsson, Granfeldt and Ahrén448) performed a cross-over study of 2 × 3 months in type 2 diabetic patients receiving a Palaeolithic diet (6615 kJ/d (1581 kcal/d), carbohydrate 32, protein 24, fat 39 en%) or a diabetes diet (7858 kJ/d (1878 kcal/d), carbohydrate 42, protein 20, fat 34 en%). They showed a reduction of body weight, BMI and waist circumference and lower HbA1c, TAG and diastolic blood pressure, and higher HDL-cholesterol after consumption of the Palaeolithic diet.
In a randomised trial in patients with IHD plus glucose intolerance or type 2 diabetes, Lindeberg et al. (Reference Lindeberg, Jonsson and Granfeldt449) showed a reduced energy intake after ad libitum consumption of a Palaeolithic diet (5623 kJ/d (1344 kcal/d); carbohydrate 40, protein 28, fat 27 en%) as compared with an ad libitum Mediterranean-like Consensus diet (7510 kJ/d (1795 kcal/d); carbohydrate 52, protein 21, fat 25 en%). They also observed a larger improvement in glucose tolerance in the Palaeolithic diet group, independent of decreased waist circumference. The most convincing evidence so far derives from an uncontrolled trial(Reference Frassetto, Schloetter and Mietus-Synder450) showing that 10 d consumption of an isoenergetic Palaeolithic diet (11 301 kJ/d (2701 kcal/d); carbohydrate 38, protein 30, fat 32 en%) improved blood pressure, arterial distensibility, insulin sensitivity and total, HDL- and LDL-cholesterol in healthy sedentary human subjects, when compared with their baseline usual diet (9924 kJ/d (2372 kcal/d), carbohydrate 44, protein 18, fat 38 en%). Importantly, there were no changes in energy intakes, activity levels and body weight, which indicates that the improved coronary artery disease risk profile was unrelated to weight reduction or other well-known determinants.
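The paired kJ/kcal values reported for these intervention diets follow from the thermochemical conversion 1 kcal = 4·184 kJ. A one-line sketch (the helper name is ours):

```python
# kJ to kcal, using the thermochemical calorie (1 kcal = 4.184 kJ).
def kj_to_kcal(kj: float) -> float:
    return kj / 4.184

print(round(kj_to_kcal(6627)))   # ~1584 kcal/d (Palaeolithic-like diet, Osterdahl et al.)
print(round(kj_to_kcal(11301)))  # ~2701 kcal/d (Palaeolithic diet, Frassetto et al.)
```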
The optimal nutrient combination to support good health can be expected to reflect a certain balance. This balance is present in the foods that were consumed by Palaeolithic and possibly by pre-Palaeolithic ancestors, because it is this balance on which the human genome has evolved. This genome has been shaped by millions of years of evolution, during which it adapted to the conditions of existence, including the diet. There are ample indications from many disciplines that the human ancestors evolved in a water–land interface that provided food from both terrestrial and aquatic resources. For instance, the availability of both n-3 and n-6 LCP from the aquatic food chain was one of the many factors that provided early humans with the unique combination of brain-selective nutrients for brain growth(Reference Crawford, Cunnane and Stewart2). The recent deviation from this Palaeolithic diet and lifestyle in general might be at the basis of many, if not all, current diseases of civilisation. Detailed studies with respect to the health effects of the diets of these earlier ancestors are therefore warranted.
This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
R. S. K. wrote the initial manuscript. After finishing a first outline, all authors contributed to their specific fields of knowledge, i.e. R. S. K. and F. A. J. M. refined the sections ‘Environment, nutrients and their interaction with the genome’, ‘Evolutionary medicine’, ‘Arguments and counter arguments in evolutionary health promotion‘, ‘Human evolution’, ‘Dietary changes since the Agricultural Revolution’ and ‘Potential benefits of a Palaeolithic diet’ and the sub-sections ‘Comparative anatomy’, ‘Biogeochemistry’, ‘Anthropology’, ‘(Patho)physiology’ and ‘Dietary reconstruction of the nutrients available in Eastern Africa’; J. C. A. J. refined the section ‘The probability of hunting on the savanna’ and the sub-sections ‘Palaeo-environments’ and ‘Archeology’.
The authors thank Matt Sponheimer, Mike Richards and Peter Ungar for their willingness to answer their questions.
There are no conflicts of interest.