Looking for evidence in our evolutionary past – the concept of the Palaeolithic diet.

Many have looked to our evolutionary past to try to gain an insight into how diet may have moulded us into what we are today. Among the many views, three will be examined here. The first two have been used by red meat producers in an attempt to convince us that, because our ancestors may have eaten a high-meat diet during the time when hominin brains were rapidly expanding in size, our modern diets should contain similar proportions of red meat. The third theory is presented to show that there are major competing ideas as to how the extra energy to run a bigger brain may have been gained. The message from all this is that there is much speculation, and the claims made by the meat producers are without proof.

1. The expensive tissue hypothesis of Aiello and Wheeler.  [Aiello]

2. The Paleodiet concept of Cordain and associated workers.  [Cordain]

3. Cooking of plant-based foods as the source of the additional energy required for human development as proposed by Wrangham and others.  [Wrangham]

Introduction

In the expensive tissue hypothesis it is proposed that the relative size of the human brain is so great as to require a high quality diet of energy-rich foods to supply its needs. This, the authors propose, was supplied by a diet rich in animal meat and fat. Without this predominantly meat-based diet to supply the additional energy, human brain development would have reached an evolutionary bottleneck, according to this view. They also point out that the size of the human gut appears to have decreased as brain size increased, the additional energy requirements of the brain being counterbalanced by the reduced energy requirements of a smaller gut.

This argument has been pushed by the meat producers to attempt to show that red meat is necessary for brain development in modern humans. As will be shown, there are many problems with this concept, particularly with the spin that the meat producers put on it.

The protagonists of the Palaeolithic diet concept claim that we as modern humans would be healthier if we attempted to eat a diet most closely resembling that consumed by early Homo sapiens of the upper Palaeolithic, since that is what we evolved to eat over a period of 2 million years or so leading up to this time. According to this concept, there has been too little time since then, around 10,000 to 20,000 years, from an evolutionary point of view for us to have evolved to be better suited to the current western diet. Adoption of this diet, if we accept that we can really know exactly what it was, would mean removing from our current diets cereals, refined sugars, all dairy products, oils and non-lean meats, as these were not available to late Palaeolithic man.

There is indirect evidence that Palaeolithic man got 60 to 70% of his energy intake from game meat (which by its nature is very lean and has a fat profile very different from modern meats) and fish. Such a diet, high in lean animal protein and relatively low in carbohydrate, can be shown in modern humans to favourably change our blood lipids and reduce obesity. Whether it improves brain development is unproven. The first big question is: can the world afford such a diet? The short answer is no, since if it were not for cereals more than half the world’s population would starve. Moreover, can the world’s ecosystems tolerate the load placed on them by animal protein production? Certainly not.

As will be shown elsewhere, the world’s resources cannot meet this demand; already the total mass of beef livestock exceeds that of the human race (Ehrlich 04). Are there other ways, apart from the Palaeolithic diet approach, in which we can modify our current western diet, which is high in animal protein, saturated fats, highly refined carbohydrates and salt, to one with low animal protein, low dairy and less refined carbohydrates, and still be healthy without damaging the planet? Simply put, yes.

The final hypothesis attempts to show that the main evolutionary push for the development of big brains came from cooking plant-based food. Cooking markedly increases the amount of energy available from many plant-based foods such as tubers. Cooking may also have contributed to the easier consumption of meat by making it less tough. And finally, cooking may have been a major driver in the development of many human social characteristics. Unfortunately, the archaeological record in relation to cooking beyond 100,000 years ago is very limited and controversial. We just cannot know.

The expensive tissue hypothesis. (Aiello 95)

One of the major papers quoted in the recent Australian Meat and Livestock advertising campaign has been "The expensive-tissue hypothesis: the brain and digestive system in human and primate evolution" by L Aiello and P Wheeler in the journal Current Anthropology 1995;36:199-221. This paper has also been used in a similar way in the US.

The basic tenet of this paper, as used by the proponents of the concept that "meat is essential for large brains", is that a high quality diet allowed a smaller, less energy-consuming gut. This then removed the energy constraint on the development of a much larger brain by redistributing this energy to brain development. The authors note that there is an inverse correlation between the relative size of the brain and that of the gut in primates: the bigger the brain, the smaller the relative size of the gut. They then hypothesize that, because there is no evidence to indicate that humans have increased their metabolic rate to supply the extra energy, the only way the body could supply additional energy to run the bigger brain was to take it from some other organ. The only organ that could be reduced in this way, in their opinion, was the gut.

They imply that a smaller gut is linked to a higher quality diet, i.e. food that has a higher energy content compared to the low energy diet eaten by smaller-brained animals. By further implication this suggests that for early man to attain a high enough quality diet he had to eat red meat. Hence, according to the MLA advertisement, "One thing we know for certain, if our early ancestors hadn’t eaten red meat, our brain wouldn’t be the size it is today." They make this assertion in the knowledge that most people won't be able to look up the reference. I even wonder whether the people who wrote the advertisement actually read it themselves.

If they had read it, they would have found that the authors didn't actually say this. Remember that the title of this key reference contains the word "hypothesis", and a hypothesis is far from proof. In fact, in the authors’ reply to the various commentators on this work, they state, "we want to make it clear that our hypothesis does not require dietary change to be the "prime mover" or even one of the "prime movers" for encephalization." In plainer language, the evolutionary increase in brain size cannot be attributed directly to the adoption of a meat-eating diet.

Rebuttal of the expensive tissue hypothesis.

The following outline is based on comments made by other experts in anthropology, published alongside the original paper. The commentators were Este Armstrong, Annapolis, Maryland; Dean Falk, State University of New York, Albany, NY; Maciej Henneberg, University of the Witwatersrand, Parktown, South Africa; Ralph Holloway, Columbia University, NY; Linda Marchant, Miami University, Oxford, Ohio; Katherine Milton, University of California; Richard Wrangham, James Holland and Mark Leighton, Harvard University, Massachusetts. (Aiello and Wheeler 95)

Quality diets don’t necessarily have red meat: A higher quality diet doesn’t necessarily have to contain red meat. As has been shown in that article, capuchin and South American squirrel monkeys have high levels of encephalization (brain size compared to body size), far beyond other primates such as the chimpanzee. Yet the capuchin monkey's diet consists of fruit, insects and other plant material.

The link between eating red meat and brain expansion is not proven: The problem in attempting to make this linkage is that it is far from clear what the exact diet of our ancestors was, particularly of the early hominins, how it changed over the eons, and how closely this was correlated with brain size. The dietary data are better for early Homo sapiens, this being the basis of the so-called Palaeolithic diet (early modern man in the upper Palaeolithic period) and hence of the attempt to link meat eating with brain expansion. The problem with this is that the expansion of brain size had been proceeding for more than a million years before this time, when dietary composition is far from established and possibly did not contain a large proportion of animal meat.

Brain size is not closely tied to "intelligence" in primates: The relative brain size of primates does not correlate well with their apparent intelligence. Capuchin monkeys have a far higher relative brain size than the gorilla (an encephalization quotient, EQ, of 3.25 versus 1.9), yet the gorilla's intelligence, despite its relatively small brain, is much closer to that of humans (EQ 4.75) than is the capuchin monkey's.
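
For reference, the encephalization quotient compares an animal's actual brain mass with the brain mass expected for an average mammal of the same body size. A common formulation is Jerison's (the specific EQ figures quoted above may come from a different dataset, so treat this as illustrating the idea rather than the exact calculation behind those numbers):

```latex
\mathrm{EQ} = \frac{E_{\text{actual}}}{E_{\text{expected}}},
\qquad E_{\text{expected}} \approx 0.12\,P^{2/3}
```

where E is brain weight and P body weight, both in grams. An EQ of 1 means a brain exactly as large as expected for that body size.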

Other sources of energy to run the larger brain: The concept that the gut had to become smaller to reduce its energy consumption to allow for the higher energy consumption of the large human brain has been questioned. The gut uses energy to digest food: the lower the quality of the diet, the more energy has to be supplied to extract and absorb the nutrients. Smaller guts would therefore require higher quality diets to supply the necessary energy.

The data used to support this inverse relationship between brain and gut size were based on calculations using the basal metabolic rate, i.e. the energy we use while resting. However, most of the time we are active, and our metabolic rate rises accordingly; this is measured by what is referred to as the field metabolic rate. By putting the field metabolic rate into the calculation, a completely different set of explanations becomes possible.

The additional energy requirements of the large human brain could easily have come from sleeping a little longer or from being not quite so active, rather than necessarily coming from a smaller gut with a higher quality diet. For example, the difference in daily energy consumption between the large human brain and an average-sized mammalian brain for the same body size equates to walking at average speed for 45 minutes.
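
A back-of-envelope sketch of this comparison, using round figures assumed here for illustration rather than taken from Aiello and Wheeler:

```python
# Rough check of the "45 minutes of walking" comparison.
# All figures are illustrative assumptions, not data from the paper.
human_brain_kcal_per_day = 350     # human brain at rest, roughly 20% of a ~1700 kcal/day BMR
expected_brain_kcal_per_day = 150  # assumed cost of the brain "expected" for our body size
walking_kcal_per_min = 4.5         # assumed cost of walking at an average pace

extra_kcal = human_brain_kcal_per_day - expected_brain_kcal_per_day
minutes_of_walking = extra_kcal / walking_kcal_per_min
print(f"Extra brain cost: {extra_kcal} kcal/day, about {minutes_of_walking:.0f} min of walking")
# -> Extra brain cost: 200 kcal/day, about 44 min of walking
```

On these assumed figures the extra cost of the human brain is around 200 kcal per day, which is indeed of the order of three-quarters of an hour of walking; a small shift in activity or sleep could cover it.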

The hypothesis was based on a standardized theoretical human: Many of the commentators on this hypothesis have noted that there is considerable variability in modern humans across the world, some groups having relatively much larger guts associated with caecal/colonic fermentation of dietary fibre. For the hypothesis to hold true, these people should have smaller brains, something that has not been shown. Hence, if data from the range of values seen in people across the world today were used, the relationship would become much less clear.

There are no conclusive data to link the evolutionary changes in brain and gut size: Even if it could be shown irrefutably that the human brain became proportionally larger while at exactly the same time the gut became smaller during evolution, this would not establish a causal link in any way. The changes may be coincidental or linked to a third common factor. A further problem is that the changes in the proportions of the two organs, brain and gut, cannot be proved to be contemporaneous evolutionary events with the current evidence. If they are not, then the whole hypothesis of a causal link between diet/gut and brain size becomes invalid. The evidence is lacking.

Mechanisms for removing the heat generated by a large brain were possibly important evolutionary driving forces: A large brain produces a lot of heat, which if not removed would cause hyperthermia and death. Hence large brain evolution could not occur without changes to remove this heat. The mechanisms of thermal protection which evolved were a hairless skin with its associated sweat glands, along with specialized veins draining blood away from the brain cortex. These mechanisms would not be tied to diet quality.

There are other, perhaps more convincing, explanations of the evolutionary evidence: Primates have had relatively large brains ever since they evolved into a distinct lineage. It has been proposed that the ancestral lineage leading to the primates expanded into the arboreal habitat of the emerging tropical forest flowering trees, giving access to much higher quality foods such as insects, ripe fruits and flowers. This would have required an expansion of brain-based functions such as learning and memory for locating and remembering where the food was, and manual dexterity for harvesting it.

Hence the conclusion could be the exact reverse of what Aiello and Wheeler hypothesized: the expansion of the brain coming first to provide the higher quality diet, followed by the reduction of the gut once that diet was secured. Further, from an evolutionary point of view it would seem a dangerous strategy for an energetically expensive brain to develop before the quality of the food and energy supply was firmly assured; i.e. the gut could be reduced in size once the thinking power needed to assure a high quality diet was established, rather than the other way around.

The problem with interpreting the early remains: the archaeological record is biased in favour of meat consumption. Many palaeontologists favour meat as the major energy source of the early ancestors of man, as there is strong evidence in the palaeo-archaeological record that the Australopithecines, who preceded the Homo line in the Pliocene/Pleistocene, ate meat: there are flake stone tools and animal bones with cut marks from such tools, indicating some type of butchery. However, just how much this contributed to the diet is unknown. There is a major bias in this record, since artifacts such as wooden digging sticks and remains associated with plant-derived foods such as tubers are not preserved from these eras, so what the major sources of food were can only be guessed at, and what proportion of meat there was in the diet is not known (Nestle 99).

Stable isotope studies as an indicator of food source are often presented as clearly established when in fact they come from very limited data and are subject to considerable interpretation. There is reasonable indirect evidence from Neanderthal remains of 20 to 50 thousand years ago to indicate that early man consumed a high proportion of meat, as shown by stable isotope d13C and d15N studies of fossil remains. In the following 10-20 thousand years, there is evidence of a marked broadening of the diet to include aquatic sources and small game such as reptiles, birds and small mammals (Richards 01). The ratios of these isotopes in preserved collagen reflect the source of food, as the distribution of the ratios would be similar to those in the predominant food source.
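
For readers unfamiliar with the notation, the δ values express how far the heavy-to-light isotope ratio of a sample deviates from an agreed standard, in parts per thousand (‰):

```latex
\delta^{15}\mathrm{N}
= \left( \frac{R_{\text{sample}}}{R_{\text{standard}}} - 1 \right) \times 1000,
\qquad R = {}^{15}\mathrm{N} / {}^{14}\mathrm{N}
```

(δ13C is defined analogously with R = 13C/12C.) Because collagen δ15N rises by roughly 3-5‰ with each step up the food chain, high δ15N values in fossil collagen are read as indicating a meat-rich diet; it is this inferential step that is open to the interpretation problems discussed below.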

However, such studies have significant drawbacks. The very limited number of remains studied reflects the fact that only a small proportion of preserved remains are suitable for analysis, and almost all go back only as far as the late Palaeolithic, 30 thousand years ago, with a few reaching back 50 thousand years. The findings are by no means clear cut and require considerable interpretation by comparison with the remains of animals consumed in the same period.

In one case, the results were suggestive of a plant-based diet, but because the ratios matched animal fossils of the same era in that place, the assumption was made that these people ate meat from those animals (Richards 00). As no data are given on stable isotope ratios in plants of that era, my view is that no firm conclusion can be drawn, since these results cannot exclude the possibility that plant-based foods were the dominant food source.

The fossil evidence should not be over-interpreted: there are limited data and much speculation: It should be remembered that the actual number of early human skeletons, partial skeletons or bone fragments is very small, and they often come from quite restricted geographical areas. To make generalizations from such incomplete evidence can only be speculation, not established scientific fact. The sample sizes that would be acceptable for good statistical analysis today are huge by comparison.

There is no doubt that Homo brain size increased dramatically in the 2 million or so years leading up to Homo erectus, that dentition became smaller with less enamel, that gut size decreased, and that at the same time there is evidence that meat consumption became a bigger part of the human diet. However, to attribute the increase in brain size to a single evolutionary driving force, the adoption of meat eating, is scientifically unjustified because of the well known error of logic: assuming that because two events are linked in time, they are necessarily cause and effect.

Was meat the only source of non-plant protein in the diet of our pre-Homo sapiens ancestors? We cannot know for sure, but there is evidence from non-human primates that insects form a significant part of their diets, which could easily have been the case for our ancestors. Insects are a rich source of nutrients, even better than meat. The fossil record will not show any evidence of this. Likewise, the consumption of honey would leave no trace in the fossil record.

Meat consumption as the major source of food would have been a dangerous evolutionary strategy. Hunting, for our early ancestors a million or more years ago, would not have been easy. They lacked many of the weapons used by modern hunter-gatherers, such as bows and arrows and spears. The first evidence of spears is from around 500,000 years ago, and bows and arrows go back less than 10,000 years. While the fossil record leaves no doubt that prey was caught, exactly how much is a matter of conjecture. Catching prey is a high return activity, but it is also high risk, since prey may not always have been easy to catch even after a long chase. In lean times prey may not have been abundant, and these early peoples would have needed a major fall-back strategy if they were to survive.

That the fall-back position was a largely plant-based diet is almost without doubt. Today, in the savannah areas of Africa where early man spent much of his evolution, the density of tubers can be huge, up to 40 tonnes per square kilometre. For something to be a major evolutionary driving force over very long periods of time, it needs to be relatively constant, something that meat consumption probably wasn’t and that plant-based foods were. In the end, for an early hominin to evolve a large brain that gained its predominant energy supply from meat, and then to be placed in an environment that had relatively little meat, would have been disastrous from an evolutionary point of view. Loss of the meat supply could easily have occurred with secular climate change, and this lessens the chance that meat was the major driving force.

So was meat consumption the major driving force in the evolution of modern humans' large brain? As has been shown, there is much speculation but little concrete proof. Does this matter? It does only if you are tempted to make assertions about eating a meat-filled diet. Superficially these assertions make good advertising copy, but on closer inspection they are just speculation. Considering how much damage animal protein production does to the world, it matters a great deal that this science has been so misused.

A summary of the Palaeolithic diet evidence.

The following points have been put forward to justify the Palaeolithic diet concept (Cordain 05).

The hunter-gatherer diet can be used as the template for the Palaeolithic diet. As early man in the upper Palaeolithic era was a hunter-gatherer and not a farmer, it is asserted that he can be compared to the modern hunter-gatherers of today. There is good evidence to show that the majority of these modern peoples have animal-based foods as a major part of their diets, with around 60-70% of energy coming from animal meat (including seafood) and fat. The remainder of their diet comes from fruits and other plant-based foods such as tubers, with small components of foods such as insects and honey.

There was a need for animal-based foods to supply the additional energy and essential nutrients for large brain evolution. As we evolved over a period of two million years or so, it is asserted that we became adapted to eating animal-based foods, which gave us a quality energy source to meet the added energy requirements of a large brain. Further, as a consequence, we became less able to produce our own brain-building fatty acids, which are also found in animal-based foods. Rather than making them ourselves, the body became reliant on this outside source over evolutionary time.

These are in particular the specialized polyunsaturated omega-3 long chain fatty acids (n-3 PUFAs) such as EPA and DHA, and the omega-6 fatty acid arachidonic acid, which are vital components of our brains, nerves and eyes. Hence the argument given by the exponents of "meat being essential for the evolution of the brain" goes that, since we are less able to produce these components ourselves, we must get them from animal foods for our optimal development.

The reduced life expectancy of hunter-gatherers compared to modern westerners is not relevant. The average life expectancy of modern hunter-gatherers is around 40 years, and about 20% live beyond 60 years. The explanation is that the average life expectancy is dragged down by a much higher rate of infant mortality, since it takes only a relatively few deaths at a very early age to pull down the average. During adult life, and before the onset of old age with its associated chronic disorders such as heart disease and cancer, things that would be minor problems for westerners with access to sophisticated medical care, such as infections, bleeding peptic ulcers and accidents, would be life threatening and likely to shorten life spans.
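
A simple worked example shows how strongly infant deaths drag the average down. Suppose, purely for illustration, that 30% of those born die around age 1 and the survivors live to 60 on average; life expectancy at birth is then:

```latex
\bar{L} = 0.3 \times 1 + 0.7 \times 60 \approx 42 \text{ years}
```

So a population in which most adults reach 60 can still show a life expectancy of around 40.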

The major thing to note about hunter-gatherers is their lack of chronic diseases such as hypertension, coronary artery disease, type II diabetes and cancer. Perhaps they don't live long enough to develop them, but the evidence is good that they show little sign of even the early changes of these disorders. Hence, the argument goes, if these people had access to modern medical interventions they would outlast westerners because of the absence of most of the chronic diseases associated with the west.

Much of the problem lies in the carbohydrate content of the typical western diet. If the carbohydrate is replaced by equi-caloric amounts of meat protein, many of the lipid abnormalities abate. Likewise, the satiety value of protein is higher than that of carbohydrate, making consumers of such diets less likely to want to eat again before the next meal. This is also the backing for the Atkins diet, which is not the same as the proposed Palaeolithic diet.

There has been insufficient evolutionary time for humans to adapt to the dietary change that has occurred in the last 10,000 years. It is asserted that, as there is little apparent genetic variation in modern humans today, it follows that, despite differing environmental settings, we really have not changed much from early Homo sapiens. Since we have not had time to adapt to these changes in diet, the main assertion is that it would be sensible to consume a diet most closely resembling that of the upper Palaeolithic, for this is what we evolved over the past 2 million years to consume.

What are the problems with these considerations? The following are counter-arguments which call this whole concept of the so-called Palaeolithic diet into question.

Can we extrapolate back from modern hunter-gatherer diets? Whether this dietary composition can be extrapolated back to Palaeolithic times is an open question. For example, bows and arrows, which are used extensively by modern hunter-gatherers, probably did not exist in the upper Palaeolithic, and other hunting weapons probably go back only 500,000 years. Modern hunter-gatherers are likely to be much more sophisticated than earlier hominins, having gained a lot of knowledge along the way, even if less than that of modern western humans.

There is good evidence that modern hunter-gatherer societies had, by the 1970s, changed radically compared to their original state, so that it is difficult to extrapolate back even 25 to 30 years, let alone 10,000 years (Nestle 99). Hence it is scientifically dangerous simply to assume that their behaviour in food gathering and processing was similar to that of our forebears in the upper Palaeolithic.

Game meat in hunter-gatherer diets is radically different from the meat we buy from the butcher. Why is it different? Game animals carry significant amounts of fat for only a limited part of the year, and the constituent fats are slightly more than half polyunsaturated (PUFAs) and monounsaturated (MUFAs), with the rest saturated, as shown in wild caribou (Cordain 05). The omega-3 to omega-6 ratio (N-3:N-6) is very much higher in game meat than in the meats available in the west; this relates to the fact that a large proportion of animals raised for meat production are fed grain, which is itself low in N-3 fatty acids, to fatten them very rapidly in less than 2 years.

Pasture-raised cattle have slightly better lipid characteristics, but are still closer to grain-fed cattle than to game. Grain feeding causes marbling of the meat, with fat deposits in muscle that are low in N-3 fatty acids (Cordain 05). Because of the presumed lack of omega-3 (N-3) fatty acids in lean meat cuts, one of the proposed Palaeolithic diets includes significant amounts of seafood, to be eaten with the lean meat to boost the supply of N-3 PUFAs.

The evidence that the human body cannot produce adequate amounts of essential PUFAs is lacking. Diets which lack the N-3 fatty acid alpha-linolenic acid, the precursor of DHA and EPA, cause severe medical problems. However, vegetarian diets contain adequate amounts of this compound. Extensive studies have been carried out in the most vulnerable, neonates and in particular premature babies. The evidence is mixed, with some studies supporting supplementation with LC PUFAs and others not. The point is that there is no firm evidence that humans cannot manufacture enough DHA and arachidonic acid from alpha-linolenic and linoleic acids respectively (Heird 05).

Further, in societies that have low intakes of meat-based foods, as well as among vegetarians, there is no evidence of a lack of these nutrients. Remember also that after man became an agriculturalist in the last 10,000 years, eating much less animal protein than in the Palaeolithic diet, our major western societies arose among just such meat-poor peoples. In other words, despite the marked reduction in animal protein and fat, they were able to create the world’s leading societies.

The reduction in life expectancy seen between the upper Palaeolithic and modern times cannot be attributed specifically to dietary change. It is true that life expectancy fell to an extreme low of around 20 years at about the time of the industrial revolution (related mainly to extremely high infant mortality), but this change in life expectancy cannot be used to support the adoption of a hunter-gatherer type diet. Many critics of the Palaeolithic diet have pointed to the short life expectancy of modern hunter-gatherers, around 40 years, who possibly consume a diet very much like that of upper Palaeolithic times. The Paleodiet protagonists explain this away by saying that we should take into consideration the high infant mortality and the lack of evidence of chronic disease in these peoples.

However, it cannot be argued both ways: after the adoption of agriculture, high density living caused major disease epidemics with very high infant mortality, thus markedly lowering average survival. Short survival was not necessarily related to dietary composition as such, but rather to severe malnutrition caused by food shortage and spoilage limiting the range or overall amount of food, as well as to poor living conditions and sanitation. In addition, the rise of settled farming communities was associated with the evolution of many infectious diseases, particularly viral diseases which jumped species from domesticated animals into humans. Such infectious diseases became a dominant force after agriculture was adopted.

The increase in human life expectancy over the last 100 years, and the return to a stature similar to that seen in the Palaeolithic era, is clearly related to changes in living conditions and a greater abundance of food, not to dietary composition specifically.

Estimates put Palaeolithic man's life span at around 25 years. Limited empirical data make accurate estimates difficult, but one estimate by Cohen (Cohen 89) set the figure at around 25 years. This suggests that the Palaeolithic diet, among many other considerations, must have been less than ideal (Nestle 99).

The central concepts of the "meat is necessary for brain evolution" argument have problems. The two central concepts of meat being the main driver of encephalization in man are that meat provided the necessary extra dietary energy to allow the brain to develop, and that it supplied the extra preformed fatty acids (arachidonic acid, EPA and DHA) which, it is claimed, humans do not produce rapidly enough, thus causing a bottleneck. The high energy component of animal-based foods comes from fat (9 kcal/g) and not protein, which provides the same energy as carbohydrate (4 kcal/g).
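
The energy-density point can be made concrete with a rough comparison, using illustrative compositions assumed here rather than taken from Cordain: 100 g of very lean game might contain about 22 g protein and 2 g fat, while 100 g of marrow is mostly fat, say 85 g:

```latex
\text{lean game: } 22 \times 4 + 2 \times 9 \approx 106 \text{ kcal per } 100\,\mathrm{g},
\qquad
\text{marrow: } 85 \times 9 \approx 765 \text{ kcal per } 100\,\mathrm{g}
```

On such figures, fat-rich tissues like marrow, not lean muscle, are the plausible energy-dense component of an animal-based diet, a point taken up below.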

The type of meat that the earlier hominins consumed was likely to be small game, since they lacked modern hunting weapons (Cordain 01). The smaller the game, the less the fat (Cordain 05). Even large game has only limited amounts of these fats in muscle tissue, and lean muscle was essentially a poor source of the longer chain fatty acids (Cordain 01). Without cooking, this type of meat could only be eaten slowly with the type of dentition these people had (Wrangham 03). Thus the amount of extra energy gained from such sources would have been limited.

Further, eating large amounts of lean small game without significant fat or carbohydrate intake is known to cause severe, life-threatening metabolic disorders in modern humans (Cordain 01), and thus early man is unlikely to have survived on small game alone, or for that matter on any sort of game meat alone. Good sources of the fatty acids used in brain development are found in the brains, and to a lesser extent the liver and bone marrow, of killed animals. Early human scavengers were able to access the brain and marrow of larger prey using primitive tools, and these tissues were frequently available because carnivores and other scavengers could not get at them; modern carnivorous wild animals usually leave bones and skull uneaten (Cordain 01).

Examining this evidence further, there is an important point to be made: in wild animal-based foods, the principal source of EPA was brain tissue, with a reasonable amount of arachidonic acid from marrow. Most of the high energy intake would have come from eating marrow because of its fat content. Small animals have little fat, and other predators and scavengers would have eaten any subcutaneous fat themselves. As noted above, lean meat has only very limited amounts of these fatty acids and only relatively low energy concentrations. Hence the conclusion is that, if the hypothesis is true that evolving man required both an energy-dense food source and essential fatty acids for the evolution of a large brain, he would not have gained them from eating lean meat, but most likely from consuming marrow (Cordain 01).

If the argument that meat eating was an essential source of LC PUFAs for brain development were true, then today we should eat marrow and brains rather than lean meat. This seems completely overlooked in the assertions made in the meat industry advertisements. If it were true, we should be directed to eat marrow and brain, neither of which is very appetising and both of which are dangerous from the point of view of BSE (mad cow disease). Further, even the consumption of animal meat has been questioned as the source of energy and nutrients, as some have proposed these came from the consumption of fish, shellfish and crustaceans.

So is it really true that we require red meat for normal brain development? As we shall see from the information elsewhere on this site, the assertion that we cannot supply enough of our own without exogenous sources can be shown to be untrue for a number of reasons: vegetarians develop normally, and there were many in the past who did not eat a lot of meat after the adoption of cereal cropping, but who nonetheless built substantial communities which added to the richness of human history. See the sections on vegetarians and on brain development.

The final point is that the supply of such tissues would have been very variable and hence unlikely to be the principal driver towards larger brains. What is known for certain is that hominin brain size continued to increase throughout all the periods in which meat supplies are likely to have been variable. Evidently there was a steady force driving this evolution, and it was probably not meat consumption per se.

The data analysis concentrates on large animals, whereas there is evidence for a much wider range of animal food sources. Much of the data presented in the various Cordain papers focuses on the analysis of large herbivores. However, recent analysis of human sites of the middle and upper Palaeolithic in the Mediterranean region shows a much more extensive use of small animals, from shellfish to reptiles to birds and hares (Stiner 00). The need to exploit these food sources may have been a response to increasing population density. The fat content of these creatures is much less than that of large herbivores. In addition, these studies show considerable diversity between the various sites, and many of the conclusions remain open to interpretation. None of this analysis sheds light on the overall composition of the diet of the period, since the evidence of plant-based foods has not been preserved.

Has there been sufficient time for evolutionary selection to adapt us to the change in diet? Many of the diseases induced by our current western diet will not exert significant evolutionary pressure. Most of the adverse effects of our diet are chronic diseases that occur long after our reproductive years and have little bearing on whether we can have children; likewise, it is unlikely that any of them will occur early enough to have an adverse impact on bringing up children. However, despite claims to the contrary, there is good evidence of genetic adaptation to dietary changes that have occurred since the adoption of farming and animal husbandry, and probably also in the period leading up to this change, most likely associated with a long, progressive shift towards a more plant-based diet.

Lactose intolerance is a good example of recent evolutionary change. Neonates have the enzyme lactase in their gut, which digests the lactose in their mother's milk. As the baby grows, this enzyme is progressively lost in peoples that have never had a tradition of consuming animal milk, as is the case in many other animal species. The original selection of the genetic characteristic of progressive loss of lactase was driven by the need to limit the time of suckling and allow the return of normal fertility. When man domesticated animals, his diet came to include milk and cheese. Those people who had the least lactase activity as weaned children would have exhibited lactose intolerance and hence would not have been able to consume such foods, becoming less fit and less able to reproduce; those who produced lactase for longer would have been positively selected. Hence in the world today we see a clear division: peoples such as the Europeans, who come from a milk-drinking background, have a low incidence of lactose intolerance, while those who do not, such as many Asian peoples, have a high incidence.

The consumption of fava beans in Mediterranean countries, which gives protection against malaria and thus selected for peoples who consumed this vegetable, is another example (Nabhan 04). While significant consumption of many of our current foods such as sucrose, fructose and vegetable oils is very recent, milled cereals entered our diet sufficiently long ago, around 10,000 years, to allow for significant genetic adaptation in ways that are yet to be determined. There is no proof that we should exclude cereals on the basis of genetic unsuitability.

Recently it has been shown that humans have increased their production of amylase, an enzyme required for the digestion of starch, by increasing the number of copies of the amylase gene, a change that has occurred in the past 200,000 years. In general there is a broad correlation between the number of copies of the gene in a particular group and the level of starch in its diet. However, even groups that currently have relatively low levels of starch in their diets have increased, if lower, copy numbers, suggesting that they have moved back to a more animal-based diet but had ancestors who ate a diet much richer in starches. This suggests, but does not prove, that amylase production increased during a progressive change to a diet higher in starch before the adoption of farming some 10,000 years ago (Perry 07) (Cohen 07).

The Palaeolithic diet is not supportable on a world scale. If everyone in the world were to adopt the diet proposed by Cordain, there would not be enough resources to produce it, and the attempt would seriously threaten the long term survival of the human race. We are already using an unsustainably large share of our agricultural resources to produce animal-based foods, and this is even now causing severe ecological damage. See the section on environmental damage.

Likewise, the amount of fish required, if eaten by all, would vastly outstrip a supply which is already under severe threat. If the diet were adopted only by the relatively few rich westerners, this selfishness would further exacerbate the west's already greedy consumption of the world’s resources. See the section on fishing. Remember also that it is cereal consumption that feeds the majority of the world’s population; without it, huge numbers would starve. We simply have to have a diet that is good for the world and good for us, one that contains significant amounts of this staple food without causing us health problems.

The Palaeolithic diet is not the only solution to our western dietary problems. While this diet may work quite well from a medical point of view, it is not the only solution. The required experiments to confirm that we can safely move away from the Palaeolithic diet have already been carried out on a large scale and shown to work. They are called the Mediterranean diet and vegetarianism. See "What to eat".

The Mediterranean diet is well known: it contains some meat and some dairy, but has lots of fruit and vegetables along with vegetable oils such as olive oil. It is low in the refined carbohydrates which dominate the processed food consumed to huge excess in many western countries. Further, there is strong evidence to show that vegetarians, after controlling for the fact that they are less likely to drink excessively, smoke or indulge in dangerous activities such as unsafe sex, have less chronic disease and live longer than similar omnivorous people. See the section on vegetarians.

The Paleodiet would not be readily accepted by a significant part of the community. It removes some very major food categories: dairy, cereals and vegetable oils. The Mediterranean diet, for example, has a full range of foods and is extremely interesting; even vegetarian diets are much more interesting than the Paleodiet.

Are there any dangers in adopting the Palaeolithic diet: is too much iron (or red meat) bad for your health? Red meat provides large amounts of iron in a readily absorbable heme-iron form. Can consuming a diet which contains large amounts of red meat be bad for you?

  • Infections: Many micro-organisms require iron and obtain it by diverting host iron to their own uses. There is some evidence that higher levels of iron promote diseases such as malaria and TB. The evidence that common bacterial infections in western societies are promoted by higher levels of iron is not convincing (Oppenheimer 01).

  • Reducing iron by donating blood: There are reasonable data to show that regular blood donors have less cardiovascular disease when all the confounding variables are controlled for (Meyers 02); previous studies had problems with confounding factors (Sempas 02). However, there are other possible mechanisms to explain this. Blood donors have lower levels of haemoglobin, and there is a continuous positive correlation between haemoglobin level and platelet reactivity (platelets are a cellular component of clotting), and thus a possible link to coronary artery disease.
     
  • Red meat rather than iron per se may be related to the development of type II diabetes: There is some suggestive evidence that high levels of heme iron from red meat may be an independent risk factor for this disorder. In a study by Jiang which looked at heme iron intake, iron status associated with blood donation, and diabetes, the only significant association was with heme iron intake from meat (Jiang 04).

The underlying mechanism may be that iron is an oxidising agent and produces free radicals which can damage tissue. The body has highly developed mechanisms, with proteins such as ferritin, to mop up excess iron. Despite this, there is evidence of a continuously variable effect of iron loading: the more iron in the body, the greater the damaging effect. Similarly, too little iron can cause problems apart from anaemia. It has been further suggested that higher levels of iron may be associated with other chronic diseases such as cancer.

Palaeolithic man would have lived in areas of the world where parasitic diseases such as hookworm were common, this being the most common cause of iron deficiency in these regions. Hence man is likely to have evolved in an iron deficient environment. In the west, we have eliminated most causes of iron deficiency and now probably have too much iron in our bodies. This situation is related to high levels of red meat consumption. (If you want to eat red meat you should become a blood donor.)

The evolution of cooking is another possible major evolutionary driver.

This is another spin on our evolutionary story. The point is that there is really not enough evidence to make dogmatic statements on the basis of what may have happened in human evolution.

What did cooking contribute to human evolution? A paper by Wrangham et al in 1999 strongly supported the view that the major food source of our forebears was not meat but tubers. While some of the assertions in this paper are untestable, it is likely that at some stage the cooking of food made another major contribution to human evolution.

The major points of this paper, which seems to bring together a number of otherwise unrelated observations, are as follows:

  • Cooking is almost universally done by females in modern human societies.
     
  • Bonding of humans in marriage like relationships is likewise universal.
     
  • Human female receptiveness to sex is almost continuous, unlike most other animals.
     
  • The male-female size difference is lower than in other primates.
     
  • In hunter-gatherer societies, females usually do all or the majority of the foraging, this being a subordinate activity.

These have been tied together in a single unifying concept which appears to explain them, the so-called theft hypothesis:

With the adoption of cooking, food tended to be gathered in one place rather than consumed as it was harvested. Scrounging is common in animals and birds, the usual strategy being that the subordinate who does the preliminary work is pushed aside by a dominant individual: in other words, theft. This strategy is much more rewarding when food has been cooked, since it has been gathered together and made ready for easy digestion. As this subordinate role fell to the consistently smallest members of the group, the females, a female, once the food had been prepared, needed some sort of defence against theft by males.

It has been suggested that the strategy adopted was for the female to have a protective male partner, and that the way to keep him was to offer as much sexual opportunity as possible. Hence the recent evolution of the Homo line has selected for females able to be sexually receptive for greater and greater proportions of time, ending in today’s almost continuous sexual receptivity of mature females. (Other factors, such as being a good mother, healthiness and mental ability, are also likely to have played a role in bonding.) Likewise, there would have been a strong evolutionary drive for pairing with a protective male, resulting in today’s almost universal pattern of male-female alliances. The major driver of this evolutionary change: the cooking of plant-based foods.

The same strategy could also have applied to uncooked foods: in some non-human primate populations today there is evidence of such sexual alliances being set up around male scrounging of food from subordinate females. However, because the rewards for scrounging prepared food are so much greater, this process would have been greatly enhanced by the advent of cooking.

The major problems with this are:

  • There is no support in the archaeological record, in the form of early digging tools or numbers of stone tools for preparing such foods.
     
  • Cooking tubers in open fires without cooking pots would have been difficult. It is possible that non-cooking processing of these foods by pounding and grinding was used, producing increases in the availability of nutrients similar to those of cooking. Again, there is no definite support for this in the archaeological record.
     
  • The stable isotope studies in upper Palaeolithic man suggest a dominance of animal-based foods, and evidence of hunting with weapons goes back at least half a million years.

So did cooking change the course of human evolution? In a more recent paper in 2003, Wrangham (Wrangham 03) sets out a more balanced view of the contribution of cooking to human evolution. This has been further expanded in an article in Science (Gibbons 07). The main points are:

Even though we don’t know exactly when cooking became a human behaviour, it is likely to have been adopted long enough ago to have had marked effects on both nutrition and human relationships; the time elapsed would certainly have been long enough for humans to have adapted to it in an evolutionary sense. Consumption of raw food, both animal and plant-based, would have substantially reduced the quality of the diet, as evidenced by the raw-foodists of today. As has been shown in animal experiments, grinding meat improves digestion, cooking without grinding is better, and cooked ground meat is best of all (Gibbons 07). Hence it is still quite likely that the major leap forward in giving humans the high quality diet they needed for large brains was in fact the adoption of cooking, along with methods of grinding food either cooked or raw.

The last words.

The assertion that "meat is essential for large brains" is without foundation. The fact is that we can run on a variety of diets, some entirely without meat, and do very well. Many of the source articles for this conclusion are the very references quoted in the Meat and Livestock Association of Australia advertisements in support of their case for eating more red meat. One of the great temptations in reviews such as this is to quote only material that supports your view: "cherry picking". I have not done this.

In the end we must find a diet that won’t damage the world’s ecosystems, a diet that can be used by all and not just a few selfish rich westerners. This is the whole raison d’être of this web site. What is being proposed is simple: eat meat as an occasional luxury, don’t eat a lot of dairy products, avoid processed foods as much as possible, and increase your intake of fruit and vegetables. By this you will avoid or help ameliorate heart disease, hypertension, stroke, type II diabetes and osteoporosis, to name but a few, and at the same time make a real difference to the world’s ecology.

Many of these recommendations are also part of the Palaeolithic diet; the major divergence is the Palaeolithic diet protagonists' recommendation to eat a lot of lean meat. The world simply cannot afford to do this.

References:

(Aiello 95) Leslie Aiello, Peter Wheeler. The Expensive-Tissue Hypothesis: The Brain and the Digestive System in Human and Primate Evolution. Current Anthropology 1995; 36: 199-221

(Cohen 89) Cohen MN. Health and the Rise of Civilization. Yale University Press 1989.

(Cohen 07) Jon Cohen. A little xeroxing goes a long way. Science 2007;317:1483

(Cordain 01) Loren Cordain, Bruce A. Watkins, Neil J. Mann. Fatty Acid Composition and Energy Density of Foods Available to African Hominids. Evolutionary Implications for Human Brain Development. World Rev Nutr Diet 2001; 90:144–161

(Cordain 02) L Cordain, B A Watkins, G L Florant, M Kelher, L Rogers and Y Li. Fatty acid analysis of wild ruminant tissues: evolutionary implications for reducing diet-related chronic disease. European Journal of Clinical Nutrition 2002; 56:181–191.

(Cordain 05) Loren Cordain, S Boyd Eaton, Anthony Sebastian, Neil Mann, Staffan Lindeberg, Bruce A Watkins, James H O’Keefe, and Janette Brand-Miller. Origins and evolution of the Western diet: health implications for the 21st century. Am J Clin Nutr 2005; 81:341-354.

(Ehrlich 04) Paul and Anne Ehrlich. One with Nineveh. Island Press 2004

(Gibbons 07) Gibbons A. Food for thought. Science 2007; 316:1558-1560.

(Heird 05) William C. Heird and Alexandre Lapillonne. The role of essential fatty acids in development. Annu Rev Nutr 2005; 25:549-571.

(Jiang 04) Rui Jiang, Jing Ma, Alberto Ascherio, Meir J Stampfer, Walter C Willett, and Frank B Hu. Dietary iron intake and blood donations in relation to risk of type 2 diabetes in men: a prospective cohort study. Am J Clin Nutr 2004; 79:70-75.

(Meyers 02) David Meyers, Kelly Jensen, Jay Menitove. A historical cohort study of the effect of lowering body iron through blood donation on incident cardiac events. Transfusion 2002; 42:1135-1139.

(Nabhan 04) Gary Nabhan. Why Some Like It Hot. Island Press 2004, pp. 17-22.

(Nestle 99) Marion Nestle. Animal v. plant foods in human diets and health: is the historical record unequivocal? Proceedings of the Nutrition Society 1999; 58:211–218

(Oppenheimer 01) Stephen J. Oppenheimer. Iron and Its Relation to Immunity and Infectious Disease. J. Nutr. 2001;131: 616S–635S

(Perry 07) George Perry et al. Diet and the evolution of human amylase gene copy number variation. Nature Genetics 2007; doi:10.1038/ng2123

(Richards 00) M. P. Richards and R. E. M. Hedges. FOCUS: Gough’s Cave and Sun Hole Cave Human Stable Isotope Values Indicate a High Animal Protein Diet in the British Upper Palaeolithic. Journal of Archaeological Science 2000; 27:1–3

(Richards 01) Michael P. Richards, Paul B. Pettitt, Mary C. Stiner, and Erik Trinkaus. Stable isotope evidence for increasing dietary breadth in the European mid-Upper Paleolithic. PNAS 2001; 98:6528–6532

(Sempas 02) C T Sempas. Do body iron stores increase the risk of developing coronary heart disease? Am J Clin Nutr 2002; 76:501-503.

(Stiner 00) Mary C. Stiner, Natalie D. Munro, Todd A. Surovell. The Tortoise and the Hare. Small-Game Use, the Broad-Spectrum Revolution, and Paleolithic Demography. Current Anthropology 2000; 41:39-73

(Wrangham 99) Richard W. Wrangham, James Holland Jones, Greg Laden, David Pilbeam, and NancyLou Conklin-Brittain. The Raw and the Stolen: Cooking and the Ecology of Human Origins. Current Anthropology 1999; 40:567-594.

(Wrangham 03) Richard Wrangham, NancyLou Conklin-Brittain. Cooking as a biological trait. Comparative Biochemistry and Physiology Part A 2003; 136:35-46.