
Fire Production | The Paleo Diet
In our world, fire is such a basic element that we almost never give it a second thought. You click on your gas-powered kitchen range and instantly a circular blue flame emerges to fry your eggs, boil your water or steam your veggies. Your summertime barbecue or campfire is lit effortlessly with the cheap butane lighter you bought at the convenience store. If you happen to be a cigarette or pot smoker, ignition is never the worry – the problem is not starting a fire, but paying for or obtaining tobacco or marijuana. Such is the way of the western world: virtually none of us thinks twice about creating fire. We can all do it at any time we want, with no worry whatsoever.

But what if, in our contemporary world, we didn’t have a butane lighter, matches or other modern means of producing fire? How could it be done? Do you know how to start a simple fire without modern technology? Could you create a simple flame to cook, to smoke or for warmth? I am not a betting man, but I can almost guarantee that for even a hundred or a thousand dollars, none of you could start a fire without modern technology, even if your life depended upon it. The very first friction matches were only invented in 1827 by John Walker2, and the simple butane lighter is more recent still, dating to the early part of the 20th century. How did humanity create fire before these inventions?

Let’s travel backwards in time and see how fire was created without modern technology. More importantly for the Paleo Diet community, let’s examine how the control and production of fire defines the food groups that our ancestors could not have consumed – food groups that have since become staples of civilization and which are ironically recommended by governmental and institutional organizations as promoters of good health and well-being.

As simple as it seems, knowing, using and producing fire from an evolutionary perspective requires a number of fundamental steps:

  1. Logical identification of the event (fire itself)
  2. Recognition of fire’s benefits
  3. Controlling fire
  4. Producing fire at will4, 15

Virtually all mammals and primates are aware of fire’s dangers and logically flee from it, but no species other than our own identifies its potential benefits. The majority of the anthropological community now recognizes that habitual fire use by hominids did not appear anywhere in the world until about 300,000 to 400,000 years ago, in Europe; before then, fire use was sporadic and opportunistic.9, 11, 14 I quote the most comprehensive recent review of ancient fire use:

“However, surprisingly, evidence for use of fire in the Early and early Middle Pleistocene of Europe is extremely weak. Or, more exactly, it is nonexistent, until ∼300–400 ka.”11

What does this statement mean? It means that Neanderthals living in Europe 300,000 to 400,000 years ago were the first hominids to:

  1. Logically identify fire
  2. Recognize its benefits
  3. Control it

However, the huge caveat here is that they almost certainly did not have the ability to produce fire at will.12, 13 How do we know this? Archaeological excavations of Neanderthal caves during extended cold periods in Europe show a virtual lack of fire use when the climate worsened and became quite frigid.12, 13 Accordingly, Neanderthals were at the mercy of collecting naturally occurring fire and keeping it alive for extended periods. Clearly, this approach was a “hit and miss” venture at best, as Neanderthals frequently suffered in the bitter cold of their winter caves without fire.12, 13

Controlling Fire vs. Producing Fire

The archaeological record from Europe shows evidence for fire control by about 300,000 to 400,000 years ago, but remember that the ability to control fire is far different from the ability to produce it.9, 11, 14 Naturally occurring fire results from lightning strikes, volcanic eruptions and spontaneous combustion of decaying plant material. Far and away, lightning is responsible for almost all naturally occurring fires. Hence, before humanity had the ability to produce fire, we were generally limited to collecting and preserving lightning-caused fires. Apparently, this strategy was opportunistic and occasional at best, based upon the scarcity of fire in the early fossil record.9, 11, 12-14, 18 Because humanity lacked the knowledge and capacity to produce fire wherever and whenever we desired, this limitation clearly prevented us from regularly consuming entire categories of plant foods (cereal grains, almost all legumes and most tubers and roots) which are normally inedible without cooking. The inability of humanity to produce fire therefore represents a crucial “line drawn in the sand” for defining foods and food groups that should or should not be included in contemporary Paleo Diets.

Producing Fire

As simple as it seems, fire production without modern technology is complicated and requires practice, instruction and dedicated skills.2 Alfred Kroeber, a world-famous anthropologist from the University of California at Berkeley who studied Ishi, the last wild Indian in North America, in the early 1900s, simply could not light a fire in front of his university anthropology class when attempting to use Ishi’s hand-held fire drill.20

Our genus (Homo) first appeared on earth about 2 million years ago. The most current data suggest that the ability of our species to habitually produce fire was, “a very late phenomenon restricted to the archaeological record of modern humans at the end of the Pleistocene.”14 This evidence-based conclusion12-14 is consistent with the sum of the most recent archaeological data and does not support prior propositions of earlier habitual fire production, but rather opportunistic gathering of naturally occurring fire.1, 3, 18, 19 If we view habitual fire production as an exclusive innovation of modern humans, then you can appreciate how recent this technology really is, particularly on an evolutionary time scale. To put things into a perspective we can all understand: if the 2-million-year history of our genus were compressed into a single 24-hour day, fire production likely first came into regular play only about 54 to 72 minutes (75,000 to 100,000 years ago) before midnight. Think about it – our genus (Homo) has existed for more than 2 million years, yet only in the final throes of our evolutionary period on earth did we begin to consume plant foods (cereal grains, legumes and most tubers and roots) that require cooking to make them edible.
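The 24-hour clock analogy is simple proportional arithmetic, and the mapping can be sketched in a few lines (a minimal illustration, using the roughly 2-million-year figure for the genus Homo given above; any other assumed span can be substituted):

```python
# Map an evolutionary date onto a 24-hour clock ending at the present.
GENUS_HOMO_YEARS = 2_000_000   # assumed total span of the genus Homo
MINUTES_PER_DAY = 24 * 60      # 1440 minutes in the analogy's "day"

def minutes_to_midnight(years_ago: float, total_years: float = GENUS_HOMO_YEARS) -> float:
    """Minutes before midnight at which an event `years_ago` falls."""
    return years_ago / total_years * MINUTES_PER_DAY

# Habitual fire production, roughly 75,000 to 100,000 years ago:
print(minutes_to_midnight(100_000))  # 72.0 minutes before midnight
print(minutes_to_midnight(75_000))   # 54.0 minutes before midnight
```

Choosing a longer assumed span for Homo (some estimates run to nearly 3 million years) shrinks these figures proportionally.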

One of the questions you certainly must ask, as numerous people and professional anthropologists have posed before you, is this: why did it take so long, and why was it so difficult, for humanity to produce fire? One of Charles Dickens’ most famous quotes, “It was the best of times, it was the worst of times,” sticks in my mind. Good ideas become the “best of times” only once they surface – before their appearance we must endure the prior status quo, the “worst of times.” Over the course of our species’ evolution, human technological innovation moved at a dreadfully slow pace, primarily because prior accomplishments could not be documented or widely distributed until the advent of writing, the printing press and, most recently, computers and the internet. Nevertheless, the invention of fire production was an innovation that seems to have taken the entire world by storm sometime after modern humans evolved in Africa about 200,000 years ago and then began to colonize the planet about 60,000 years ago.2, 5-9

So just how did our ancestors do it? How did they invent the ability to produce fire whenever and wherever they wanted? The ethnographic literature of hunter gatherer societies universally shows that they utilized two basic means to generate fire:

  1. Wood on wood friction2, 4, 6-8
  2. Stone on stone percussion or friction using flint and iron bearing stones (pyrite or marcasite)10, 14, 15-17

Archaeological evidence from Europe indicates that production of fire via flint and iron stone percussion was rare or virtually absent among pre-agricultural people throughout Europe.14 Hence, it seems likely that the first Europeans to produce fire may have utilized wood on wood friction techniques to start fires.14 This fire-starting procedure likely spread rapidly worldwide, and evidence for fire production via this method appears in Australian2, 4 and North American hunter gatherer societies2, 6-8 as humans colonized these and other continents.2

Fire starting by wood on wood friction can be accomplished by a number of procedures. The most common method among hunter gatherers worldwide is the fire or hand drill (Figure 1 below).2, 4, 6 Other methods include the bow drill, the pump drill, the fire plow, the fire saw and the spear thrower over shield.2 Although the fire drill appears to be an easy and straightforward technique for producing fire, a number of crucial technological nuances virtually prevent unskilled operators from succeeding,2 as Professor Kroeber experienced in front of his anthropology class.20 Skilled hunter gatherers under good conditions can ignite a fire in less than a minute with a fire drill, whereas the best modern survivalists can do it in 28 seconds.2

Logic dictates that the very first humans to start a fire via the hand drill method certainly did not preconceive this method in its entirety with the intent of producing fire. Rather, fire must have resulted accidentally from an entirely separate operation – drilling to produce holes in objects.


Figure 1. Fire production by the Giwi Hunter Gatherers using fire drills.

Since the appearance of modern humans in Europe more than 40,000 years ago, the fossil record is replete with drilled items – bone and stone necklaces, bone flutes, wooden grommets, and other objects perforated with holes either drilled or punched into them. Accordingly, the very first fire ever created by any human via the hand drill method must have occurred unexpectedly, the original goal being to drill a hole into a wooden object with a wooden drilling stick. I bring this concept up to provide corroborative evidence that neither Neanderthals nor any other earlier hominid had the ability to habitually produce fire. Until modern humans arrived on the scene, the fossil record is almost completely devoid of drilled objects. Hence, the technology (drilling) that allowed modern humans to accidentally discover a universal procedure for igniting fire was not part of the technological repertoire of any hominids that came before us.

Nutritional and Dietary Implications of Fire Production

Before I leave this discussion, the most important consequence of when fire production first occurred in our ancestral past is the nutritional “line in the sand” that I alluded to earlier. As the Paleo Diet becomes more and more popular, its original message has been weakened by so-called experts whose Paleo food recommendations now include legumes (beans, lentils, garbanzo beans, lima beans, green beans, peas), quinoa, chia seeds, amaranth and other foods which are either toxic, indigestible or minimally digestible without cooking.21, 22 Further, these foods contain a variety of antinutrients (phytate, lectins, saponins, protease inhibitors, thaumatin-like proteins, tannins, isoflavones, raffinose oligosaccharides, cyanogenic glycosides, favism glycosides and others), which in both their uncooked and cooked states impair gut health and immune and hormonal function while reducing nutrient absorption.21, 22

Our species has no nutritional requirement for cereal grains, legumes or tubers. We can obtain all required vitamins and minerals from fresh vegetables, fruits, meats, fish, shellfish, seafood, eggs and nuts. The archaeological evidence delivers a clear factual mandate: no hominids had the ability to habitually produce fire until very recent evolutionary times.11-14 Accordingly, plant foods that required the production of fire and cooking for their digestion and assimilation were not part of our original menu. Incorporation of these foods into contemporary diets is now known to reduce dietary nutrient density (the 13 vitamins and minerals most lacking in the US diet)23, 24 while simultaneously promoting the chronic diseases of western civilization.24, 25

The invention of fire was a very good thing. It changed our lives forever. The important message here for the 21st century Paleo Diet movement is to leave the worst of our ancestral world behind us (living in cold caves, etc.) and to adopt the best of their world (fresh living foods, regular exercise and sunlight exposure).


Loren Cordain, Ph.D., Professor Emeritus


1. Berna F, Goldberg P, Horwitz LK, Brink J, Holt S, Bamford M, Chazan M. Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape province, South Africa. Proc Natl Acad Sci U S A. 2012 May 15;109(20):E1215-20

2. Blake S, Welch DM. Making Fire. David M. Welch Publisher, Australian Aboriginal Culture Series, 2006.

3. Brain CK, Sillen A. Evidence from the Swartkrans cave for the earliest use of fire. Nature 1988;336:464-466.

4. Davidson DS. Fire making in Australia. Am Anthropologist 1947; 49:426-437.

5. Frazer, SJ. Myths of the Origin of Fire: An Essay. MacMillan Press, London, 1930.

6. Hough W. Aboriginal fire-making. Am Anthropologist 1890;3(4): 359-372.

7. Hough W. Fire as an Agent in Human Culture. Government Printing Office, Washington D.C., 1926.

8. Hough W. Fire making apparatus in the United States National Museum. In: Proc U S Natl Mus, 1928, p. 73.

9. James, SR. Hominid use of fire in the Lower and Middle Pleistocene: a review of the evidence. Curr Anthropol 1989;30: 1-26.

10. Mountford CP, Berndt RM. Making fire by percussion in Australia. Oceania 1941; 11(4): 342-344.

11. Roebroeks W, Villa P. On the earliest evidence for habitual use of fire in Europe. Proc Natl Acad Sci U S A. 2011 Mar 29;108(13):5209-14

12. Sandgathe DM, Dibble HL, Goldberg P, McPherron SP, Turq A, Niven L, Hodgkins J. Timing of the appearance of habitual fire use. Proc Natl Acad Sci U S A. 2011 Jul 19;108(29):E298.

13. Sandgathe DM, Dibble HL, Goldberg P, McPherron SP, Turq A, Niven L, Hodgkins J. On the role of fire in Neandertal adaptations in western Europe: evidence from Pech de l’Aze IV and Roc de Marsal, France. Paleo Anthropology 2011;216-242.

14. Sorensen A, Roebroeks W, van Gijn A. Fire production in the deep past? The expedient strike-a-light model. J Archaeol Sci 2014; 42:476-486.

15. Stapert D, Johansen L. Flint and pyrite: making fire in the Stone Age. Antiquity 1999; 73:765-777.

16. Weiner J. Pyrite vs. marcasite. Or: is everything that glitters pyrite? with a structured bibliography on firemaking through the ages. Bull Cherch Wallonie 1997;37:51-79.

17. Weiner J. Friction vs. percussion. Some comments on firemaking from Old Europe. Bull Primit Technol 2003; 26:10-16.

18. Wrangham R. Catching Fire: How Cooking Made Us Human. Basic Books, New York, 2009.

19. Wrangham R, Carmody R. Human adaptation to the control of fire. Evol Anthropol 2010;19:187-199.

20. Kroeber T. Ishi in Two Worlds, 50th Anniversary Edition: A Biography of the Last Wild Indian in North America. University of California Press, Berkeley, CA, 2011.

21. Cordain L. (1999). Cereal grains: humanity’s double edged sword. World Review of Nutrition and Dietetics, 84: 19-73.

22. Cordain L. (2012). The trouble with beans. In: Cordain L, The Paleo Answer, John Wiley & Sons, NY, NY, pp 130-147.

23. Cordain L. The nutritional characteristics of a contemporary diet based upon Paleolithic food groups. J Am Neutraceut Assoc 2002; 5:15-24.

24. Cordain L, Eaton SB, Sebastian A, Mann N, Lindeberg S, Watkins BA, O’Keefe JH, Brand-Miller J. Origins and evolution of the western diet: Health implications for the 21st century. Am J Clin Nutr 2005;81:341-54.

25. Carrera-Bastos P, Fontes Villalba M, O’Keefe JH, Lindeberg S, Cordain L. The western diet and lifestyle and diseases of civilization. Res Rep Clin Cardiol 2011; 2: 215-235.

Mediterranean Diet | The Paleo Diet

Dr. Cordain,

Your position and Maelán Fontes Villalba’s is both convincing and very interesting. But do you agree that there are also studies showing a protective effect of whole grains?

I have another hypothesis – perhaps complementary to yours: maybe it is above all the drastic technological treatments applied to raw edible food materials that have rendered them deleterious to health, via modified compounds not adapted to our genetics. Otherwise, our ancestors had a very low life expectancy: this is an important point. And, if cereal grains were so bad, why are they edible? Don’t forget also that we have to consider whole grains in the context of a whole diet. Finally, our ancestors seemed to eat lots of meat: maybe they were subjected to acidosis? And what do we know about diet-related chronic diseases in these ancient periods?

However, your genetic argument remains strong, I agree.

Friendly yours,

Anthony FARDET, Ph.D.
Chargé de Recherches (Research scientist)
Human Nutrition Research Center, Auvergne
Clermont-Ferrand/Theix Research Center

Dr. Cordain’s Response

Dear Dr. Fardet,

Thank you for keeping an open scientific mind. In regard to your comments, it is ironic that the range of diets to which our species has been conditioned over the vast expanse of evolutionary experience is now beyond the reach of many of the world’s people.

France and the French people have developed a cultural tradition of foods and eating/lifestyle habits which on the surface (in large population studies) appear to be healthier than those in many parts of Europe and the rest of the world. In France, on a population-wide basis, French bread and other forms of wheat are consumed daily, as are wine, cultured cheese and butter. Let’s not forget fresh veggies, fruit, fish, olives and olive oil – particularly in the South of France. Moreover, American-style fast food is typically shunned, at least by the older French population. Additionally, meals are consumed over long time periods, with multiple dishes eaten in relaxed settings. These dietary patterns typically result in reduced total caloric intake over a 24-hour period. This manner of eating pretty much describes the Mediterranean Diet, which is likely healthier than the typical US diet or the typical non-Mediterranean European diet – both of which appear to accelerate all chronic diseases of western civilization.

Could the French or Mediterranean Diet be the healthiest way to stave off the chronic diseases which impact most western societies, or is there a healthier alternative? Contrast the Mediterranean Diet and its associated morbidity and mortality rates for all causes combined with the Japanese Diet, or better yet with contemporary Paleo Diets. We now have preliminary data that the Paleo Diet is more nutritionally dense than the Mediterranean Diet and maintains multiple nutritional characteristics superior to the French, Mediterranean or Japanese Diets. Therapeutic data for contemporary Paleo Diets are now available; you can find these studies if you diligently look for them on MEDLINE.

Let me now address a few other concerns you have offered:

1. “Otherwise, our ancestors had a very low life expectancy: this is an important point.”

Although this issue may represent an intuitive “flash point,” the best data suggest otherwise. First, your characterization that “our ancestors had a very low life expectancy” is not necessarily correct and is moreover misleading. Let me give you a simple example. If we have a population of 4 people (2 adults who die at age 80 and who give birth to 2 children who die at birth), then the average life expectancy of this population is quite low (160 years/4 = 40 years). Hence, “average lifespan” really only represents the average age at death. What is more important is to characterize the age structure of the entire living population.
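The arithmetic of this example is worth spelling out; a minimal sketch, using the hypothetical four-person population from the text:

```python
# Hypothetical four-person population from the example above:
# two adults die at age 80; their two children die at birth (age 0).
ages_at_death = [80, 80, 0, 0]

# "Life expectancy at birth" is simply the mean age at death,
# so infant deaths drag the figure down dramatically.
life_expectancy = sum(ages_at_death) / len(ages_at_death)
print(life_expectancy)  # 40.0

# Yet every individual who survived infancy lived to 80 -- the
# number a life-table view of the living population would reveal.
adult_ages = [age for age in ages_at_death if age > 0]
print(sum(adult_ages) / len(adult_ages))  # 80.0
```

The same averaging effect, at larger scale, is why a low "average lifespan" for ancestral populations need not imply that few individuals reached old age.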

These statistics are calculated regularly by life insurance companies in the western world and are called Life Tables. Life Tables reflect the age structure of the living population, not merely the ages of those who have died. At least 4 life table studies of hunter gatherers show that a good percentage of the population survives into old age (>60 yrs.). These facts are rather surprising given that in their world there was no modern medicine, sanitation or contemporary health care, and that mortality came not from chronic diseases (as in the western world) but rather from accidents, trauma, snake bites, warfare and the stresses of living outdoors for an entire lifetime. Mortality and morbidity data among hunter gatherers (even the elderly) do not show them suffering the signs or symptoms of chronic disease found in western populations, and this should be the take-home point. Let’s adopt the best of their world – leave the worst behind and take the best that the modern world has to offer.

2. “If cereal grains were so bad, why are they edible?”

Again, I encourage you to read my paper: Cordain L. Cereal Grains: Humanity’s Double Edged Sword. World Rev Nutr Diet. Basel, Karger, 1999, vol 84, pp 19–73.

Cereal grains (whole wheat, rye, barley, oats, corn (maize), sorghum, millet, etc.) are generally not edible (or are very poorly digestible) by humans, or by almost any other primate, in their natural state without cooking. As a species, we have a poor and limited ability to hydrolyze raw grain starches into sugars, and to degrade raw grain proteins into amino acids, for absorption in our guts. Hence whole, uncooked grains represent a food source that was rarely or never consumed by humans and by virtually all primates (except for a single species of baboon, the gelada). See my paper cited above for the scientific references.

Accordingly, until humans controlled fire, cereal grains could never have been a significant food source. More importantly, the ability to start fires “at will” is the crucial issue here. Fire use likely developed (in Europe only) about 250,000 to 300,000 years ago, but this did not occur “at will”; rather, it more likely depended upon collecting natural, lightning-caused fires. Starting a fire “at will” results from 4 or 5 technological advances which probably occurred only after the appearance of behaviorally modern humans (~200,000 years ago or less).

More importantly, the cell walls of cereal grains must be broken down by mechanical means (milling) before fire and heating can effectively hydrolyze cereal grain starches and thereby make them available for human absorption. Importantly, the first crude cereal milling stones do not appear in the archaeological record until about 15,000–25,000 years ago, in the Middle East. The fossil, nutritional and physiological data indicate that cereal grains would have been rarely or never used as food sources by our species until very recently in human evolution, simply because they were indigestible.

3. “Don’t forget also that we have to consider whole-grains in the context of a whole diet.”

Consider reading these two papers:

1. Cordain L. The nutritional characteristics of a contemporary diet based upon Paleolithic food groups. J Am Neutraceut Assoc 2002; 5:15-24.

2. Cordain L, Eaton SB, Sebastian A, Mann N, Lindeberg S, Watkins BA, O’Keefe JH, Brand-Miller J. Origins and evolution of the western diet: Health implications for the 21st century. Am J Clin Nutr 2005;81:341-54

When cereal grains displace lean meats, fish, seafood, eggs, organ meats, fresh vegetables, and fresh fruits, they dilute the trace nutrient (vitamin, mineral, phytochemical) concentration of the 13 nutrients most lacking in the typical western diet. Hence, in the context of a whole diet, the inclusion of cereal grains makes all nutritional considerations worse.

4. “Finally, our ancestors seemed to eat lots of meat: maybe they were subjected to acidosis?”

The available archaeological evidence worldwide, spanning hundreds of thousands of years, shows that osteological evidence (bone mineral abnormalities) does not support your supposition. Rather, osteoporosis, cribra orbitalia and other bone mineral pathologies stemming from dietary-induced acidosis only became commonplace following the agricultural revolution and the adoption of cereal grains and other plant foods as staples. The physiological and archaeological mechanisms and arguments for these events are fully outlined in my paper, Cereal Grains: Humanity’s Double Edged Sword.

5. “And what do we know about diet-related chronic diseases in these ancient periods?”

As I have pointed out in prior blogs, it is difficult to deduce heart disease from the fossil/bone record. Further, except for bone cancers, the same holds true for cancers. Nevertheless, bone cancers are extremely rare or non-existent in the archaeological human record prior to agriculture, and studies of historically documented hunter gatherers show cardiovascular disease to be rare or non-existent.

Cordain L, Eaton SB, Brand Miller J, Mann N, Hill K. The paradoxical nature of hunter-gatherer diets: Meat based, yet non-atherogenic. Eur J Clin Nutr 2002;56 (suppl 1):S42-S52.


Loren Cordain, Ph.D., Professor Emeritus

Modern Paleo | The Paleo Diet

Dear Dr. Cordain and Dr. Fontes Villalba,

I read with great interest your paper: Carrera-Bastos P, Fontes-Villalba M, O’Keefe JH, Lindeberg S, Cordain L. The western diet and lifestyle and diseases of civilization. Res Rep Clin Cardiol 2011; 2: 215-235.

The demonstration is quite logical, but I would like to get your opinion on the following issue:

*If the Paleolithic diet suits human genetics well and does not lead to chronic diseases – which is true – and if grain products are not so good for human health, the world has nevertheless changed, with at least 1 billion people living in huge towns worldwide: they cannot realistically follow a Paleolithic diet by going outside to hunt and to gather berries! A compromise must be found between the ideal and the reality of our modern world. Viewed that way, cereal grains and legumes appear to be promising foods: cheap, easy to store, satiating, with huge health potential, etc. Maybe humans consumed grains before 10,000 years ago, but to a lesser extent?*

So, what do you think is the best diet today, in a world with more than 7 billion people and huge cities?

I will be very happy to have your opinion about this issue,

Yours sincerely,

I remain at your disposal,

Anthony FARDET, Ph.D.
Chargé de Recherches (Research scientist)
Human Nutrition Research Center, Auvergne
Clermont-Ferrand/Theix Research Center

Maelán Fontes Villalba’s Response:

Hello Dr. FARDET,

Today almost all health authorities and nutritionists believe that cereal grains are protective against the so-called diseases of civilization. This belief is derived from epidemiological studies, which cannot establish cause and effect. Epidemiological studies consistently show an inverse association between the consumption of cereal grains and western disease, but this is not demonstrated by randomized controlled trials. In the Women’s Health Initiative (>48,000 postmenopausal women), those women allocated to the intervention group (more than 6 servings/day of whole grain cereals, 5 servings of fruit/vegetables and <20%en from fat) who had had a CV event at baseline significantly increased their risk of a further CV event by 26%. In the DART trial, the men advised to increase their intake of fiber from whole grain cereals increased their risk of death compared to the group not so advised.

Some systematic reviews make it clear that we do not have enough evidence to recommend the intake of cereal grains for the prevention and treatment of cardiovascular disease (Kelly, Cochrane Database of Systematic Reviews, 2007), obesity (FESNAD-SEEDO, Revista Española De Obesidad, 2011) or diabetes (Priebe, Cochrane Database of Systematic Reviews, 2008). Therefore, I don’t agree with you that “they have a huge health potential.”

The short-term clinical trials published by my group (Lindeberg, Diabetologia, 2007; Jönsson, Cardiovascular Diabetology, 2009) have shown that a Paleolithic Diet is superior to the Mediterranean and American Diabetes Association diets, respectively.

From an evolutionary standpoint it is very unlikely that we have completely adapted to cereal grains in just 10,000 years (even granting that human evolution has accelerated since the adoption of agriculture and of life in large settlements, probably driven by pathogens rather than foods). There is no body of evidence demonstrating that we have adapted, and such evidence should be required before we feed individuals a food suited to granivorous animals (birds, rodents, etc.). While “everybody” may think we are adapted, it is very unlikely; the proof simply isn’t there. If we eat the kinds of foods we ate during human evolution (>99.9% of our evolution), is there any obvious risk? Not that we are aware of. Are there potential health risks of consuming cereal grains? Yes, there are. So, I would stay on the safe side.

Regarding your question about sustainability, I think you can adapt the Standard American Diet to one more suitable for our genetic legacy and improve health. Furthermore, if the world population ate a diet in accordance with our physiology, health expenditures would be dramatically cut. I am not an expert in this field, but I think that famine in the third world is more a political problem than a problem of food choice itself.

You are right, however, regarding the DART study: the increased risk (18%) was non-significant, but in Ness et al. (Eur J Clin Nutr 2002), after statistical adjustment, there was a significant increase in mortality in the first two years, though not in the following years.


Many studies are conducted in ill people at high risk of CVD. If an intervention reduces their risk to that of “normal” westerners, then we could say there was a positive effect – but who wants to be normal? Not me! (See European Heart Journal 2005, Lindeberg.) I prefer to have a low risk of western disease.

I agree that the Mediterranean Diet (for example) is better than the Standard American Diet, but is there a better diet? What if we reduce cereal grains in a Mediterranean Diet and increase fruits, tubers and vegetables? Do you improve a diet based on vegetables, fruits, tubers, lean meat, fish, eggs and nut if you include grains?

Dr. Cordain has explained in many lectures that cereal grains are not edible unless you process them. But there are many other reasons (bioactive compounds like exorphins, lectins and saponins that bind to endocrine receptors; antinutrients; protease and amylase inhibitors; etc.) why cereal grains can be a problem for most people, besides non-celiac gluten sensitivity (and the potentially similar effects of thousands of other grain proteins).

I am unfamiliar with any study showing that cereal grains, per se, are protective. As I previously mentioned, there are some studies with numerous limitations, like the PREDIMED study (where the control group received much less support and followed a diet similar to that in the WHI), in which the risk of CVD was reduced, but you cannot say this was because of the intake of cereal grains. On the other hand, the studies comparing healthy diets with and without cereal grains have shown very interesting results. Should we look the other way, or focus on those interesting data? Well, many people just turn their heads the other way, while we are interested in exploring what happened in those studies (Lindeberg, 2007; Jönsson, 2009; Mellberg, 2014).

The statement that our ancestors lived only until the age of 30 is false. See Kaplan, 2007: the modal age at death among hunter-gatherers is over 70 years. Of course, these are not people from the Paleolithic era, but there is no reason to think it was different then. See also Eaton, 2002, where Dr. Cordain is a co-author (“Evolutionary Health Promotion: A Consideration of Common Counterarguments,” Preventive Medicine 2002).

Regarding meat, it is not true that Paleolithic diets must necessarily be high in meat. Some hunter-gatherers consume high amounts of plants, with carbohydrate providing almost 70% of energy (see the Kitava study).

Best wishes,

Maelán Fontes Villalba, M.S.

Dr. Cordain’s Response:

Dear Dr. FARDET,

Many thanks for your inquiry. Maelán Fontes has done a good job of summarizing the potential health issues with cereal grains in his reply to you. In my paper, “Cereal Grains: Humanity’s Double Edged Sword,” I delve into the topic in greater detail, with 55 pages of discussion and 342 references.

Further, in our paper, “Origins and Evolution of the Western Diet: Health Implications for the 21st Century,” we show that humans have no nutritional requirement for whole grain cereals. In fact, when whole grains are added to the diet, they significantly reduce intake of the 13 nutrients most lacking in the US diet.

Further, in our paper “The Nutritional Characteristics of a Contemporary Diet Based upon Paleolithic Food Groups,” you can see how a modern Paleo diet based upon lean meats, fish, seafood, fresh fruits and vegetables, and nuts (and devoid of whole grain cereals, dairy products and processed foods) is much more nutrient dense than either the current USDA-recommended MyPlate diet or the Mediterranean Diet. The reason for this is that the aforementioned foods are denser than whole grains, dairy products or processed foods in the 13 nutrients most lacking in western diets.

Cheers. I hope these papers provide useful information as you further educate yourself on the topic.


Loren Cordain, Ph.D., Professor Emeritus

Isotopic Data Does Not Indicate Grass Consumption | The Paleo Diet

You may remember that a series of scientific papers was published in the Proceedings of the National Academy of Sciences evaluating the diets of numerous species of fossilized hominins, bipedal or upright-walking apes, who lived in Africa from 4.1 to 1.4 million years ago.1, 2, 3, 4 These papers were grossly misinterpreted by the mass media, which suggested our early ancestors were regular consumers of grass and grass seeds (cereal grains).5, 6, 7

Poor research and analysis by a number of science writers have done their readers a disservice: they inaccurately reported the details of these studies, made assumptions, and drew conclusions about ancient hominin diets that the scientists themselves did not make.

The formal letter sent to the Proceedings of the National Academy of Sciences to address these shortcomings was published in today’s Early Edition: “African hominin stable isotopic data do not necessarily indicate grass consumption.” I encourage you to read it carefully, revisit the original rebuttal published on The Paleo Diet Blog, and help guide others to make informed judgments.


Loren Cordain, Ph.D., Professor Emeritus


1. Matt Sponheimer, Zeresenay Alemseged, Thure E. Cerling, Frederick E. Grine, William H. Kimbel, Meave G. Leakey, Julia A. Lee-Thorp, Fredrick Kyalo Manthi, Kaye E. Reed, Bernard A. Wood, and Jonathan G. Wynn. Isotopic evidence of early hominin diets. PNAS 2013 : 1222579110v1-201222579.

2. Jonathan G. Wynn, Matt Sponheimer, William H. Kimbel, Zeresenay Alemseged, Kaye Reed, Zelalem K. Bedaso, and Jessica N. Wilson. Diet of Australopithecus afarensis from the Pliocene Hadar Formation, Ethiopia. PNAS 2013 : 1222559110v1-201222559.

3. Thure E. Cerling, Fredrick Kyalo Manthi, Emma N. Mbua, Louise N. Leakey, Meave G. Leakey, Richard E. Leakey, Francis H. Brown, Frederick E. Grine, John A. Hart, Prince Kaleme, Hélène Roche, Kevin T. Uno, and Bernard A. Wood. Stable isotope-based diet reconstructions of Turkana Basin hominins. PNAS 2013 : 1222568110v1-201222568.

4. Thure E. Cerling, Kendra L. Chritz, Nina G. Jablonski, Meave G. Leakey, and Fredrick Kyalo Manthi. Diet of Theropithecus from 4 to 1 Ma in Kenya. PNAS 2013 : 1222571110v1-201222571.

5. Arnold, Carrie. “Even Our Ancestors Never Really Ate the “Paleo Diet” – The Crux | Discovermagazine.com.” DISCOVER Magazine: The Crux. Kalmbach Publishing Co., 3 June 2013.

6. Joyce, Chris. “Grass: It’s What’s For Dinner (3.5 Million Years Ago).” NPR the Salt. NPR, 3 June 2013.

7. Griffin, Catherine. “Human Ancestors’ Ape-like Diet Changed 3.5 Million Years Ago to Grass.” Science World Report: Nature & Environment. Science World Report, 4 June 2013.
