Big Brains Require An Explanation, Part VI:
Why Learning Is Fundamental, Even For Australopithecines

Our Story So Far (abridged)

  • By 3.4 MYA, Australopithecus afarensis was most likely eating a paleo diet recognizable, edible, and nutritious to modern humans. (Yes, the “paleo diet” predates the Paleolithic age by at least 800,000 years!)
  • The only new item on the menu was large animal meat (including bone marrow), which was more calorie- and nutrient-dense than any other food available to A. afarensis—especially in the nutrients (e.g. animal fats, cholesterol) which make up the brain.
  • Therefore, the most parsimonious interpretation of the evidence is that the abilities to live outside the forest, and thereby to somehow procure meat from large animals, provided the selection pressure for larger brains during the middle and late Pliocene.
  • A. africanus was slightly larger-brained and more human-faced than A. afarensis, but the differences weren’t dramatic.

(This is Part VI of a multi-part series. Go back to Part I, Part II, Part III, Part IV, or Part V.)

And here’s our timeline again, because it helps to stay oriented:

Timeline of hominin evolution

Click the image for more information about the chart. Yes, 'heidelbergensis' is misspelled, and 'Fire' is early by a few hundred KYA, but it's a solid resource overall.


It Doesn’t Take Much Selection Pressure To Change A Genome (Given Enough Time)

When we talk about the selection pressure exerted by our ancestors’ dietary choices, it’s important to remember that even a very small selective advantage is enough to make an adaptation stick.

The path to fixation

Remember, these are based on the most pessimistic assumptions possible.

The math is complicated, and I don’t want to drag my readers through it—but even under the most pessimistic initial assumptions (Haldane 1957), the following rules of thumb hold:

  • A mutation that confers a 10% selective advantage on a single individual takes, on average, a couple hundred generations to become fixed (present in 100% of the population).
  • Even a mutation that confers a tiny 0.1% selective advantage takes only a few thousand generations to become fixed.
  • Therefore, a 10% selective advantage would have become fixed in just a few thousand years—a fraction of an instant in geological time.
  • Even a 0.1% selective advantage would have taken perhaps 50,000 years to reach fixation—still an instant in geological time, and well beyond the precision of our ability to date fossils from millions of years ago.

I’m using approximate figures because they depend very strongly on the initial assumptions and the modeling method used…not to mention that the very idea of a precisely calculated “selective advantage” is silly.
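For the curious, here’s a toy calculation that shows where the “rules of thumb” above come from. (This is my own illustration, not Haldane’s actual model: it assumes deterministic haploid selection and an arbitrary starting frequency of 1%, so treat the exact numbers as ballpark only.)

```python
import math

def sweep_generations(s, p0=0.01, p1=0.99):
    """Generations for an allele with selective advantage s to rise
    from frequency p0 to p1 under deterministic haploid selection:
    t = (1/s) * ln( p1*(1-p0) / (p0*(1-p1)) )."""
    return math.log(p1 * (1 - p0) / (p0 * (1 - p1))) / s

print(round(sweep_generations(0.10)))   # ~92 generations for a 10% advantage
print(round(sweep_generations(0.001)))  # ~9190 generations for a 0.1% advantage
```

Note the scaling: cut the advantage by a factor of 100 and the sweep takes exactly 100 times as many generations. At roughly 20 years per hominin generation, even the slow case fits comfortably inside a geological eyeblink.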

Why is this important? First, because we need to remember that we are thinking about long, long spans of time. All of what we blithely call “human history” (i.e. the history of agriculture, from the Sumerians to the present) spans less than 10,000 years, versus the millions of years we’ve covered so far!

Second, and most critically, it’s important because we don’t need to posit that australopithecines ate lots of meat in order for the ability and inclination to be selected for—and to reach fixation. Even if rich, fatty, calorie-dense meat (including marrow and brains) only provided 5% of the australopith diet—and 4.9% of that advantage was lost due to the extra effort and danger of getting the meat (it doesn’t matter if you’re better-fed if a lion eats you)—the remaining 0.1% advantage still would have reached fixation in perhaps 50,000 years.

In other words: the ability and inclination to eat meat when available might have been a tiny advantage for an individual australopith…but given hundreds of thousands of years, that tiny advantage is more than sufficient to explain the existence and spread of meat-eating.

Most Mutations Are Lost: Why Learning Is Fundamental (Even For Australopithecines)

The flipside of the above calculations is that most mutations arising in a single individual—even strongly beneficial ones—are lost.

Using the simple mathematical model, the probability that even a beneficial mutation will achieve fixation in the population, when starting from a single individual, is extremely low. J.B.S. Haldane calculated it at approximately 2 times the selective advantage—so even a 10% advantage is only 20% likely to reach fixation if it begins with a single individual! And for a 0.1% selective advantage, well, 0.2% doesn’t sound very encouraging, does it?

For those interested in the dirty mathematical details of simulating gene fixation, see (for instance) Kimura 1974 and Houchmandzadeh & Vallade 2011.
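As a quick sanity check on Haldane’s ≈2s rule, here’s a minimal Wright-Fisher simulation—my own toy sketch, assuming a haploid population of just 100 (far smaller than any real hominin population) so it runs quickly. It starts a single beneficial mutant and counts how often the mutant takes over:

```python
import random

def fixation_probability(s, N=100, trials=2000, seed=42):
    """Monte Carlo estimate of the chance that a single mutant with
    selective advantage s reaches fixation in a haploid
    Wright-Fisher population of constant size N."""
    rng = random.Random(seed)
    fixed = 0
    for _ in range(trials):
        k = 1  # start with one copy of the mutant allele
        while 0 < k < N:
            # each of the N offspring inherits the mutant allele with
            # probability weighted by the mutant's fitness advantage
            p = k * (1 + s) / (k * (1 + s) + (N - k))
            k = sum(rng.random() < p for _ in range(N))
        if k == N:
            fixed += 1
    return fixed / trials

print(fixation_probability(0.10))  # close to Haldane's 2s = 0.2
```

In other words, even a mutant with a 10% advantage is lost roughly four times out of five—and a 0.1% advantage would be lost about 499 times out of 500 (simulating such a small s takes far longer, which is why I don’t show it here).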

This probability is low because any gene carried by only one individual, or only a few individuals, is usually lost right away to random chance while we’re on the initial, flat part of the S-curve in the graph above. (As the number of individuals carrying the gene increases, the probability that all of them will die decreases.) So according to this naive model, we would expect individual australopithecines to have discovered meat-eating over and over again, hundreds if not thousands of times, before sheer luck finally allowed the behavior to spread throughout the population! Is that why it took millions of years to make progress?

Perhaps—but it seems doubtful. Meat-eating isn’t a single action: even if we assume that australopithecines were pure scavengers, it’s still a long, complicated sequence of behaviors involving finding suitable scraping/smashing rocks; looking for unattended carcasses; watching for their owners or other predators to return, which is probably a group behavior; grabbing any part that looked tasty; and using the rocks found earlier to help scrape off meat scraps, or to smash them open for marrow or brains. And hunting behavior is even more complex!

Of course, the naive mathematical model assumes that behavioral changes are purely under genetic control, and that individuals are not capable of learning. Since we know that the ability of humans to communicate knowledge by teaching and learning (known generally as “culture”) is greater than that of any other animal, it seems likely that the ability and inclination to learn from other australopiths was the primary mechanism by which our ancestors adapted to a new mode of life that involved survival outside the forest—including meat-eating.

Note that chimpanzees can be taught all sorts of complicated skills, including how to make Oldowan stone tools—but they don’t seem to show any particular interest in teaching other chimps what they’ve learned.

Evidence That Increased Learning Ability Was The Key Hominin Adaptation During The Late Pliocene

We’ve just established that it’s very unlikely for a behavior discovered by one individual to spread throughout the population if it’s purely driven by a genetic mutation, even if it confers a substantial survival advantage—because the mathematics show that most individual mutations, even beneficial ones, are lost.

Here’s a summary of the physical evidence that our ancestors’ behavioral change was driven, at least in large part, by the ability to learn:

  • Body mass decreased by almost half between Ardipithecus ramidus (110#, 50kg) and Australopithecus africanus (65#, 30kg). Height also decreased slightly, from 4′ (122cm) to about 3’9″ (114cm). Clearly our ancestors’ adaptation to bipedal, ground-based living outside the forest didn’t depend on being big, strong, or physically imposing!
  • None of the physical changes appear to be a specific adaptation to anything but bipedalism, or to a larger brain case: faces became flatter and less prognathic, canines became shorter and less prominent, etc.
  • Despite a much smaller body, brain size increased from 300-350cc to 420-500cc. Brains are metabolically expensive: per unit weight, they rank behind only the heart and kidneys in energy consumption, and roughly equal the GI tract (see Table 1 of Aiello 1997). Growing a larger brain while the body shrank suggests that the extra brainpower was worth its considerable metabolic cost.

Furthermore, it’s probably not a coincidence that bone marrow and brains are high in the same nutrients of which hominin brains are made—cholesterol and long-chain fats.

World Rev Nutr Diet 2001, 90:144-161.
Fatty acid composition and energy density of foods available to African hominids: evolutionary implications for human brain development.
Cordain L, Watkins BA, Mann NJ.

Scavenged ruminant brain tissue would have provided a moderate energy source and a rich source of DHA and AA. Fish would have provided a rich source of DHA and AA, but not energy, and the fossil evidence provides scant evidence for their consumption. Plant foods generally are of a low energetic density and contain virtually no DHA or AA. Because early hominids were likely not successful in hunting large ruminants, then scavenged skulls (containing brain) likely provided the greatest DHA and AA sources, and long bones (containing marrow) likely provided the concentrated energy source necessary for the evolution of a large, metabolically active brain in ancestral humans.

The learning-driven hypothesis fits with other facts we’ve already established. General-purpose intelligence is an inefficient way to solve problems:

“…Intelligence is remarkably inefficient, because it devotes metabolic energy to the ability to solve all sorts of problems, of which the overwhelming majority will never arise. This is the specialist/generalist dichotomy. Specialists do best in times of no change or slow change, where they can be absolutely efficient at exploiting a specific ecological niche, and generalists do best in times of disruption and rapid change.” –Efficiency vs. Intelligence

Yet our hominin ancestors found success via greater intelligence rather than specific adaptations—most likely because of the cooling and rapidly oscillating climate previously discussed in Part I and Part IV. I’ll quote this paper again because it’s important:

PNAS August 17, 2004 vol. 101 no. 33 12125-12129
High-resolution vegetation and climate change associated with Pliocene Australopithecus afarensis
R. Bonnefille, R. Potts, F. Chalié, D. Jolly, and O. Peyron

Through high-resolution pollen data from Hadar, Ethiopia, we show that the hominin Australopithecus afarensis accommodated to substantial environmental variability between 3.4 and 2.9 million years ago. A large biome shift, up to 5°C cooling, and a 200- to 300-mm/yr rainfall increase occurred just before 3.3 million years ago, which is consistent with a global marine δ18O isotopic shift.

We hypothesize that A. afarensis was able to accommodate to periods of directional cooling, climate stability, and high variability.

The temperature graphs show that this situation continued. How did it affect our ancestors’ habitat and mode of life?

J Hum Evol. 2002 Apr;42(4):475-97.
Faunal change, environmental variability and late Pliocene hominin evolution.
Bobe R, Behrensmeyer AK, Chapman RE.

This study provides new evidence for shifts through time in the ecological dominance of suids, cercopithecids, and bovids, and for a trend from more forested to more open woodland habitats. Superimposed on these long-term trends are two episodes of faunal change, one involving a marked shift in the abundances of different taxa at about 2.8+/-0.1 Ma, and the second the transition at 2.5 Ma from a 200-ka interval of faunal stability to marked variability over intervals of about 100 ka. The first appearance of Homo, the earliest artefacts, and the extinction of non-robust Australopithecus in the Omo sequence coincide in time with the beginning of this period of high variability. We conclude that climate change caused significant shifts in vegetation in the Omo paleo-ecosystem and is a plausible explanation for the gradual ecological change from forest to open woodland between 3.4 and 2.0 Ma, the faunal shift at 2.8+/-0.1 Ma, and the change in the tempo of faunal variability at 2.5 Ma.

In summary, 2.8 MYA is when things started to get exciting, climate-wise…and 2.6 MYA (the beginning of the Pleistocene) is when they started to get really exciting.

None of this is to say that the ability to learn was the only adaptation responsible for meat-eating: learning ability could easily have combined with other adaptations like inquisitiveness, aggressiveness, or a propensity to break things and see what happens.

Conclusion: A Tiny Difference Can Make All The Difference

  • Given the time-scale involved, a small selective advantage conferred by a small amount of meat-eating could easily have produced the selection pressure for meat-eating behavior to reach fixation in australopithecines.
  • Several lines of evidence—the mathematics of population genetics, the trends of australopithecine physical evolution, the ability of the nutrients in meat to build and nourish brains, and the increasingly colder, drier, and more variable climate—all point towards intelligence and the ability to learn (as opposed to physical power, or specific genetically-driven behavioral adaptations) being the primary source of the australopithecines’ ability to procure meat.

Don’t stop here! Continue to Part VII, “The Most Important Event In History”.

Live in freedom, live in beauty.

JS

Anti-Nutritionism, L-Canavanine, And The Limitations of N=1 Self-Experimentation

What Is Nutritionism?

While I disagree with Gyorgy Scrinis (and the popularizer of the concept, Michael Pollan) on their proposed solution, I believe Scrinis’ concept of “nutritionism” as an error in dietary thinking has merit—and I doubt anyone in the paleo community would disagree.

Reducing food to its nutrient components could be called “nutritionism”, and it has probably become the dominant way of thinking about food and health, and of constructing healthy diets.

The nutrition industry has implicitly, if not explicitly, promoted nutritionism by continually framing most research studies and dietary advice in terms of these chemical-nutrient categories.

The rise of nutritionism is clear in one of the well-known sayings promoted by the food industry and some nutritionists: “There is no such thing as good and bad foods, only good and bad diets.” According to this argument, all types of foods, including junk food, have a place in a “balanced” diet.

Marketing foods and diets on the basis of their nutritional composition tends to take attention away from the quality and the type of foods being promoted.

Processed foods, for example, are often fortified with vitamins and minerals, or stripped of some of their fat, to enable such nutrient-content claims to be made. Nutrient claims on the labels of processed foods and drinks conceal the fact these foods are typically high in added fat, sugar, salt, chemical additives and reconstituted ingredients, and have often been stripped of a range of beneficial micro-nutrients and food components.

High in protein, low in fat and too good to be true, Gyorgy Scrinis, Sydney Morning Herald, April 7, 2006

Nutritionism makes several unspoken assumptions:

  • We already know all the important nutrients and their functions.
  • The function of an isolated nutrient (even in a synthetic form not occurring in nature, e.g. folic acid) is exactly the same as its function in food, because…
  • There are no competitive or synergistic effects between the thousands of chemical compounds found in one bite of real food.
  • The effect of a food on health is reducible to its effects on the numbers obtained from cheap, easy tests like “BMI” and “total cholesterol”.
  • Therefore, so long as our diet contains the proper “nutrients”, we will be healthy and happy.

I doubt anyone in the paleo community disagrees with Scrinis (and Pollan) that nutritionism, in its modern form, is bunk. A diet of chicken nuggets, Twinkies, and Diet Coke is not nutritionally equivalent to a diet of fresh meat, fruits, and vegetables no matter how many supplements we take.

What Is Anti-Nutritionism?

Unfortunately it’s possible to fall into an analogous trap when pursuing a paleo way of life…a trap I call “anti-nutritionism”. Anti-nutritionism also makes several unspoken assumptions:

  • We already know all the important anti-nutrients and their functions.
  • The function of an isolated anti-nutrient is exactly the same as its function in food, because…
  • There are no competitive or synergistic effects between the thousands of chemical compounds found in one seed, sprout, fruit, or bite of plant or animal tissue.
  • Herbivorous, seed-eating mice—especially genetic knockout mice—are metabolically and biochemically the same as humans, and are excellent models for human digestion and metabolism.
  • Therefore, if I eat a food for six months and I don’t get any fatter or suffer obvious health problems, I can recommend it to others as healthy—and perhaps even paleo-compatible.

Food Doesn’t Want To Be Eaten

It’s tempting to believe that if a food we like doesn’t contain gluten, excessive omega-6 fats, or excessive fructose, it’s fine to eat. However, all food has defenses against being eaten—because any plant or animal that was eaten before it reproduced failed to leave descendants!

This leads us to a tautological but astonishing conclusion. Every living thing on this Earth is the descendant of millions of generations of successful ancestors—not a single one of which was eaten, trampled, gored, poisoned, burned, drowned, starved, killed by a fall or by parasites or infection, or otherwise died before it managed to reproduce at least once.

“Being eaten” certainly qualifies as a reproduction-limiting event. Animals can hide, run away, or counter-attack—but plants cannot. Therefore, we might expect their defenses to involve being disgusting, poisonous, or indigestible—particularly for seeds, their agents of reproduction.

Fruit is the exception to the rule, but there’s an unspoken bargain involved: “eat this delicious, sweet fruit, but don’t digest the seeds…poop them out somewhere else.” As we’d expect, the seeds of most sweet fruits range from bitter to frankly poisonous.

Many books, websites, and scientific papers explore the biochemistry of anti-nutrients like gluten and gliadin (found in wheat and its relatives) and lectins (found in just about every plant seed), and I won’t rehash the biochemistry here. But just as our knowledge of the nutrients in food and their function is incomplete, our knowledge of anti-nutrients is, if anything, far more incomplete.

A Partial List Of Plant Toxins

Lectins, trypsin inhibitors, antigenic proteins, cyanogens, tannins, quinolizidine alkaloids, glucosinolates, saponins, phytoestrogens, non-protein amino acids…

Today’s Unsung Anti-Nutrient: L-Canavanine

To illustrate the limitations of the paleo community’s understanding of anti-nutrients, here’s an example I’ve never seen mentioned by any paleo source: L-canavanine.

(Update: Though it makes no appearance in the literature, apparently Dr. Loren Cordain has indeed been discussing L-canavanine in his speeches and presentations. Thanks to Pedro Bastos for the correction.)

“L-canavanine is a common non-protein amino acid found naturally in alfalfa sprouts, broad beans [also known as “fava beans”], jack beans, and a number of other legume foods [including sword beans] and animal feed ingredients [1] at up to 2.4% of food dry matter. This analog of arginine (Figure 1.) can also block NO synthesis [2-5], interfere with normal ammonia disposal [6,7], charge tRNAarg, cause the synthesis of canavanyl proteins [8], as well as prevent normal reproduction in arthropods [9] and rodents [10].

Canavanine has also been reported to induce a condition that mimics systemic lupus erythematosus (SLE) in primates [11,12], to increase antibodies to nuclear components and promote SLE-like lesions in autoimmune-susceptible (e.g., (NZB X NZW)F1) mice [13].” (Brown 2005)

Stated plainly: canavanine “looks” like arginine, and is incorporated into our tissues like arginine…but the resulting proteins don’t function properly. And did I hear someone say “lupus”?

Arthritis Rheum. 1985 Jan;28(1):52-7.
Effects of L-canavanine on T cells may explain the induction of systemic lupus erythematosus by alfalfa.
Alcocer-Varela J, Iglesias A, Llorente L, Alarcón-Segovia D.

Alfalfa sprouts can induce systemic lupus erythematosus (SLE) in monkeys. This property of alfalfa sprouts has been attributed to their non-protein amino acid constituent, L-canavanine. Occurrence of autoimmune hemolytic anemia and exacerbation of SLE have been linked to ingestion of alfalfa tablets containing L-canavanine. In this report we show that L-canavanine has dose-related effects in vitro on human immunoregulatory cells, which could explain its lupus-inducing potential.

Rheum Dis Clin North Am. 1991 May;17(2):323-32.
Dietary amino acid-induced systemic lupus erythematosus.
Montanaro A, Bardana EJ Jr.

“In this article, we detail our experience with a human subject who developed autoimmune hemolytic anemia while participating in a research study that required the ingestion of alfalfa seeds. Subsequent experimental studies in primates ingesting alfalfa sprout seeds and L-canavanine (a prominent amino acid constituent of alfalfa) are presented. The results of these studies indicate a potential toxic and immunoregulatory role of L-canavanine in the induction of a systemic lupus-like disease in primates.”

L-canavanine, being an amino acid, is not deactivated by heat or cooking. So when we hear statements like “Beans are fine so long as you soak or sprout them”, it’s worth reminding ourselves that this isn’t even true according to the tiny fraction of legume biochemistry we understand—let alone the overwhelming majority we don’t.

Further Reading

J. Agric. Food Chem. 2003, 51, 2854−2865
Nonprotein Amino Acids of Plants: Significance in Medicine, Nutrition, and Agriculture
E. Arthur Bell

“Much more needs to be learned of the biological activity, the relative toxicities of these compounds to different organisms, and their nutritional value if we are to make the best use of them and the plants in which they are synthesized.”

Autoimmun Rev. 2006 Jul;5(6):429-35. Epub 2005 Dec 29.
Role of non-protein amino acid L-canavanine in autoimmunity.
Akaogi J, Barker T, Kuroda Y, Nacionales DC, Yamasaki Y, Stevens BR, Reeves WH, Satoh M.

Am J Clin Nutr November 1995 vol. 62 no. 5 1027-1028
Reply to NR Farnsworth
Victor Herbert

Also note that you’ll find a much-copied reference on the Internet claiming that canavanine toxicity is irrelevant to humans. Don’t be misled: it’s an article from a 1995 vegetarian journal which makes a host of blatantly false claims, such as “There is NO canavanine at all in other legumes that are commonly used as human food.”

Favism: A Postscript to the Fava Bean/Broad Bean Issue

Canavanine toxicity is distinct from vicine toxicity. Vicine (and its analogs convicine and isouramil) is a poison in fava beans that causes hemolytic anemia in susceptible people—a sometimes-fatal condition known as favism. Favism occurs in people with G6PD deficiency, a group of common X-linked mutations affecting over 400 million people worldwide, mostly in Africa, the Middle East, and southern Asia.

Intermission

The Limitations Of Self-Experimentation and N=1

Self-experimentation is very important, and we can learn much that is useful from it. For instance, trying to dial in carbohydrate intake can be a balancing act between weight loss, mood, and physical performance. People have found solutions to their own individual health issues via anything from egg yolks to beef liver to coconut oil to magnesium supplementation. And just coming up with a new repertoire of healthy, paleo-compatible foods to replace the pantry full of junk we used to eat involves extensive N=1 with new recipes—with immediate success not guaranteed.

However, there are limits to the knowledge we can accumulate. Stated plainly:

N=1 self-experimentation can tell us what works best for ourselves—within the limits of healthy eating, as defined by biochemistry and evolutionary context.

However, self-experimentation alone cannot tell us which foods are healthy to eat, because even a dramatic increase in lifetime risk is vanishingly unlikely to manifest itself during a few months of self-experimentation.

For instance, here’s a seemingly reasonable statement:

1. “I ate corn for six months, and I didn’t gain weight or feel worse. Therefore corn is healthy to eat.”

It’s certainly tempting to make these sorts of statements—but I find that temptation is best resisted. To illustrate why, here’s an equivalent statement that we can all agree isn’t reasonable:

2. “I started smoking six months ago, and I feel fine. Therefore smoking is healthy.”

Permit me to drive the point home with force:

3. “I started eating strontium-90 six months ago, and I haven’t got cancer yet. Therefore radiation exposure is healthy.”
4. “I started shooting heroin six months ago. It’s solved all my anxiety issues, and I’ve lost twenty pounds! Therefore shooting heroin is healthy.”
5. “I started having unprotected sex with Tanzanian hookers six months ago, and I feel great! Therefore unprotected sex with high-risk strangers is healthy.”

The reason we can identify the second through fifth statements as false is that we don’t trust the results of our own self-experimentation. We know from long-term observations that smoking greatly increases our risk of several forms of cancer and heart disease; that each Sievert of radiation exposure causes a 5-10% increase in cancer deaths (Strom 2003); that heroin addiction is almost never a controllable vice; and that HIV infection takes longer than six months to produce symptoms of AIDS—no matter how we feel in the short term.

No, I’m not directly comparing eating corn to smoking or unprotected sex with high-risk strangers! I’m demonstrating that even a substantial increase in lifetime risk is vanishingly unlikely to manifest itself within any period of self-experimentation. This is why anecdotes are useless when evaluating risk.

For example, my grandfather smoked two packs of cigarettes a day for over sixty years, dying in his 80s of a non-smoking-related illness…but that doesn’t change the fact that smokers contract lung cancer 15-20x more often than non-smokers (Thun et al. 2008), and also suffer from all types of heart disease, many other cancers, renal damage, and impotence at a far greater rate than non-smokers. And while I’ve spent plenty of time making fun of weak associations extracted from known-bad data, I do find the evidence for negative health effects from regular smoking reasonably convincing—though perhaps of smaller magnitude than claimed by typical sound-bites.

In conclusion, it’s clear that anti-nutritionism makes it easy to fall into the trap of extrapolating N=1 beyond its limits. By assuming that we already know all the important anti-nutrients, we can easily convince ourselves that a clearly Neolithic food is healthy (or, at least, harmless) just because we don’t feel any obvious harmful effects from consuming it in the short term.

To answer such questions, we need to apply science, not N=1…

…and it is very likely that the answer will not be authoritative. Scientific answers are much more likely to be of the form “There are a lot of potential toxins, but we don’t know how bad they are for humans, either singly or in combination” or “It’s analogous to something that quickly causes pancreatic cancer in rats—at 10 times a realistic dietary dose.”

That’s where evolutionary context comes in, and where I use my general rule of thumb, previously seen here:

Eat foods you could pick, dig, or spear. Mostly spear.

The Takeaways: Now What?

My intent is not to encourage anyone to become overly fearful about eating the occasional bowl of ice cream or tarka dal! I understand that even functional paleo can feel somewhat limiting at times, and that nothing will make a fresh, hot Krispy Kreme not taste delicious.

What I’m doing is cautioning my readers that no interesting or useful information comes from arguments about whose N=1 is more authoritative; I’m reiterating my own commitment to careful, rational inquiry; and, most importantly, I’m hoping to communicate my own respect, humility, and awe as one infinitesimal part of our huge, beautiful and dizzyingly complex world and the multi-billion year history of life upon it. As I said nearly a year ago:

“There is an important difference between “We don’t know all the answers yet” and “Do what feels right, man.” These questions have answers, because humans have biochemistry, and we should do our best to find them and live by the results.”

The Paleo Identity Crisis: What Is The Paleo Diet, Anyway?

Meanwhile, I will continue to do my best to find interesting and useful information at the intersection of biochemistry and evolutionary context, and I will continue to explain it as best I can to you, my readers, here at gnolls.org.

And since I like to leave my readers with a few practical takeaways, here are some useful thoughts for when you start finding even functional paleo limiting or monotonous.

  • Consider what you’ve gained, not just what you’ve lost. Sure, you can’t just binge on half a dozen crullers anymore…but you can eat all the prime rib you want without any form of guilt. How cool is that?
  • If you’re stuck in a rut of monotonous food, try some new recipes. Yes, it’ll take some time and several tries to find and perfect a new dish you like as much as your current favorites. Here’s an endless source to get you started.
  • Cheat proudly. For the most part, the dose makes the poison…so unless cheating will start you on a binge, it’s better to say “I am going to eat these street tacos because they’re delicious and I want some” than to try to convince yourself that corn is paleo.
  • Cheat intelligently. Think of a cheat as dessert: once you’ve satisfied yourself with a complete meal, you can think about a Coke or a Reese’s. Otherwise you run the risk of your cheat replacing an entire meal—and once you’ve been paleo for a while, 1200 calories worth of Krispy Kremes will most likely make you feel like you’ve contracted Ebola Zaire.
  • Live in your body. The pleasure of junk food lasts until it slides down your throat: the pleasure of good health manifests itself 24/7 in better sleep, less pain, greater mental clarity and capacity, and greater physical ability. The strong, sleek, healthy body of an apex predator is a great place to be. Instead of medicating it into passivity or becoming a sessile peripheral to your computer and television, go outside. Climb a tree, kick balls, shoot baskets. Learn a new skill. Explore somewhere you’ve never been.

    There’s a big, bright, beautiful world out there: what are you waiting for?

Atop a Sierra peak that shall remain nameless

Live in freedom, live in beauty.

JS


Yes, this one turned into another epic! Spread it like pollen with the new social clicky-popup-thing…and please support my continued efforts by making your Amazon purchases through my referral link. Did I mention that T-shirts are back in stock, in all sizes?

Big Brains Require An Explanation, Part V:
Re-Orienting Ourselves In Time, and Why Are There “Southern Apes” In Ethiopia?

In Part IV, we established the following:

  • Our ancestors’ dietary shift towards ground-based foods, and away from fruit, did not cause an increase in our ancestors’ brain size.
  • Bipedalism was necessary to allow an increase in our ancestors’ brain size, but did not cause the increase by itself.
  • Bipedalism allowed Australopithecus afarensis to spread beyond the forest, and freed its hands to carry tools. This coincided with a 20% increase in brain size from Ardipithecus, and a nearly 50% drop in body mass.
  • Therefore, the challenges of obtaining food in evolutionarily novel environments (outside the forest) most likely selected for intelligence, quickness, and tool use, and de-emphasized strength.
  • By 3.4 MYA, A. afarensis was most likely eating a paleo diet recognizable, edible, and nutritious to modern humans. (Yes, the “paleo diet” predates the Paleolithic age by at least 800,000 years!)
  • The only new item on the menu was large animal meat (including bone marrow), which was more calorie- and nutrient-dense than any other food available to A. afarensis—especially in the nutrients (e.g. animal fats, cholesterol) which make up the brain.
  • Therefore, the most parsimonious interpretation of the evidence is that the abilities to live outside the forest, and thereby to somehow procure meat from large animals, provided the selection pressure for larger brains during the middle and late Pliocene.

Keep in mind that, as always, I am presenting what I believe to be the current consensus interpretation—or, when no consensus exists, the most parsimonious interpretation.

(This is Part V of a multi-part series. Go back to Part I, Part II, Part III, or Part IV.)

Re-Orienting Ourselves In Time

Since we’re all returning to this series after a few weeks off, let’s take a minute to re-orient ourselves. Our narrative has just reached 3 MYA, between Australopithecus afarensis and Australopithecus africanus:

Timeline of hominin evolution

Click the image for more information about the chart. Yes, 'heidelbergensis' is misspelled, and 'Fire' is early by a few hundred KYA, but it's a solid resource overall.

And here’s an excellent reminder that while we’re making progress, there is much left to explain:

Graph of hominin brain size

With that in mind, let’s keep moving!

Australopithecus africanus: The Original Australopith

Back in 1924, the world still believed that the “Piltdown Man” was the “missing link” between apes and humans. In fact, Piltdown Man was a hoax, assembled from pieces of the skull of a modern human and the jaw of an orangutan—and though it was first publicized in 1912, it wasn’t universally acknowledged as a fraud until 1953. (Several paleontologists of the time had immediately voiced their doubts, and its influence gradually declined as more and more African fossils were found; by 1953, its official repudiation was basically a formality.)

Strongly contributing to the acceptance of the Piltdown hoax was the early 20th-century belief that the ancestors of humans must have been European, and that brain enlargement must have preceded bipedalism.

You can read more about “Piltdown Man”, and other paleontological controversies, in Roger Lewin’s Bones of Contention.

Unsurprisingly, the Piltdown hoax sabotaged our understanding of human evolutionary history for decades. The first casualty was the Taung child, a skull (complete with teeth) and cranial endocast discovered by quarry workers in the Taung lime mine in South Africa, and officially announced by Raymond Dart in 1925—though not universally accepted as a hominin until two decades later.

Skull of Taung child

Note the short canine teeth.


Why Are There “Southern Apes” In Ethiopia?

The first person to publish the discovery of a new animal (or its fossil) gets to name it. Anyone who names a new genus runs the risk of “their” find being reclassified into an existing genus…but Dart’s classification has stood the test of time, and later finds (such as “Plesianthropus transvaalensis”, later reclassified as another A. africanus) have been absorbed into it.

Unfortunately, the context of a fossil often changes as more and more fossils are found, and the original name can easily turn out to be inappropriate. For instance, Australopithecus means “southern ape”, because the Taung child was found in South Africa…

…and now all australopithecines, even those found in Ethiopia and Kenya, are forever known as “southern apes”. (Even worse, “australo” is Latin, while “pithecus” is Greek.)

While his naming may have been clumsy, it’s important to note that Raymond Dart was correct in several important respects: subsequent fossil finds proved A. africanus was both a hominin and fully bipedal, as Dart had always asserted.

The Taung child dates to 2.5 MYA, and Mrs. Ples (who may actually be a Mr. Ples), discovered in 1947, dates to 2.05 MYA. In total, the fossils we classify as A. africanus span nearly a million years, from 3.03 MYA to 2.05 MYA.

A. africanus vs. A. afarensis

Since we’re entering a time from which we have more fossils to study, the transitions from here on will be more gradual. A. africanus is a relatively short step away from A. afarensis, but the similarities and differences are instructive:

  • A. africanus is slightly shorter than A. afarensis: 3’9″/115cm for females, 4’6″/138cm for males. However, with so few fossils, this may simply be sampling error.
  • Body weight estimates are essentially identical: 66#/30kg for females, 90#/41kg for males. (Source for height and weight estimates.)
  • The africanus skull appears more human-like: the face is flatter and more vertical, the brow ridges are less pronounced, the cheekbones are narrower, and the forehead is more rounded.
  • Africanus teeth and jaws were more human-like than afarensis teeth and jaws: while the teeth and jaws were much larger than a modern human’s, the canines were shorter and less prominent (with no gaps between them and the incisors), and the jawline was more parabolic (human-shaped) and less prognathic. (Click here for a pictorial comparison.)
  • Most importantly, A. africanus adults had a brain volume of 420-500cc, meaningfully larger than the A. afarensis range of 380-430cc.

This implies that there was continuing selection pressure for larger brains—but not for larger bodies. We established in Part IV that the ability to somehow procure meat outside the forest most likely provided the necessary selection pressure up to that time…but what is the evidence during the time of A. africanus and beyond?
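For scale, here’s a quick back-of-the-envelope sketch of that brain-size increase, using the midpoints of the cranial-capacity ranges quoted above. (The midpoints are my own simplification for illustration; the ranges themselves are the data.)

```python
# Rough relative brain-size increase from A. afarensis to A. africanus,
# using the midpoints of the cranial-capacity ranges quoted above
# (380-430cc vs. 420-500cc), while body-mass estimates stayed the same.
afarensis_cc = (380 + 430) / 2   # 405 cc
africanus_cc = (420 + 500) / 2   # 460 cc

increase = (africanus_cc - afarensis_cc) / afarensis_cc
print(f"Brain volume up roughly {increase:.0%}, with no increase in body mass.")
```

Roughly a 14% jump in brain volume at constant body weight—which is exactly why we need a continuing source of selection pressure to explain it.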

Continue reading! Big Brains Require An Explanation, Part VI: Why Learning Is Fundamental, Even For Australopithecines

Live in freedom, live in beauty.

JS

(This is Part V of a multi-part series. Go back to Part I, Part II, Part III, or Part IV.)


I’m using a new “share” plugin: let me know if it isn’t working for you. And if anyone knows how to insert a Google +1 button that doesn’t have a counter (counters slow page loads tremendously), please let me know!