Disclaimer
• Your life and health are your own responsibility.
• Your decisions to act (or not act) based on information or advice anyone provides you—including me—are your own responsibility.
Yes, I’m serious, and I’m asking a serious question: why are we here?
By “here”, I mean “on the Internet, reading paleo and nutrition blogs, almost every day.” This describes many of us—myself included—and I had to stop and ask myself “Why am I doing this? What am I looking for?”
Are we afraid that one of our number will turn nutrition completely upside down tomorrow morning—and that a few extra days, or even a couple weeks, of eating as we do now will harm us irreparably? Will a new archaeological find prove that Paleolithic humans subsisted mostly on flowers? Will Harvard researchers publish a double-blinded trial showing that corn oil and HFCS are the foundation of a healthy human diet, and we’ve simply been deficient in them for the last six million years?
It seems unlikely.
So why the continual search for our daily “fix” of updates?
Learning Is Fundamental For Human Survival
I’ve previously made the point that the process of learning allows an animal to change its behavior in cultural time, not evolutionary time…and this can be a powerful survival technique. A moth will spiral into any light source and either immolate itself or repeatedly smash against it…until it dies, someone turns the light out, or the sun comes up. In contrast, most mammals are quite capable of learning that fire burns, thorns are sharp, and just because you can’t see a predator doesn’t mean it can’t see you. And while we’re all familiar with the myriad self-destructive behaviors exhibited by humans, we’re also capable of learning that (for instance) the child we see in a mirror is ourselves, not a stranger that always does what we’re doing—not to mention complex abstractions like algebra.
More importantly, throughout evolutionary time, we were able to learn where animals lived and how they behaved, throughout the seasons of the year and over widely varying climatic conditions; we were able to learn how to find, catch, and kill them despite being much smaller, slower, and weaker; we were able to learn how to make stone tools to kill and butcher them; we were able to learn which plants were edible, which were poisonous, and which poisonous ones could be made edible in a pinch; and we were able to learn the myriad other skills necessary to survive in environments quite hostile to hairless apes.
In other words, the process of learning allowed us to adjust our behavior to conditions—such as Ice Age Europe—completely outside our evolutionary context.
Food Associations Are Powerful
Since procuring food is the central problem of any animal’s daily survival, we would expect our learned knowledge about food to exert a powerful effect on our behavior.
I’m using words informally here: “associative learning” has a specific meaning in cognitive science, and refers only to classical and operant conditioning. Technically we’re also speaking of episodic learning, observational learning, enculturation, and so on…but, speaking in the most general terms, we’re associating smells, tastes, textures, images, sounds, and other experiences with the circumstances surrounding them.
I’ve made the point before that the circumstances surrounding food consumption are powerful determinants of whether we ‘like’ a food. The classic example is beer: almost everyone dislikes beer the first time they taste it. We’re told we simply need to ‘develop a taste’ for beer—
—which usually means drinking with friends until we start to associate its bitter taste with intoxication and positive social interactions. Note the universal context of beer commercials: beer = fun times with friends, who are all gorgeous and/or handsome.
Similarly, foods our parents fed us repeatedly as small children often give us a feeling of emotional security in later life: we call them “comfort foods”. Peanut butter and jelly sandwiches. Kraft Macaroni and Cheese. Spaghetti. Chocolate chip cookies. “Just like Mom used to make!”
Reminds you of childhood, doesn't it?
If you are a parent, consider the associations you’re creating by frequently feeding your child fast food. Yes, it’s convenient and cheap…but do you really want chicken nuggets and Mountain Dew to become your child’s “comfort food”?
We can demonstrate the important role of learned associations with foods unique to certain cultures. Do these pictures make your mouth water?
Those are fermented soybeans, called “natto”.
A chicken embryo, known as “balut”.
Unless you are Japanese or Filipino, it’s very unlikely.
The Food Associations Of Someone New To Paleo
The power of food associations is, I believe, a major reason for the “stickiness” of the online paleo community. We’ve abandoned a way of eating that has powerful positive associations:
The comfort foods we ate as a child: PB&Js, mac and cheese, spaghetti, cookies, cake …
The foods we “eat out” with friends and colleagues: sandwiches, pizza, burgers, burritos …
The “healthy foods” we feel virtuous about eating: low-fat yogurt, whole-grain bagels and cereal, soy lattes, veggieburgers …
The junk foods we know we shouldn’t eat, but which are so delicious anyway: candy, cookies, chips and crisps, cakes and pies, ice cream …
In contrast, many food associations of someone new to Paleo are negative:
Screwing up an unfamiliar recipe in your own kitchen and having to eat it anyway because you’re hungry
Being “that guy” during social occasions (“Burger with no bun. Yes, I said no bun…and can I get veggies instead of french fries? No, I don’t drink beer”)
Having to order basic dietary staples over the Internet, instead of just going to the supermarket
Being unable to just “stop and grab a bite” when you’re traveling
Being completely, utterly sick of the two or three paleo-compatible recipes you’ve figured out how to cook reliably
(Do you have any additions to these lists? Leave a comment!)
Building New Associations: Why Are We Online So Often?
I think that a primary reason paleo eaters—particularly those of us new to paleo—spend so much time online is to build positive associations with our new way of eating.
Having lost many positive associations with a fundamental survival behavior—our drive to eat—it’s easy to feel a bit “down”. Sure, we know how much better we feel physically, how much sharper we are mentally, and we’re unwilling to give that up…but it’s difficult, and often lonely, to abandon decades’ worth of positive associations. When we eat a pizza, we’re not just eating bread, cheese, veggies, and pepperoni…we’re associating the smell, taste, and texture with all the pizza parties we’ve had over the years. When we drink beer, we’re associating it with everything from college keggers to “girls’ night out” to the avalanche of commercials promising good times with beautiful people. When we eat a PB&J, we’re associating it with all the times our parents fixed us one as hungry children coming home from the playground or sports practice. And so on.
Food associations do not trump biochemistry, nor do they magically cause obesity! They can, however, affect our food choices—along with many other factors. I discuss the distinction at length here.
Some Pitfalls Of The Search For Novelty
Unfortunately, there are several pitfalls we can encounter while trying to build positive associations, via the Internet, with our new way of eating.
I believe the underlying reason for most of them is that there’s only so much new information to write about. Back in 2009, “maybe saturated fat isn’t just a waxy form of Death” was a relatively new and daring insight—as was “maybe expensive running shoes aren’t actually good for our feet,” “the Paleolithic really did last millions of years, and agriculture really is a recent development to which we’re not well-adapted,” and “the USDA’s Food Pyramid is an excellent program for producing obesity, Type 2 diabetes, and the rest of the metabolic syndrome.”
However, in 2012, the bar has been dramatically raised by the ongoing hard work of many different authors, scientists, and bloggers. It’s difficult to present an insight that no one has had before, or to find new information that’s empirically useful—and it takes a strong background in the relevant sciences, as well as a substantial time commitment, to find such information or come up with such insights. And we must face the awkward fact that, as we progress beyond the exciting first flush of discovery, we’re unlikely to produce any new insights that overturn the existing paradigm in the way that Paleo overturned the conventional wisdom.
This problem is not unique to the paleo community: it’s shared by every successful movement. What do you do when the excitement of discovery starts to wane?
Yet we’re still combing the Internet in search of…something. Like minds, moral support, better recipes—any sort of positive association to fill the voids left by our abandonment of our old eating habits. But with only so much new information to present, and only so many writers capable of presenting it, it’s easy to fall (or, even worse, to lure one’s readers) into one of several traps.
The search to imitate old favorites that are now off limits. Fake cupcakes don’t taste like real ones, and they’re still calorie bombs. If you really want a cupcake and you’re not gluten-intolerant, just eat a cupcake!
However, you’ll find that, as you eat this way for longer and you develop more positive associations with real food, your emotional cravings for childhood and comfort foods will diminish. (Though not to zero…nothing will make a Grimaldi’s or Giordano’s pizza not taste good.)
Overselling the part of the system one understands as The Key To Obesity (and everything else). Insulin, leptin, “food reward”, and the hypothalamus have all taken their turns: I predict gut flora will be the Next Big Thing. (And I’d like to remind everyone that the colon is downstream of the small intestine—so nothing about your gut flora will make gluten grains safe to eat. See, for instance, Fasano 2011.)
“Hey, look at me!”—using “Science!” as a tool to distract or obfuscate. For any specific assertion, it’s not difficult to trawl PubMed until I find a sentence in an abstract that, on the surface, appears to contradict it. See how smart I am?
The important question, of course, now becomes “Well, what should we do now?” If I can’t answer that, then why should I get everyone all worked up? And I’ll say it again: statements such as “I don’t know” and “That’s interesting, tell me more” do not diminish my stature or reputation.
Interpersonal drama. I’ve done my best to avoid it, and to keep gnolls.org safe for calm, reasoned exploration of the science behind paleo. That being said, there’s nothing wrong with some rough give-and-take…
…but it’s important to ask ourselves questions like “Is anything being accomplished here?” “If I ‘win’ this argument, is that going to change anything?” and, most importantly, “Why am I here? If I’m simply trying to make positive associations with my new way of eating, is this counterproductive?”
Once again, it’s fine to host such discussions, or even encourage them, because they help weed the garden of ideas…but we all need to ask ourselves if participating in them is furthering our own goals, or just breeding negativity.
Jealousy. Any successful movement will accumulate both hangers-on and gadflies. And I can’t resist noting that paleo’s most vociferous critics either are avowedly non-paleo, find it pseudoscientific or “too limiting”, or claim enough differences that they require their own brand—but they can’t bring themselves to leave the party, because they know it’s where the action is.
Conclusion: Two Questions To Ask Yourself
If you find yourself becoming drawn into drama, confused about scientific-sounding arguments, or otherwise feeling negative about yourself or the state of the online community, ask yourself:
“Why am I here? What am I looking for?”
If the answer is “I’m looking for positive associations to replace those I’ve lost,” then perhaps it’s time to stand up from the computer desk. Go outside. Play with your kids or your dog. Lift some barbells or kettlebells. Climb something and jump off of it.
Click for my foolproof prime rib recipe, with step-by-step directions.
…and remember that we’re not eating like predators so we can argue more effectively on the Internet. We’re eating like predators so we can live like predators—strong, healthy, alert, and vital.
Live in freedom, live in beauty.
JS
What do you think, and why are you here? Leave a comment…
…and if you find the atmosphere here congenial, feel free to join other gnolls in the forums.
While I disagree with Gyorgy Scrinis (and the popularizer of the concept, Michael Pollan) on their proposed solution, I believe Scrinis’ concept of “nutritionism” as an error in dietary thinking has merit—and I doubt anyone in the paleo community would disagree.
“Reducing food to its nutrient components could be called “nutritionism”, and it has probably become the dominant way of thinking about food and health, and of constructing healthy diets.
The nutrition industry has implicitly, if not explicitly, promoted nutritionism by continually framing most research studies and dietary advice in terms of these chemical-nutrient categories.
The rise of nutritionism is clear in one of the well-known sayings promoted by the food industry and some nutritionists: “There is no such thing as good and bad foods, only good and bad diets.” According to this argument, all types of foods, including junk food, have a place in a “balanced” diet.
…
Marketing foods and diets on the basis of their nutritional composition tends to take attention away from the quality and the type of foods being promoted.
Processed foods, for example, are often fortified with vitamins and minerals, or stripped of some of their fat, to enable such nutrient-content claims to be made. Nutrient claims on the labels of processed foods and drinks conceal the fact these foods are typically high in added fat, sugar, salt, chemical additives and reconstituted ingredients, and have often been stripped of a range of beneficial micro-nutrients and food components.”
Nutritionism makes several unspoken assumptions:
We already know all the important nutrients and their functions.
The function of an isolated nutrient (even in a synthetic form not occurring in nature, e.g. folic acid) is exactly the same as its function in food, because…
There are no competitive or synergistic effects between the thousands of chemical compounds found in one bite of real food.
The effect of a food on health is reducible to its effects on the numbers obtained from cheap, easy tests like “BMI” and “total cholesterol”.
Therefore, so long as our diet contains the proper “nutrients”, we will be healthy and happy.
I doubt anyone in the paleo community disagrees with Scrinis (and Pollan) that nutritionism, in its modern form, is bunk. A diet of chicken nuggets, Twinkies, and Diet Coke is not nutritionally equivalent to a diet of fresh meat, fruits, and vegetables no matter how many supplements we take.
What Is Anti-Nutritionism?
Unfortunately it’s possible to fall into an analogous trap when pursuing a paleo way of life…a trap I call “anti-nutritionism”. Anti-nutritionism also makes several unspoken assumptions:
We already know all the important anti-nutrients and their functions.
The function of an isolated anti-nutrient is exactly the same as its function in food, because…
There are no competitive or synergistic effects between the thousands of chemical compounds found in one seed, sprout, fruit, or bite of plant or animal tissue.
Herbivorous, seed-eating mice—especially genetic knockout mice—are metabolically and biochemically the same as humans, and are excellent models for human digestion and metabolism.
Therefore, if I eat a food for six months and I don’t get any fatter or suffer obvious health problems, I can recommend it to others as healthy—and perhaps even paleo-compatible.
Food Doesn’t Want To Be Eaten
It’s tempting to believe that if a food we like doesn’t contain gluten, excessive omega-6 fats, or excessive fructose, it’s fine to eat. However, all food has defenses against being eaten—because any plant or animal that was eaten before it reproduced failed to leave descendants!
This leads us to a tautological but astonishing conclusion. Every living thing on this Earth is the descendant of millions of generations of successful ancestors—not a single one of which was eaten, trampled, gored, poisoned, burned, drowned, starved, killed by a fall or by parasites or infection, or otherwise lost before it managed to reproduce at least once.
“Being eaten” certainly qualifies as a reproduction-limiting event. Animals can hide, run away, or counter-attack—but plants cannot. Therefore, we might expect their defenses to involve being disgusting, poisonous, or indigestible—particularly for seeds, their agents of reproduction.
Fruit is the exception to the rule, but there’s an unspoken bargain involved: “eat this delicious, sweet fruit, but don’t digest the seeds…poop them out somewhere else.” As we’d expect, the seeds of most sweet fruits range from bitter to frankly poisonous.
Many books, websites, and scientific papers explore the biochemistry of anti-nutrients like gluten and gliadin (found in wheat and its relatives) and lectins (found in just about every plant seed), and I won’t rehash the biochemistry here. But just as our knowledge of the nutrients in food and their function is incomplete, our knowledge of anti-nutrients is, if anything, far more incomplete.
To illustrate the limitations of the paleo community’s understanding of anti-nutrients, here’s an example I’ve never seen mentioned by any paleo source: L-canavanine.
(Update: Though it makes no appearance in the literature, apparently Dr. Loren Cordain has indeed been discussing L-canavanine in his speeches and presentations. Thanks to Pedro Bastos for the correction.)
“L-canavanine is a common non-protein amino acid found naturally in alfalfa sprouts, broad beans [also known as “fava beans”], jack beans, and a number of other legume foods [including sword beans] and animal feed ingredients [1] at up to 2.4% of food dry matter. This analog of arginine (Figure 1.) can also block NO synthesis [2-5], interfere with normal ammonia disposal [6,7], charge tRNA(Arg), cause the synthesis of canavanyl proteins [8], as well as prevent normal reproduction in arthropods [9] and rodents [10].
Canavanine has also been reported to induce a condition that mimics systemic lupus erythematosus (SLE) in primates [11,12], to increase antibodies to nuclear components and promote SLE-like lesions in autoimmune-susceptible (e.g., (NZB X NZW)F1) mice [13].” (Brown 2005)
Stated plainly: canavanine “looks” like arginine, and is incorporated into our tissues like arginine…but the resulting proteins don’t function properly. And did I hear someone say “lupus”?
“Alfalfa sprouts can induce systemic lupus erythematosus (SLE) in monkeys. This property of alfalfa sprouts has been attributed to their non-protein amino acid constituent, L-canavanine. Occurrence of autoimmune hemolytic anemia and exacerbation of SLE have been linked to ingestion of alfalfa tablets containing L-canavanine. In this report we show that L-canavanine has dose-related effects in vitro on human immunoregulatory cells, which could explain its lupus-inducing potential.”
Rheum Dis Clin North Am. 1991 May;17(2):323-32. Dietary amino acid-induced systemic lupus erythematosus.
Montanaro A, Bardana EJ Jr.
“In this article, we detail our experience with a human subject who developed autoimmune hemolytic anemia while participating in a research study that required the ingestion of alfalfa seeds. Subsequent experimental studies in primates ingesting alfalfa sprout seeds and L-canavanine (a prominent amino acid constituent of alfalfa) is presented. The results of these studies indicate a potential toxic and immunoregulatory role of L-canavanine in the induction of a systemic lupus-like disease in primates.”
L-canavanine, being an amino acid, is not deactivated by heat or cooking. So when we hear statements like “Beans are fine so long as you soak or sprout them”, it’s worth reminding ourselves that this isn’t even true according to the tiny fraction of legume biochemistry we understand—let alone the overwhelming majority we don’t.
“Much more needs to be learned of the biological activity, the relative toxicities of these compounds to different organisms, and their nutritional value if we are to make the best use of them and the plants in which they are synthesized.”
Am J Clin Nutr November 1995 vol. 62 no. 5 1027-1028 Reply to NR Farnsworth
Victor Herbert
Also note that you’ll find a much-copied reference on the Internet claiming that canavanine toxicity is irrelevant to humans. Don’t be misled: it’s an article from a 1995 vegetarian journal which makes a host of blatantly false claims, such as “There is NO canavanine at all in other legumes that are commonly used as human food.”
Favism: A Postscript to the Fava Bean/Broad Bean Issue
Canavanine toxicity is distinct from vicine toxicity. Vicine (and its analogs convicine and isouramil) is a poison in fava beans that causes hemolytic anemia in susceptible people—a sometimes-fatal condition known as favism. Favism is caused by G6PD deficiencies, common X-linked mutations which affect over 400 million people worldwide, mostly in Africa, the Middle East, and southern Asia.
Intermission
The Limitations Of Self-Experimentation and N=1
Self-experimentation is very important, and we can learn much that is useful from it. For instance, trying to dial in carbohydrate intake can be a balancing act between weight loss, mood, and physical performance. People have found solutions to their own individual health issues via anything from egg yolks to beef liver to coconut oil to magnesium supplementation. And just coming up with a new repertoire of healthy, paleo-compatible foods to replace the pantry full of junk we used to eat involves extensive N=1 with new recipes—with immediate success not guaranteed.
However, there are limits to the knowledge we can accumulate. Stated plainly:
N=1 self-experimentation can tell us what works best for ourselves—within the limits of healthy eating, as defined by biochemistry and evolutionary context.
However, self-experimentation alone cannot tell us which foods are healthy to eat, because even a dramatic increase in lifetime risk is vanishingly unlikely to manifest itself during a few months of self-experimentation.
For instance, here’s a seemingly reasonable statement:
1. “I ate corn for six months, and I didn’t gain weight or feel worse. Therefore corn is healthy to eat.”
It’s certainly tempting to make these sorts of statements—but I find that temptation is best resisted. To illustrate why, here’s an equivalent statement that we can all agree isn’t reasonable:
2. “I started smoking six months ago, and I feel fine. Therefore smoking is healthy.”
Permit me to drive the point home with force:
3. “I started eating strontium-90 six months ago, and I haven’t got cancer yet. Therefore radiation exposure is healthy.”
4. “I started shooting heroin six months ago. It’s solved all my anxiety issues, and I’ve lost twenty pounds! Therefore shooting heroin is healthy.”
5. “I started having unprotected sex with Tanzanian hookers six months ago, and I feel great! Therefore unprotected sex with high-risk strangers is healthy.”
The reason we can identify the second through fifth statements as false is that we don’t trust the results of our own self-experimentation. We know that long-term observations show that smoking greatly increases our risk of several forms of cancer and heart disease; each sievert of radiation exposure causes a 5-10% increase in cancer deaths (Strom 2003); heroin addiction is almost never a controllable vice; and HIV infection takes longer than six months to produce symptoms of AIDS—no matter how we feel in the short term.
No, I’m not directly comparing eating corn to smoking or unprotected sex with high-risk strangers! I’m demonstrating that even a substantial increase in lifetime risk is vanishingly unlikely to manifest itself within any period of self-experimentation. This is why anecdotes are useless when evaluating risk.
For example, my grandfather smoked two packs of cigarettes a day for over sixty years, dying in his 80s of a non-smoking-related illness…but that doesn’t change the fact that smokers contract lung cancer 15-20x more often than non-smokers (Thun et al. 2008), and also suffer from all types of heart disease, many other cancers, renal damage, and impotence at a far greater rate than non-smokers. And while I’ve spent plenty of time making fun of weak associations extracted from known-bad data, I do find the evidence for negative health effects from regular smoking reasonably convincing—though perhaps of smaller magnitude than claimed by typical sound-bites.
In conclusion, it’s clear that anti-nutritionism makes it easy to fall into the trap of extrapolating N=1 beyond its limits. By assuming that we already know all the important anti-nutrients, we can easily convince ourselves that a clearly Neolithic food is healthy (or, at least, harmless) just because we don’t feel any obvious harmful effects from consuming it in the short term.
To answer such questions, we need to apply science, not N=1…
…and it is very likely that the answer will not be authoritative. Scientific answers are much more likely to be of the form “There are a lot of potential toxins, but we don’t know how bad they are for humans, either singly or in combination” or “It’s analogous to something that quickly causes pancreatic cancer in rats—at 10 times a realistic dietary dose.”
That’s where evolutionary context comes in, and where I use my general rule of thumb, previously seen here:
Eat foods you could pick, dig, or spear. Mostly spear.
The Takeaways: Now What?
My intent is not to encourage anyone to become overly fearful about eating the occasional bowl of ice cream or tarka dal! I understand that even functional paleo can feel somewhat limiting at times, and that nothing will make a fresh, hot Krispy Kreme not taste delicious.
What I’m doing is cautioning my readers that no interesting or useful information comes from arguments about whose N=1 is more authoritative; I’m reiterating my own commitment to careful, rational inquiry; and, most importantly, I’m hoping to communicate my own respect, humility, and awe as one infinitesimal part of our huge, beautiful and dizzyingly complex world and the multi-billion year history of life upon it. As I said nearly a year ago:
“There is an important difference between “We don’t know all the answers yet” and “Do what feels right, man.” These questions have answers, because humans have biochemistry, and we should do our best to find them and live by the results.”
Meanwhile, I will continue to do my best to find interesting and useful information at the intersection of biochemistry and evolutionary context, and I will continue to explain it as best I can to you, my readers, here at gnolls.org.
And since I like to leave my readers with a few practical takeaways, here are some useful thoughts for when you start finding even functional paleo limiting or monotonous.
Consider what you’ve gained, not just what you’ve lost. Sure, you can’t just binge on half a dozen crullers anymore…but you can eat all the prime rib you want without any form of guilt. How cool is that?
If you’re stuck in a rut of monotonous food, try some new recipes. Yes, it’ll take some time and several tries to find and perfect a new dish you like as much as your current favorites. Here’s an endless source to get you started.
Cheat proudly. For the most part, the dose makes the poison…so unless cheating will start you on a binge, it’s better to say “I am going to eat these street tacos because they’re delicious and I want some” than to try to convince yourself that corn is paleo.
Cheat intelligently. Think of a cheat as dessert: once you’ve satisfied yourself with a complete meal, you can think about a Coke or a Reese’s. Otherwise you run the risk of your cheat replacing an entire meal—and once you’ve been paleo for a while, 1200 calories worth of Krispy Kremes will most likely make you feel like you’ve contracted Ebola Zaire.
Live in your body. The pleasure of junk food lasts until it slides down your throat: the pleasure of good health manifests itself 24/7 in better sleep, less pain, greater mental clarity and capacity, and greater physical ability. The strong, sleek, healthy body of an apex predator is a great place to be. Instead of medicating it into passivity or becoming a sessile peripheral to your computer and television, go outside. Climb a tree, kick balls, shoot baskets. Learn a new skill. Explore somewhere you’ve never been.
There’s a big, bright, beautiful world out there: what are you waiting for?
Live in freedom, live in beauty.
JS
Yes, this one turned into another epic! Spread it like pollen with the new social clicky-popup-thing…and please support my continued efforts by making your Amazon purchases through my referral link. Did I mention that T-shirts are back in stock, in all sizes?
Normally I’d be continuing my ongoing series on the evolutionary history of the human brain. However, there is yet another red meat scare story making the rounds—and many readers have asked me to analyze it. Should we really be eating less red meat?
I don’t like to spend my time debunking specific studies—because as I said in a previous article about bad science, it’s like trying to hold back the ocean with a blue tarp and some rebar. However, I’ve wanted to write an article about the limitations and potential abuses of observational studies for some time, and “Red Meat Consumption And Mortality” is as good a starting point as any.
What Kind Of Study Is This, Anyway? Randomized Controlled Trials Vs. Observational Studies
The first and most important question we must ask is “What actual scientific data is this article based on?” It’s often tricky to find out, because most “news” articles don’t even mention the title of the original scientific paper, let alone link to it. (I usually start with a PubMed search on the authors and narrow it down by journal.) In the overwhelming majority of cases, we’ll find that the data in question comes from what’s known as a “retrospective cohort study”.
In some cases, there isn’t any data: it’s just a lightly-camouflaged press release from a supplement peddler or drug manufacturer, designed to sell you pills, powders, or extracts. We’ll ignore those for now.
When most of us think of a scientific study, we’re thinking of a randomized controlled trial (RCT). The participants are divided randomly into two groups, matched as well as possible for age, sex, health history, smoking status, and any other factor that might affect the outcome. One group is given the treatment; the other is given no treatment.
In the more rigorous forms of RCT, the “no treatment” group is given a sham treatment (known as “placebo”) so that the subjects don’t know whether they’ve received treatment or not. This is sometimes called a “single-blinded” trial. Since the simple act of being attended to has a positive and significant effect on health (the “placebo effect”), unblinded trials (also known as “open label”) are usually not taken very seriously.
To add additional rigor, some trials are structured so that the clinicians administering the treatment don’t know which subjects are receiving the real treatment. This is sometimes called a “double-blinded” trial. And if the clinicians assessing outcomes don’t know who received the real treatment, it’s sometimes called “triple-blinded”. (These terms are now being discouraged in favor of simply calling a study “blinded” and specifying which groups have been blinded.)
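To make randomization and blinding concrete, here’s a minimal sketch in Python. It’s purely illustrative: the participant IDs and group sizes are invented, and a real trial would also verify that the resulting groups are balanced on age, sex, smoking status, and so on.

```python
import random

# Hypothetical participant pool (IDs invented for illustration).
participants = [f"subject_{i:03d}" for i in range(1, 201)]

random.seed(42)  # fixed seed so this sketch is reproducible
random.shuffle(participants)

# Random assignment: half to treatment, half to placebo.
half = len(participants) // 2
treatment, control = participants[:half], participants[half:]

# Blinding: everyone receives an opaque code instead of a group label.
# (A real trial would randomize the code list too, so that code order
# doesn't leak the assignment.)
codes = {subject: f"code_{i:03d}"
         for i, subject in enumerate(treatment + control)}

print(len(treatment), len(control))  # 100 100
```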
Double-blinded, randomized controlled trials are the gold standard of research, because they’re the only type of trial that can prove the statement “X causes Y”. Unfortunately, RCTs are expensive—especially nutrition studies, which require feeding large groups over extended periods and, to be completely rigorous, isolating the subjects so they can’t consume foods that aren’t part of the experiment. (These are usually called “metabolic ward studies”.)
Result: RCTs are infrequently done, especially in the nutrition field.
What Is An Observational Study? Cohort Studies and Cross-Sectional Studies
Since nutrition RCTs are so rare, almost all nutrition headlines are based on observational studies.
In an observational study, the investigators don’t attempt to control the behavior of the subjects: they simply collect data about what the subjects are doing on their own. There are two main types of observational studies: cohort studies identify a specific group and track it over a period of time, whereas cross-sectional studies measure characteristics of an entire population at a single point in time.
Cohort studies can be further divided into prospective cohort studies, in which the groups and study criteria are defined before the study begins, and retrospective cohort studies, in which existing data is “mined” after the fact for possible associations. (More.)
As the terminology starts getting intense (e.g. case-control studies vs. nested case-control studies), I’ll stop here.
The overwhelming majority of nutrition headlines are from cohort studies, in which health data has been collected for years (or decades) from a fixed group of people, often with no specific goal in mind. Expressed in the simplest possible language:
“Let’s watch the same group of people for decades, measure some things every once in a while, and see what happens to them. Then we can go back through the data and see if the people with a specific health issue had anything else in common.”
It’s easy to see that looking for statistical associations in data that already exists is far easier and cheaper than performing a randomized clinical trial. Unfortunately, there are several problems with observational studies. The first, and most damning, is that observational studies cannot prove that anything is the cause of anything else! They can only show an association between two or more factors—
—and that association may not mean what we think it means. In fact, it may not mean anything at all!
There are more potential pitfalls of the retrospective observational studies which underlie almost every nutrition headline. Let’s explore some of them.
Problem: Sampling Bias
Here’s the classic example of sampling bias:
Going into the 1948 presidential election, polls consistently predicted a Dewey victory, by a substantial margin of 5-15%. Of course, Harry S Truman won by 4.4%. The reason the poll results differed so much from the actual outcome was that the polling was done by telephone—and in 1948, private telephone lines were very expensive. Therefore, the pollsters were unwittingly selecting only the relatively wealthy—who tended to vote Republican—for their survey. (More: DEWEY DEFEATS TRUMAN and Cancer Statistics, J Natl Cancer Inst (2009) 101 (16): 1157.)
In other words, the entire group we’re studying may have inadvertently been selected for certain characteristics that skew our results, making them inapplicable to the population at large.
Selection Bias, or The Healthy Volunteer Problem
“Selection bias” occurs because, unlike an RCT in which the participants are randomly assigned to groups that are matched as well as possible, the people in an observational study choose their own behavior.
Most women will be familiar with the classic story of selection bias: the saga of hormone replacement therapy, or HRT.
1991: “Every woman should get on HRT immediately, because it prevents heart attacks!” 2002: “Every woman should get off HRT immediately, because it causes heart attacks!”
“…the pooled estimate of effect from the best quality observational studies (internally controlled prospective and angiographic studies) inferred a relative reduction of 50% with ever [sic] use of HRT and stated that ‘overall, the bulk of the evidence strongly supports a protective effect of estrogens that is unlikely to be explained by confounding factors’.
By contrast, recent randomized trials among both women with established CHD and healthy women have found HRT to be associated with slightly increased risk of CHD or null effects. For example, the large Women’s Health Initiative (WHI) randomized trial found that the hazards ratio for CHD associated with being allocated to combined HRT was 1.29 (95% CI: 1.02, 1.63), after 5.2 years of follow-up.”
How did a 50% reduction in CHD (coronary heart disease) turn into a 30% increase in CHD?
It’s because the initial data from 1991 was from the Nurses’ Health Study, an associative cohort study which could only answer the question “What are the health characteristics of nurses who choose to undergo HRT versus nurses who don’t?” The followup data from 2002 was from a randomized clinical trial, which answered the much more relevant question “What happens to two matched groups of women when one undergoes HRT and the other doesn’t?”
It turns out that the effect of selection bias—women voluntarily choosing to be early adopters of a then-experimental procedure—completely overwhelmed the actual health effects of HRT. In other words, nurses who were willing to undergo cutting-edge medical treatment were far healthier than nurses who weren’t.
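A quick note on the arithmetic behind these percentages: a hazard ratio of 1.29 corresponds to roughly 29% more events in the treated group, which is where the “30% increase” comes from. Here’s a minimal sketch of how a relative risk is computed. The event counts are round, hypothetical numbers (not the WHI data), and a true hazard ratio is computed from event rates over follow-up time rather than simple totals, though the interpretation is similar.

```python
# Hypothetical round numbers, purely to illustrate the arithmetic
# (these are not the WHI data):
treated_events, treated_n = 129, 10_000
control_events, control_n = 100, 10_000

treated_rate = treated_events / treated_n  # 0.0129
control_rate = control_events / control_n  # 0.0100

relative_risk = treated_rate / control_rate
print(f"relative risk = {relative_risk:.2f}")  # 1.29 -> ~29% more events
```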
Is The Data Any Good? Garbage In = Garbage Out
This huge pitfall of observational studies is often neglected: in large cohort studies, data is often self-reported, and self-reported data is often wildly inaccurate.
Since we’re already discussing the Nurses’ Health Study, let’s take a closer look at its food consumption data. This study attempted to rigorously evaluate the accuracy of the FFQs (Food Frequency Questionnaires) filled out by study participants:
“The reproducibility and validity of responses for 55 specific foods and beverages on a self-administered food frequency questionnaire were evaluated. One hundred and seventy three women from the Nurses’ Health Study completed the questionnaire twice approximately 12 months apart and also recorded their food consumption for seven consecutive days, four times during the one-year interval.”
In other words, the standard FFQ for the Nurses’ Health Study consists of “Try to remember what you ate last year, on average.” We might expect this not to be terribly accurate…
…and we’d be right.
“They found that the FFQ predicted true intake of some foods very well and true intake of other foods very poorly. True intake of coffee could explain 55 percent of the variation in answers on the FFQ, while true intake of beer could explain almost 70 percent. True intake of skim milk and butter both explained about 45 percent, while eggs followed closely behind at 41 percent.
But the ability of the FFQ to predict true intake of meats was horrible. It was only 19 percent for bacon, 14 percent for skinless chicken, 12 percent for fish and meat, 11 percent for processed meats, 5 percent for chicken with skin, 4 percent for hot dogs, and 1.4 percent for hamburgers.
If your jaw just dropped, let me assure you that you read that right and it is not a typo. The true intake of hamburgers explained only 1.4 percent of the variation in people’s claims on the FFQ about how often they ate hamburgers!”
–“Will Eating Meat Make Us Die Younger?”, Chris Masterjohn, March 27, 2009
Stop for a moment and wrap your mind around this fact: the intake of meat reported by the hundreds of studies which use data mined from the Nurses’ Health Study is almost completely unrelated to how much meat the study participants actually ate.
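For readers wondering what “explained only 1.4 percent of the variation” means: it’s the squared correlation (R²) between true intake and reported intake. Here’s a minimal sketch with made-up numbers (it assumes Python 3.10+ for statistics.correlation):

```python
import statistics

# Made-up weekly hamburger counts: true intake vs. FFQ-reported intake.
true_intake = [0, 1, 2, 3, 4, 5, 6, 7]
reported    = [3, 1, 4, 2, 5, 2, 3, 4]  # nearly unrelated to the truth

r = statistics.correlation(true_intake, reported)
print(f"R^2 = {r * r:.3f}")  # ~0.097: reported intake "explains" under
                             # 10% of the variation in true intake here
```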
Here’s a graph of the ugly truth, again from Chris Masterjohn:
The left-hand bars are the first questionnaire, which we'd expect to be closer to the reported data in the NHS than the second questionnaire (right-hand bars).
Why might this be the case?
“Focusing on the second questionnaire, we found that butter, whole milk, eggs, processed meat, and cold breakfast cereal were underestimated by 10 to 30% on the questionnaire. In contrast, a number of fruits and vegetables, yoghurt and fish were overestimated by at least 50%. These findings for specific foods suggest that participants over-reported consumption of foods often considered desirable or healthy, such as fruit and vegetables, and underestimated foods considered less desirable.” –Salvini et al., via Chris Masterjohn
In support, I note that reported intake of yellow squash and spinach was also correlated by less than 10% with actual intake. Additionally, I’ll point you towards this article, which begins with a startling statistic: 64% of self-reported ‘vegetarians’ in the USA ate meat on at least one of the two days on which their dietary intake was surveyed.
In other words, the observational studies that cite meat intake data from the Nurses’ Health Study are not telling you about the health of nurses who actually eat meat: they’re telling you about the health of nurses who are willing to admit to eating meat on a written questionnaire—and the two are almost completely unrelated. Furthermore, I see no basis to claim that any other data set based on occasional self-reported dietary intake will be substantially more accurate.
“Correlation Does Not Imply Causation”: What Does That Mean?
The logical fallacy of “correlation proves causation” is extremely common—because it’s very easy to slide into.
It’s called “cum hoc ergo propter hoc” in Latin, if you want to impress people at the risk of being pedantic. Literally translated, it means “with this, therefore because of this.”
In plain language, “correlation does not imply causation” means “Just because two things vary in a similar way over time doesn’t mean one is causing the other.” Since observational studies can only prove correlation, not causation, almost every nutrition article which claims “X Causes Y” is factually wrong. The only statements we can make from an observational study are “X Associated With Y” or “X Linked With Y”.
We’ve already covered the cases in which sampling bias and selection bias skew the results, and the cases in which the data is inaccurate: let’s look at the purely logical pitfalls.
First, we could be dealing with a case of reverse causation. (“I always see lots of firemen at fires: therefore, firemen cause fires and we should outlaw firemen.”)
Second, we could be dealing with a third factor. “Sleeping with one’s shoes on is strongly correlated with waking up with a headache. Therefore, sleeping with one’s shoes on causes headaches.” Obviously, in this case, being drunk causes both…but when we’re looking at huge associative data sets and trying to learn more about diseases we don’t understand, the truth isn’t so obvious.
The third factor is often one of the pitfalls we’ve previously discussed: sampling bias, selection bias, or inaccurate data. Another example of selection bias: “Playing basketball is strongly correlated with being tall. Therefore, everyone should play basketball so they grow taller.” (Hat tip to Tom Naughton for the analogy.)
Or, the relationship could be a spurious relationship—pure coincidence.
Personally, I blame M. Night Shyamalan.
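If you’d like to see a third factor manufacture a correlation out of thin air, here’s a minimal simulation of the shoes-and-headaches example above. Neither “shoes on” nor “headache” influences the other; both depend only on “drunk” (all probabilities are invented):

```python
import random

random.seed(0)
n = 10_000

drunk = [random.random() < 0.3 for _ in range(n)]
# Each outcome depends only on "drunk", never on the other outcome:
shoes_on = [d and random.random() < 0.8 for d in drunk]
headache = [d and random.random() < 0.7 for d in drunk]

both = sum(s and h for s, h in zip(shoes_on, headache))
print(f"P(headache | shoes on) = {both / sum(shoes_on):.2f}")  # ~0.70
print(f"P(headache overall)    = {sum(headache) / n:.2f}")     # ~0.21
# Sleeping with shoes on "triples" the risk of headache, despite
# having no causal connection to it whatsoever.
```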
Complete Bunk: It Happens
There is also the possibility that the truth is being stretched or broken…that the data is being misrepresented. This isn’t as common with peer-reviewed science as it is with books and popular media (see Denise Minger’s debunking of “The China Study” and “Forks Over Knives”), but it can occur, and it has.
“Red Meat Blamed For 1 In 10 Early Deaths”: Where’s The Science?
Now that we understand the limitations and potential pitfalls of observational studies, we can rationally evaluate the claims of the news articles based on them. For example, here’s the actual study on which the latest round of “Red Meat Will Kill You” stories is based:
Arch Intern Med. doi:10.1001/archinternmed.2011.2287 Red Meat Consumption and Mortality: Results From 2 Prospective Cohort Studies
An Pan, PhD; Qi Sun, MD, ScD; Adam M. Bernstein, MD, ScD; Matthias B. Schulze, DrPH; JoAnn E. Manson, MD, DrPH; Meir J. Stampfer, MD, DrPH; Walter C. Willett, MD, DrPH; Frank B. Hu, MD, PhD
“Prospective cohort study?” Apparently this is yet another observational study—and therefore, it cannot be used to prove anything or claim that anything “causes” anything else. Correlation is not causation.
Unfortunately, while the study authors maintain this distinction, it’s quickly lost when it comes time to write newspaper articles. Here’s a typical representative:
Headline: “Red meat is blamed for one in 10 early deaths” (The Daily Telegraph)
[False. Since Pan et al. is an observational study, we can’t assign blame.]
“Eating steak increases the risk of early death by 12%.”
[Another false statement: associational studies cannot prove causation.]
“The study found that cutting the amount of red meat in peoples’ diets to 1.5 ounces (42 grams) a day, equivalent to one large steak a week, could prevent almost one in 10 early deaths in men and one in 13 in women.”
[Note the weasel words “could prevent”. Just like playing basketball could make you taller, but it won’t. And just like HRT could have prevented heart attacks: instead, it caused them.]
“Replacing red meat with poultry, fish or vegetables, whole grains and other healthy foods cut the risk of dying by up to one fifth, the study found.”
[No, it didn’t. The risk of dying was associated with self-reported intake of red meat and “healthy foods”.]
“But that’s just definitional nitpicking,” you say. “What about that 12% association?” It’s not nitpicking at all—because we’ve just opened the door to explaining that association in many other ways.
What Does “Red Meat Consumption and Mortality” (Pan et al.) Really Tell Us?
“We prospectively observed 37 698 men from the Health Professionals Follow-up Study (1986-2008) and 83 644 women from the Nurses’ Health Study (1980-2008) who were free of cardiovascular disease (CVD) and cancer at baseline. Diet was assessed by validated food frequency questionnaires and updated every 4 years.”
–Pan et al.
Remember the Nurses’ Health Study?
The same study we talked about above—which was used to claim that HRT decreased heart disease by 50%, while a controlled trial showed that HRT actually increased heart disease by 30%?
The same study we talked about above—for which we’ve already proven, using peer-reviewed research, that the self-reported meat consumption data from the “food frequency questionnaires” was unrelated to how much meat the nurses actually ate? And that the nurses, like most of us, exaggerated their intake of foods they thought were healthy by over 50%, and understated their intake of foods they thought were unhealthy (like red meat) by up to 30%?
Yes, we’ve just kicked the legs out from under this entire study. It’s pinning a 12% variation in death rate on data we’ve already proven to be off by -30% to +50%—and more importantly, to be unrelated to the nurses’ actual consumption of red meat. (Or of meat in general…even chicken was only recalled with 5-14% accuracy.)
So much for the headlines! Here’s an accurate statement, based on the actual data from Pan et.al.:
“If you are a nurse or other health professional, telling the truth about how much red meat you eat, on a survey you fill out once every four years, is associated with a 12% increased risk of early death.”
And just to nail this down, here’s another study—also from the Harvard School of Public Health—which comes to the opposite conclusion:
“Red meat intake was not associated with CHD (n=4 studies; relative risk per 100-g serving per day=1.00; 95% confidence interval, 0.81 to 1.23; P for heterogeneity=0.36) or diabetes mellitus (n=5; relative risk=1.16; 95% confidence interval, 0.92 to 1.46…)”
But Wait, There’s More
We’re done, and I could easily stop here—but there’s more to talk about! Note this surprising statement from the “Results” section:
“Additional adjustment for saturated fat and cholesterol moderately attenuated the association between red meat intake and risk of CVD death, and the pooled HR (95% CI) dropped from 1.16 (1.12-1.20) to 1.12 (1.07-1.18).”
–Pan et al. (Credit to “wildwabbit” at Paleohacks for catching this one.)
And the data from Table 1 clearly shows that the people who admitted to eating the most red meat had, by far, the lowest cholesterol levels.
Wait, what? Aren’t saturated fat and cholesterol supposed to cause heart disease? This is another clue that neither the story nor the data is quite as advertised.
Here’s another trick that’s been played with the data: contrary to the statement “replacing 1 serving of total red meat with 1 serving of fish, poultry, nuts, legumes, low-fat dairy products, or whole grains daily was associated with a lower risk of total mortality”, the curve they draw in Figure 1 has been dramatically, er, “smoothed.” The source data, in Table 2, shows that the age-adjusted quintiles of reported unprocessed red meat intake from the Nurses’ Health Study (remember, we’ve already proven these numbers aren’t real) have hazard ratios of 1.00, 1.05, 0.98, 1.09, and 1.48.
In other words, the data isn’t a smooth curve…it’s a hockey stick, and the relative risk is basically 1.0 except for the top quintile. (Credit to Roger C at Paleohacks for catching this one.)
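You can check the “hockey stick” yourself from the Table 2 numbers quoted above; a few lines of Python make the shape obvious:

```python
# Age-adjusted hazard ratios by quintile of reported unprocessed red
# meat intake (NHS data, Pan et al., Table 2, as quoted above):
quintile_hrs = [1.00, 1.05, 0.98, 1.09, 1.48]

for q, hr in enumerate(quintile_hrs, start=1):
    bar = "#" * round((hr - 0.90) * 50)
    print(f"Q{q}: HR {hr:.2f} {bar}")

# Q1-Q4 hover around 1.0; only the top quintile jumps. That's not a
# smooth dose-response curve: it's a hockey stick.
```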
This is important because it helps us to explain the 12% increase based on reported red meat consumption. We already know that the subjects of the study weren’t truthfully reporting their meat intake of any kind—and that foods perceived unhealthy were underreported on average, while foods reported healthy were overreported on average.
Table 1 shows that the highest quintile of reported red meat consumption was strongly associated with other behaviors and characteristics known to be associated with poor health: smoking, drinking, high BMI. Most impressively, it was associated with a 69% increase (NHS data) or 44% increase (HPF data) in reported total calories per day, which lends weight to the idea that the lower quintiles were simply underreporting their intake of foods they considered “unhealthy”, including red meat…
…unless we accept that 1/5 of nurses live on 1200 calories per day (and coincidentally report the lowest red meat intake) while 1/5 eat over 2000 calories per day (and coincidentally report the highest red meat intake).
Calorie consumption is our smoking gun. The average American female aged 20-59 consumes approximately 1900 calories/day, and not all nurses are female. (Source: NHANES 1999-2000, through the CDC.)
Therefore, a reported average consumption of 1200 calories/day is extremely implausible. It’s even less plausible that nurses who reported the lowest intake of red meat just happened to be on a 1200-calorie semi-starvation diet; that total reported calorie intake just happened to rise dramatically with reported red meat intake; and that only the nurses who reported eating the most red meat consumed a statistically average number of total calories.
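A quick sanity check, using only figures already cited above, shows how large the implied underreporting is:

```python
# Figures cited above: the lowest quintile reported ~1200 kcal/day,
# while the NHANES 1999-2000 average for US women aged 20-59 is ~1900.
reported_kcal = 1200
nhanes_average_kcal = 1900

shortfall = 1 - reported_kcal / nhanes_average_kcal
print(f"implied underreporting: {shortfall:.0%}")  # ~37% of intake missing
```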
Since we already know from Salvini et al. that actual consumption is unrelated to reported consumption, underreporting of red meat and other foods perceived as “unhealthy” by the lower quintiles is a far more reasonable explanation.
So What’s The Real Story?
While we’ll probably never know the truth, I believe the most parsimonious explanation is this:
Nurses and other health professionals know intimately the mainstream advice on health, and cannot fail to have given it to thousands of patients over the decades: “eat less, stop smoking, drink less alcohol, avoid cholesterol, avoid saturated fat, avoid red meat.” Therefore, any health professional willing to admit in writing to smoking, drinking, and eating over three servings of red meat per day (see the NHS data in Table 1) most likely doesn’t care very much about their own state of health.
And just as we saw with the HRT data—where a theoretical 50% decrease in heart disease was later proven to mask a real 30% increase, due to the selection bias inherent in the very same dataset (Nurses’ Health Study) used here—I think that we’ll someday find out through controlled, randomized trials that eating plenty of red meat, eggs, and other whole, natural foods high in cholesterol and saturated fat is the real “heart-healthy diet.”
What an epic this turned out to be! Please use the buttons below to forward this article around, and please send this link to anyone who sends you one of the innumerable scare stories. And if you learn of other solid debunking articles I can link, contact me or leave a comment!
You can support my continued efforts to bring you these dense, informative articles by buying a copy of my “Bold, fresh and interesting,” “Elegantly terse,” “Scathing yet beautiful,” “Funny, provocative, entertaining, fun, insightful” book, The Gnoll Credo—or even a T-shirt.