Dwarf Wheat: is it good for us?

A friend recently suggested that dwarf grains might be a key component in the recent explosion of health conditions like obesity and gluten (or other wheat-related) sensitivities.

According to Wikipedia:

The Green Revolution, or Third Agricultural Revolution, is a set of research and technology transfer initiatives occurring between 1950 and the late 1960s, that increased agricultural production worldwide, particularly in the developing world, beginning most markedly in the late 1960s.[1] The initiatives resulted in the adoption of new technologies, including high-yielding varieties (HYVs) of cereals, especially dwarf wheats and rices, in association with chemical fertilizers and agro-chemicals, and with controlled water-supply (usually involving irrigation) and new methods of cultivation, including mechanization.

Most people would say that this has been good because we now have a lot fewer people starving to death. We also have a lot more fat people. There’s an obvious link, inasmuch as it is much easier to be fat if there is more food around, but we’re investigating a less obvious link: does the nutritional/other content of new wheat varieties contribute to certain modern health problems?

Continuing with Wikipedia:

The novel technological development of the Green Revolution was the production of novel wheat cultivars. Agronomists bred cultivars of maize, wheat, and rice that are generally referred to as HYVs or “high-yielding varieties”. HYVs have higher nitrogen-absorbing potential than other varieties. Since cereals that absorbed extra nitrogen would typically lodge, or fall over before harvest, semi-dwarfing genes were bred into their genomes. …

Dr. Norman Borlaug, who is usually recognized as the “Father of the Green Revolution”, bred rust-resistant cultivars which have strong and firm stems, preventing them from falling over under extreme weather at high levels of fertilization. … These programs successfully doubled the harvest in these countries.[40]

Plant scientists figured out several parameters related to the high yield and identified the related genes which control the plant height and tiller number.[43] … Stem growth in the mutant background is significantly reduced, leading to the dwarf phenotype. Photosynthetic investment in the stem is reduced dramatically as the shorter plants are inherently more stable mechanically. Assimilates become redirected to grain production, amplifying in particular the effect of chemical fertilizers on commercial yield.

HYVs significantly outperform traditional varieties in the presence of adequate irrigation, pesticides, and fertilizers. In the absence of these inputs, traditional varieties may outperform HYVs.

In other words, if you breed a variety of wheat (or rice, or whatever) that takes up nutrients really fast and grows really fast, it tends to get top-heavy and fall over. Your wheat then lies on the ground and gets all soggy and rotten and is impossible to use. But if you make your fast-growing wheat shorter, by crossing it with some short (dwarf) varieties, it doesn’t fall over and it can devote even more of its energy to making nice, fat wheat berries instead of long, thin stems.

(I find it interesting that a lot of this research was done in Mexico. Incidentally, Mexico is also one of the fattest countries–on average–in the world.)

But we are talking about making the plant grow faster than it normally would, via the intake of more than usual levels of nutrients. This requires the use of more fertilizers, as these varieties can’t grow properly otherwise.

I’ve just started researching this, so I’m just reading papers and posting some links/quotes/summaries.

Elevating optimal human nutrition to a central goal of plant breeding and production of plant-based foods:

…  However, deficiencies in certain amino acids, minerals, vitamins and fatty acids in staple crops, and animal diets derived from them, have aggravated the problem of malnutrition and the increasing incidence of certain chronic diseases in nominally well-nourished people (the so-called diseases of civilization). …

The inadequacy of cereal grains as a primary food for humans arises from the fundamentals of plant physiology. … Their carbohydrate, protein and lipid profiles reflect the specific requirements for seed and seedling survival. This nutrient profile, especially after selection during domestication [], is far from optimal for human or animal nutrition. For example, the seeds of most cultivated plants contain much higher concentrations of omega-6 fatty acids than omega-3 fatty acids than is desirable for human nutrition [], with few exceptions such as flax, camelina (Camelina sativa) and walnuts. …

The authors then describe what’s up with the fats–for plants to germinate in colder temperatures, they need more omega-3s, which are more liquid at colder temperatures. Plants in warmer climates don’t need omega-3s, so they have more omega-6s. (Presumably omega-6s are more heat tolerant, making them more stable during high-temperature cooking.)

Flax and walnut have low smokepoints (that is, they start turning to smoke at low temperatures) and so are unsuited to high-temperature cooking. People prefer to cook with oils that can withstand higher temperatures, like peanut, soy, corn, and canola.

I think one of the issues with fast food (and perhaps restaurant food in general) is that it needs to be cooked fast, which means it needs to be cooked at high temperatures, which requires the use of oils with high smokepoints, which are not necessarily the best for human health. The same food cooked more slowly at lower temperatures might be just fine, though.

There is a side issue: while oil smoke is unpleasant and bad, the high-temperature oils that don’t smoke aren’t necessarily any better, because I suspect they undergo other undesirable internal changes instead of smoking.

Then there’s the downstream matter of the feed cattle and chickens are getting. My impression of cattle raising (from having walked around a cattle ranch a few times) is that most cattle eat naturally growing pasture grass most of the time, because buying feed and shipping it out to them is way too expensive. This grass is not human feed and is not fungible with human feed, because growing food for humans requires more effort (and water) than just letting cows wander around in the grass. Modern crops require a lot of water and fertilizer to grow properly (see the Wikipedia quote above.) This is why I am not convinced by the vegetarian argument that we could produce a lot more food for humans if we stopped producing cows–cattle feed and human feed are not energy/resource equivalent.

However, once the cows are grown, they are generally sent to feedlots to be fattened up before slaughter. Here they are given corn and other grains. The varieties of grains they are fed at this point may influence the nature of the fats they subsequently build:

Modern grain-fed meat and grain-rich diets are particularly abundant in omega-6 fatty acids, and it is thought that a deficiency of omega-3 fatty acids, especially the EPA and DHA found in fish oils, can be linked to many of the inflammatory diseases of the western diet, such as cardiovascular disease and arthritis (). DHA has been recognized as being vitally important for brain function, and a deficiency of this fatty acid has been linked to depression, cognitive disorders, and mental illness ().

Let’s get back to the article about plant breeding. I thought this was interesting:

The biological basis of protein limitation in seed-based foods appears to be the result of evolutionary strategies that plants use to build storage proteins. Seed storage proteins have evolved to store amino nitrogen polymerized in compact forms, i.e. in storage proteins such as zein in maize, gluten in wheat and hordein in barley. As the seed germinates, enzymes hydrolyze the storage proteins and the plant is able to use these stored amino acids as precursors to re-synthesize all of the twenty amino acids needed for de novo protein synthesis.

So if we make plants that absorb more nitrogen, and we dump a lot more nitrogen on them, do we get wheat with more gluten in it?

Another book I read, Nourishing Traditions, which is really a cookbook, claims that our ancestors generally ate their grains already sprouted. This was more accidental than on purpose–grains often sat around in storage, got wet, and sprouted. Sprouting (or germinating) makes the wheat use stored gluten to make amino acids. Between sprouting, fermentation (sourdough bread) and less nitrogen-loving wheat varieties, our ancestors’ breads and porridges may have had less gluten than ours.

Another issue:

In the laboratory of the first author we have taken two different approaches to improving the protein quality of crops. First, we successfully selected a series of high lysine wheat cultivars over a period of twenty years, by standard breeding methods []. …  Surviving embryos consistently had elevated levels of lysine relative to parental populations and the seed produced from these embryos also had increased levels of lysine. The increased nutritional value of these lines, however, carried a cost in terms of lower total yield. A striking result was that grasshoppers, aphids, rats and deer preferentially feasted on the foliage of these high lysine wheats in the field, rather than on neighboring conventional low lysine wheats. The highest lysine wheat had the highest predation and subsequently the lowest yield (D.C. Sands, unpublished field observations). … Thus, we are led to the hypothesis that selection for insect resistance may have inadvertently resulted in the selection for lower nutritional value…

Then the authors talk about peas, of Gregor Mendel fame. Peas come in two varieties: wrinkled and smooth. The smooth, plump ones look nicer (and probably taste sweeter). The plump ones store sugar in a form that we digest more quickly, resulting in faster increases in blood sugar–and sugars digested quickly are more likely to get stored as fat.

Breeders and buyers are biased toward plump seeds and tubers, in peas and many other crops.

Incidentally, the outside of the wheat grain–the part we discard when producing white flours but keep when making “whole” wheat flour–contains phytates (which interfere with iron absorption) and other irritants designed by the plant to increase the chance of grazers passing the seed out the other end without digestion. (However, the creation of white flours may remove other nutrients.)

It’s getting late, so I’d better wrap up. The authors end by noting that fermentation is another way to potentially increase the nutritional content of foods and suggest a variety of ways scientists could make grains or yeasts that enhance fermentation.

A few more studies:

The nutritional value of crop residue components from several wheat cultivars grown at different fertilizer levels:

Nine wheat cultivars were grown at two test sites in Saskatoon, each at fertilizer levels of 0, 56, and 224 kg N ha−1. Proportions of leaf, stem, chaff and grain were obtained for each level. Significant cultivar differences were observed at each site for plant component yields. A significant increase in the proportion of leaf components and a significant decrease in the proportion of the grain components was observed as soil nitrogen levels increased. Crude protein contents of plant components varied significantly with both cultivar and fertilizer level. Significant differences in digestibility in vitro also existed among cultivars. Increasing fertilizer levels significantly improved the digestibility in vitro of the leaf but not of the chaff.

Genetic differences in the copper nutrition of cereals:

Seven wheat genotypes, one of barley and one of oats were compared for their sensitivity to suboptimal supplies of copper, and their ability to recover from copper deficiency when copper was applied at defined stages of growth. Copper deficiency delayed maturity, reduced the straw yield and severely depressed the grain yield in all genotypes. …

Genotypes with relatively higher yield potential were less sensitive to copper deficiency than those with lower yield potential … There was no apparent association between dwarfness and sensitivity to copper deficiency in wheat.

An article suggesting we should eat emmer wheat instead of modern cultivars:

… The production and food-relevant use of domesticated modern-day wheat varieties face increasing challenges such as the decline in crop yield due to adverse fluctuating climatic trends, and a need to improve the nutritional and phytochemical content of the grain, both of which are a result of centuries of crop domestication and advancement of dietary calorie requirements demanding new high-yield dwarf varieties in the last five decades. The focus on improving phenotypic traits such as grain size and grain yield towards calorie-driven macronutrients has inadvertently led to a loss of allelic function and genetic diversity in modern-day wheat, which suffers from poor tolerance to biotic and abiotic stresses, as well as poor nutritional and phytochemical profiles against high-calorie-driven non-communicable chronic diseases (NCDs). The low baseline phytochemical profile of modern-day wheat varieties along with highly mechanized post-harvest processing have resulted in poor health-relevant nutritional qualities in end products against emerging NCDs. …

Ancient wheat, such as emmer with its large genetic diversity, high phytochemical content, and better nutritional and health-relevant bioactive profiles, is a suitable candidate to address these nutritional securities…

There’s a lot of information about emmer wheat nutrition in this article/book.


Bluing Meat

Inspired by a question from Littlefoot, I went out to do a little sleuthing:

I remember an anthropology professor reminiscing about buying food at open air markets somewhere in Africa, where refrigeration is non-existent, the meat is simply out in the heat, and “somehow everyone doesn’t die.” It’s a bit strange to us, because we’re inundated with messages that improper food handling will lead to the growth of horrible bacteria and death (I even refrigerate the eggs and butter, even though our grandmothers never did and the French still don’t,) but our ancestors not only managed without refrigeration, sometimes they actually tried to make the meat rot on purpose.

Helpful Twitter user Stefan Beldie explains that traditionally, pheasants were killed, eviscerated, and then hung for 4-10 days, depending on the weather.

The phrase “bluing meat” is extremely rare these days (try googling it,) but extra-rare or uncooked meat is still referred to as “blue”:

Temperatures for beef, veal, lamb steaks and roasts
Extra-rare or Blue (bleu): very red; 46–49 °C (115–125 °F)

Of course, there is a bit of difference between food that is merely uncooked/barely cooked, and food that has been intentionally allowed to rot.

Here’s the tale of an Inuit (Eskimo) delicacy, walrus meat that has been allowed to decompose in a hole in the ground for a year (though I suspect not much decomposition happens for about half the year up in the Arctic).

Before you judge, remember that cheese is really just rotten vomit.

Have you ever heard the story that early modern Brits used a bunch of spices on their meat to cover up the taste of rot?

It turns out that this is a myth, a tall tale created by people misunderstanding cookbooks that gave instructions for properly rotting meat before eating it:

One of the most pervasive myths about medieval food is that medieval cooks used lots of spices to cover up the taste of rotten meat. This belief is often presented in the popular media as fact, with no cited references. Occasionally though a source is mentioned, and the trail invariably leads to:

The Englishman’s Food: Five Centuries of English Diet
J.C. Drummond, Anne Wilbraham
First published by Jonathan Cape Ltd 1939

Drummond claimed,

… It is not surprising to find that the recipe books of these times give numerous suggestions for making tainted meat edible. Washing with vinegar was an obvious, and one of the commonest procedures. A somewhat startling piece of advice is given in the curious collection of recipes and miscellaneous information published under the title of The Jewell House of Art and Nature by ‘Hugh Platt, of Lincolnes Inne Gentleman’ in 1594. If you had venison that was ‘greene’ you were recommended to ‘cut out all the bones, and bury [it] in a thin olde coarse cloth a yard deep in the ground for 12 or 20 houres’. It would then, he asserted, ‘bee sweet enough to be eaten’.” 

As Daniel Myers notes, washing with vinegar was not done to reduce spoilage, but to tenderize and get rid of the “gamey” taste of some meats. As for burying your meat to make it less spoiled, this is clearly absurd:

The example that Drummond does give is most certainly not for dealing with spoiled meat. He misinterprets the word “greene” to mean spoiled, when in fact it has the exact opposite meaning – unripe. Venison, along with a number of other meats, is traditionally hung to age for two or three days after butchering to help tenderize it and to improve the flavor. With this simple knowledge in mind, Platt’s instructions are clearly a way to take a freshly butchered carcass and speed up the aging process so that it may be eaten sooner.

Similar instructions for rapidly aging poultry can be found in Ménagier de Paris.

Item, to age capons and hens, you should bleed them through their beaks and immediately put them in a pail of very cold water, holding them all the way under, and they will be aged that same day as if they had been killed and hung two days ago.

The goal of these recipes is not to cover up rot, but to speed up the rotting (or “aging”) process.

Myers also notes that the idea of putting spices on rotten meat is also absurd because spices were horribly expensive–often worth their weight in gold. It would be rather like someone looking at a gold-leaf wrapped caviar and concluding the gold was there to distract the peasants from the fact that fish eggs are disgusting. You would have completely misread the dish. In the Medieval case, it would be cheaper to buy fresh meat than to dump spices on it.

Now, to be clear, what I’ve been calling “rot” is really more “aging.” We only think of it as rotting because we are accustomed to throwing everything in the refrigerator as soon as we get it.

As the Omaha World Herald explains:

Three factors affect the tenderness of meat in all animals, whether it be beef cattle or pheasant: background toughness, rigor mortis and aging the meat.

Background toughness results from the amount of collagen (connective tissue) in and between muscle fibers. The amount of collagen, as well as the interconnectivity of the collagen, increases as animals get older, explaining why an old rooster is naturally tougher than a young bird. Rigor mortis is the partial contracting and tightening of muscle fibers in animals after death and results from chemical changes in the muscle cells. Depending on temperature and other factors, rigor mortis typically sets in a few hours after death and maximum muscle contraction is reached 12 to 24 hours after death. Rigor mortis then begins to subside, which is when the aging (tenderization) of the meat begins.

Tenderization results from pH changes in the muscle cells after death that allow naturally occurring proteinase enzymes in cells to become active. These enzymes break down collagen, resulting in more tender meat. In beef cattle, the aging process will continue at a constant rate up to 14 days, as long as the meat is held at a proper and consistent temperature, and then decreases after that. In fowl, the rate of tenderization begins to decline after a few days.

A common misconception is that bacteria-caused rotting is responsible for meat tenderization, and this is why many find the thought of aging game repugnant. … Maintaining a constant, cool temperature is key to preventing bacterial growth when aging meats. The sickness causing E. coli bacteria grows rapidly at temperatures at or above 60 F, but very slowly at 50 F.

It’s a very good article; RTWT.

From Littlefoot we have an article that delves into the technical side of things: Microbiological Changes in the Uneviscerated Bird Hung at 10 degrees C with Particular Reference to the Pheasant:

Several interesting results from this study. They hung up both pheasants and chickens. The pheasants showed very little microbial growth in the first two weeks, whereas the chickens started turning green on day five. This is probably a result of chickens having more bacteria in them to start with, a side effect of the crowded, disease-ridden conditions chickens are typically raised in.

A taste testing panel found that pheasants that had hung for at least three days tasted better than ones that had not, with some panel members preferring birds that had aged considerably longer.

So if you plan on hunting pheasant any time soon, consider letting it age for a few days before eating it–carefully, of course. Don’t give yourself food poisoning.

Everything I’ve Read about Food, Summed up in One Graph:

A few years ago I went through a nutrition kick and read about a dozen books about food. Today I came across a graph that perfectly represents what I learned:

Basically, everything will kill you.

There are three major schools of thought on what’s wrong with modern diets: 1. fats, 2. carbs (sugars,) or 3. proteins.

Unfortunately, all food is composed of fats+carbs+proteins.

Ultimately, the best advice I came across was just to stop stressing out. We don’t really know the best foods to eat, and a lot of official health advice that people have tried to follow actually turned out to be quite bad, but we have a decent intuition that you shouldn’t eat cupcakes for lunch.

Dieting doesn’t really do much for the vast majority of people, but it’s a huge industry that sucks up a ton of time and money. How much you weigh has a lot more to do with factors outside of your control, like genetics or whether there’s a famine going on in your area right now.

You’re probably not going to do yourself any favors stressing out about food or eating a bunch of things you don’t like.

Remember the 20/80 rule: 80% of the effect comes from 20% of the effort, and vice versa. Eating reasonable quantities of good food and avoiding junk will do far more good than substituting chicken breast for chicken thighs in everything you cook.

There is definitely an ethnic component to diet–eg, people whose ancestors historically ate grain are better adapted to it than people who didn’t. So if you’re eating a whole bunch of stuff your ancestors didn’t and you don’t feel so good, that may be the problem.

Personally, I am wary of refined sugars in my foods, but I am very sensitive to sugars. (I don’t even drink juice.) But this may just be me. Pay attention to your body and how you feel after eating different kinds of food, and eat what makes you feel good.

Weight, Taste, and Politics: A Theory of Republican Over-Indulgence

So I was thinking about taste (flavor) and disgust (emotion.)

As I mentioned about a month ago, 25% of people are “supertasters,” that is, better at tasting than the other 75% of people. Supertasters experience flavors more intensely than ordinary tasters, resulting in a preference for “bland” food (food with too much flavor is “overwhelming” to them.) They also have a more difficult time getting used to new foods.

One of my work acquaintances of many years –we’ll call her Echo–is obese, constantly on a diet, and constantly eats sweets. She knows she should eat vegetables and tries to do so, but finds them bitter and unpleasant, and so the general outcome is as you expect: she doesn’t eat them.

Since I find most vegetables quite tasty, I find this attitude very strange–but I am willing to admit that I may be the one with unusual attitudes toward food.

Echo is also quite conservative.

This got me thinking about vegetarians vs. people who think vegetarians are crazy. Why (aside from novelty of the idea) should vegetarians be liberals? Why aren’t vegetarians just people who happen to really like vegetables?

What if there were something in preference for vegetables themselves that correlated with political ideology?

Certainly we can theorize that “supertaster” => “vegetables taste bitter” => “dislike of vegetables” => “thinks vegetarians are crazy.” (Some supertasters might think meat tastes bad, but anecdotal evidence doesn’t support this.) See also Wikipedia, where supertasting is clearly associated with responses to plants:

Any evolutionary advantage to supertasting is unclear. In some environments, heightened taste response, particularly to bitterness, would represent an important advantage in avoiding potentially toxic plant alkaloids. In other environments, increased response to bitterness may have limited the range of palatable foods. …

Although individual food preference for supertasters cannot be typified, documented examples for either lessened preference or consumption include:

Mushrooms? Echo was just complaining about mushrooms.

Let’s talk about disgust. Disgust is an important reaction to things that might infect or poison you, triggering reactions from scrunching up your face to vomiting (ie, expelling the poison.) We process disgust in our amygdalas, and some people appear to have bigger or smaller amygdalas than others, with the result that the folks with bigger amygdalas feel more disgust.

Humans also route a variety of social situations through their amygdalas, resulting in the feeling of “disgust” in response to things that are not rotten food, like other people’s sexual behaviors, criminals, or particularly unattractive people. People with larger amygdalas also tend to find more human behaviors disgusting, and this disgust correlates with social conservatism.

To what extent are “taste” and “disgust” independent of each other? I don’t know; perhaps they are intimately linked into a single feedback system, where disgust and taste sensitivity cause each other, or perhaps they are relatively independent, so that a few unlucky people are both super-sensitive to taste and easily disgusted.

People who find other people’s behavior disgusting and off-putting may also be people who find flavors overwhelming, prefer bland or sweet foods over bitter ones, think vegetables are icky, vegetarians are crazy, and struggle to stay on diets.

What’s that, you say, I’ve just constructed a just-so story?

Well, this is the part where I go looking for evidence. It turns out that obesity and political orientation do correlate:

Michael Shin and William McCarthy, researchers from UCLA, have found an association between counties with higher levels of support for the 2012 Republican presidential candidate and higher levels of obesity in those counties.

Shin and McCarthy’s map of obesity vs. political orientation

Looks like the Mormons and Southern blacks are outliers.

(I don’t really like maps like this for displaying data; I would much prefer a simple graph showing orientation on one axis and obesity on the other, with each county as a datapoint.)
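That kind of graph is easy to describe in code. Here is a minimal sketch in Python using entirely invented county numbers (not Shin and McCarthy’s actual data); it builds the (vote share, obesity rate) pairs you would put on a scatter plot and computes the Pearson correlation you could annotate it with:

```python
import random

# Hypothetical county-level data, invented purely for illustration:
# Republican vote share (%) and adult obesity rate (%) for each county.
random.seed(42)
counties = []
for _ in range(500):
    vote_share = random.uniform(20, 80)
    # Assume a weak positive relationship plus noise (an assumption, not a finding).
    obesity = 20 + 0.15 * vote_share + random.gauss(0, 4)
    counties.append((vote_share, obesity))

def pearson(pairs):
    """Pearson correlation coefficient for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs)
    sx = sum((x - mx) ** 2 for x, _ in pairs) ** 0.5
    sy = sum((y - my) ** 2 for _, y in pairs) ** 0.5
    return cov / (sx * sy)

r = pearson(counties)
print(f"Pearson r across {len(counties)} hypothetical counties: {r:.2f}")
```

With the real dataset you would simply swap in the actual county pairs, plot one dot per county, and report r alongside the chart.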

(Unsurprisingly, the first 49 hits I got when searching for correlations between political orientation and obesity were almost all about what other people think of fat people, not what fat people think. This is probably because researchers tend to be skinny people who want to fight “fat phobia” but aren’t actually interested in the opinions of fat people.)

The 15 most caffeinated cities, from I love Coffee–note that Phoenix is #7, not #1.

Disgust also correlates with political belief, but we already knew that.

A not entirely scientific survey also indicates that liberals seem to like vegetables better than conservatives:

  • Liberals are 28 percent more likely than conservatives to eat fresh fruit daily, and 17 percent more likely to eat toast or a bagel in the morning, while conservatives are 20 percent more likely to skip breakfast.
  • Ten percent of liberals surveyed indicated they are vegetarians, compared with 3 percent of conservatives.
  • Liberals are 28 percent more likely than conservatives to enjoy beer, with 60 percent of liberals indicating they like beer.

(See above where Wikipedia noted that supertasters dislike beer.) I will also note that coffee, which supertasters tend to dislike because it is too bitter, is very popular in the ultra-liberal cities of Portland and Seattle, whereas heavily sweetened iced tea is practically the official beverage of the South.

The only remaining question is whether supertasters are conservative. That may take some research.

Update: I have not found, to my disappointment, a simple study that just looks at correlation between ideology and supertasting (or nontasting.) However, I have found a couple of useful items.

In Verbal priming and taste sensitivity make moral transgressions gross, Herz writes:

Standard tests of disgust sensitivity, a questionnaire developed for this research assessing different types of moral transgressions (nonvisceral, implied-visceral, visceral) with the terms “angry” and “grossed-out,” and a taste sensitivity test of 6-n-propylthiouracil (PROP) were administered to 102 participants. [PROP is commonly used to test for “supertasters.”] Results confirmed past findings that the more sensitive to PROP a participant was the more disgusted they were by visceral, but not moral, disgust elicitors. Importantly, the findings newly revealed that taste sensitivity had no bearing on evaluations of moral transgressions, regardless of their visceral nature, when “angry” was the emotion primed. However, when “grossed-out” was primed for evaluating moral violations, the more intense PROP tasted to a participant the more “grossed-out” they were by all transgressions. Women were generally more disgust sensitive and morally condemning than men, … The present findings support the proposition that moral and visceral disgust do not share a common oral origin, but show that linguistic priming can transform a moral transgression into a viscerally repulsive event and that susceptibility to this priming varies as a function of an individual’s sensitivity to the origins of visceral disgust—bitter taste. [bold mine.]

In other words, supertasters are more easily disgusted, and with verbal priming will transfer that disgust to moral transgressions. (And easily disgusted people tend to be conservatives.)

The Effect of Calorie Information on Consumers’ Food Choice: Sources of Observed Gender Heterogeneity, by Heiman and Lowengart, states:

While previous studies found that inherited taste-blindness to bitter compounds such as PROP may be a risk factor for obesity, this literature has been hotly disputed (Keller et al. 2010).

(Always remember, of course, that a great many social-science studies ultimately do not replicate.)

I’ll let you know if I find anything else.

Why do women love cupcakes?


One of my kids enjoys watching YouTube cooking videos, and they’re nearly 100% women making cakes.

Women’s magazines focus exclusively on 4 topics: men, fashion, diets, and cupcakes. You might think that diets and cupcakes are incompatible, but women’s magazines believe otherwise:


Just in case it’s not clear, that is not a watermelon. It is cake, cleverly disguised as a watermelon.

(YouTube has videos that show you how to make much better cake watermelons–for starters, you want red velvet cake for the middle, not just frosting…)

Magazines specifically aimed at “people who want to make cakes” are also overwhelmingly feminine. Whether we’re talking wedding cakes or chocolate cravings, apple pastries or donuts, sweets and women just seem to go together.

If men’s magazines ever feature food, I bet they’re steak and BBQ. (*Image searches*)

The meat-related articles do appear to be a little more gender-neutral than the cupcake-related articles–probably because men don’t tend to decorate their steaks with tiny baseball bats cut out of steak the way women like to decorate their cakes with tiny flowers made out of frosting.

It’s almost as if women have some kind of overwhelming craving for fats and sugars that men don’t really share.

I was talking with a friend recently about their workplace, where, “All of the women are on diets, but none of them can stay on their diets because they are all constantly eating at their workstations.” Further inquiries revealed that yes, they are eating sweets and pastries, not cashews and carrots, and that there is some kind of “office culture” of all of the women eating pastries together.

The irony here is pretty obvious.

Even many (most?) specialty “diet” foods are designed to still taste sweet. “Fat-free” yogurt is marketed as a health food even though it has as much sugar in it as a bowl of ice cream. Women are so attracted to the taste of sweet sodas, they drink disgusting Diet Coke. Dieting websites advise us that cake topped with fruit is “healthy.”

When men diet, they think “eat nothing but protein until ketosis kicks in” sounds like a great idea. When women diet, they want fat-free ice cream.

I don’t think it is just “women lack willpower.” (Or at least, not willpower in the sense of something people have much control over.) Rather, I think that men and women actually have substantially different food cravings.

So do children, for that matter.

Throughout most of human history, from hunter-gatherers to agriculturalists, the vast majority of women have specialized in obtaining (gathering, tending, harvesting,) plants. (The only exceptions are societies where people don’t eat plants, like the Inuit and the Masai, and our modern society, where most of us aren’t involved in food production.) By contrast, men have specialized in hunting, raising, and butchering animals–not because they were trying to hog the protein or had some sexist ideas about food production, but because animals tend to be bigger and heavier than women can easily lift. Dragging home and butchering large game requires significant strength.

I am inventing a “Just So” story, of course. But it seems sensible enough that each gender evolved a tendency to crave the particular kinds of foods it was most adept at obtaining.

Exercise wears down muscles, and protein is necessary to build them back up, so active lifestyles require protein. Our male ancestors’ most important activities were most likely heavy labor (eg, building huts, hauling firewood, butchering game,) and defending the tribe. Our female ancestors’ most important activities were giving birth and nursing children (we would not exist had they not, after all.) For these activities, women want to be fat. It’s not good enough to put on weight after you get pregnant, when the growing fetus is already dependent on its mother for nutrients. Far better for a woman to be plump before she gets pregnant (and to stay that way long after.)

Of course, this is “fat” by historical standards, not modern American standards.

I suspect, therefore, that women are naturally inclined to eat as much as possible of sweet foods in order to put on weight in preparation for pregnancy and lactation–only today, the average woman has 2 pregnancies instead of 12, and so instead of turning that extra weight into children and milk, it just builds up.

Obviously we are talking about a relatively small effect on food preferences, both because our ancestors could not afford to be too picky about what they ate, and because the genetic difference between men and women is slight–not like the difference between humans and lizards, say.

Interestingly, gender expression in humans appears to basically be female by default. If, by random chance, you are born with only one X chromosome, (instead of the normal XX or XY,) you can still survive. Sure, you’ll be short, you probably won’t menstruate, and you’ll likely have a variety of other issues, but you’ll be alive. By contrast, if you received only a Y chromosome from your parents and no accompanying X, you wouldn’t be here reading this post. You can’t survive with just a Y. Too many necessary proteins are encoded on the X.

Gender differences show up even in fetuses, but don’t become a huge deal until puberty, when the production of androgens and estrogens really cranks up.

Take muscle development: muscle development relies on the production of androgens (eg, testosterone.) Grownups produce more androgens than small children, and men produce more than women. Children can exercise and certainly children who do daily farm chores are stronger than children who sit on their butts watching TV all day, but children can’t do intense strength-training because they just don’t produce enough androgens to build big muscles. Women, likewise, produce fewer androgens, and so cannot build muscles at the same rate as men, though obviously they are stronger than children.

At puberty, boys begin producing the androgens that allow them to build muscles and become significantly stronger than girls.

Sans androgens, even XY people develop as female. (See Androgen Insensitivity Syndrome, in which people with XY chromosomes cannot absorb the androgens their bodies create, and so develop as female.) Children produce some androgens (obviously,) but not nearly as many as adults. Pre-pubescent boys, therefore, are more “feminine,” biologically, than post-pubescent men; puberty induces maleness.

All children seem pretty much obsessed with sweets, far more than adults. If allowed, they will happily eat cake until they vomit.

Even though food seems like a realm where evolution would heavily influence our tastes, it’s pretty obvious that culture has a huge effect. I doubt Jews have a natural aversion to pork or Hindus to beef. Whether you think chicken hearts are tasty or vomitous is almost entirely dependent on whether or not they are a common food in your culture.

But small children are blissfully less attuned to culture than grownups. Like little id machines, they spit out strained peas and throw them on the floor. They do not care about our notion that “vegetables are good for you.” This from someone who’ll eat bird poop if you let them.

The child’s affection for sweets, therefore, I suspect is completely natural and instinctual. Before the invention of refined sugars and modern food distribution systems, it probably kept them alive and healthy. Remember that the whole reason grownups try to eat more vegetables is that vegetables are low in calories. Grownups have larger stomachs and so can eat more than children, allowing them to extract adequate calories from low-calorie foods, but small children do not and cannot. In developing countries, children still have trouble getting enough calories despite abundant food in areas where that food is low-calorie plants, which they just cannot physically eat enough of. Children, therefore, are obsessed with high-calorie foods.

At puberty, this instinct changes for boys–orienting them more toward protein sources, which they are going to have to expend a lot of energy trying to haul back to their families for the rest of their lives, but stays basically unchanged in females.

ETA: I have found two more sources/items of relevance:

Calorie information effects on consumers’ food choices: Sources of observed gender heterogeneity, by Heiman and Lowengart:

When it comes to what we eat, men and women behave differently: Men consume more beef, eggs, and poultry; while women eat more fruits and vegetables and consume less fat than do men. … The gender differences in preferences for healthier foods begin in childhood. Previous literature has found that girls choose healthier food and are fonder of fruits and vegetables than are boys. Boys rated beef, processed meat, and eggs as more desirable than did girls. …

Sensory (taste) differences between the genders are the second most widely ventured explanation for the differences in food choices, although it is not clear that such genetic differences actually exist. While the popular media argue that females prefer sweetness and dislike bitterness, while males may enjoy bitterness, academic literature on this matter is less conclusive. The bitter taste receptor gene, TAS2R38, has been associated with the ability to taste PROP (6-n-propylthiouracil), one source of genetic variation in PROP and PTC taste. Individuals who experience bitterness strongly are assumed to also experience sweetness strongly relative to those who experience PROP as only slightly bitter. While previous studies found that inherited taste-blindness to bitter compounds such as PROP may be a risk factor for obesity, this literature has been hotly disputed.

The distribution of perceived bitterness of PROP differs among women and men, as does the correlation between genetic taste measures and acceptance of sweetness. A higher percentage of women are PROP and PTC tasters, sensing bitterness above threshold. It has been suggested that women are more likely to be supertasters, or those who taste with far greater intensity than average.

(I have removed the in-line citations for ease of reading; please refer to the original if you want them.)



Well, I don’t remember where this graph came from, but it looks like my intuitions were pretty good: males and females both have very low levels of testosterone during childhood, and during puberty their levels become radically different.

Potato Madness

This is a potato.

Bake it, and you have a healthy, nutritious dinner that you can serve your family and feel good about.

That seems simple enough.

However, the potato would like to clarify some potential confusion about its culinary uses:


This is a potato that has been chopped up and deep-fried.

Make it from scratch in the morning, and you are not just a good mom, but an excellent mom.

You can’t really eat them for dinner, unless you’re at IHOP or celebrating Chanukah.


This is a potato that has been chopped up, deep-fried, and served with a side of pickled tomatoes.

It is never for breakfast, except maybe if you are on a roadtrip and there’s nothing else available. If so, we will pretend it never happened.

Serve it for breakfast any other time, and you are a bad mom.

It is fine for lunch, though.

This is a potato that has been chopped up and baked.

You can never, ever serve it for breakfast. In fact, you are a bad mom if you even think about serving it for breakfast.

It is not a meal at all!

Why, this potato is so unhealthy, you should probably never eat it at all.


There. I hope that clears everything up.


Is there a correlation between intelligence and taste?

(I am annoyed by the lack of bands between 1200 and 1350)

De gustibus non disputandum est. — Confucius

We’re talking about foods, not whether you prefer Beethoven or Lil’ Wayne.

Certainly there are broad correlations between the foods people enjoy and their ethnicity/social class. If you know whether I chose fried okra, chicken feet, gefilte fish, escargot, or grasshoppers for dinner, you can make a pretty good guess about my background. (Actually, I have eaten all of these things. The grasshoppers were over-salted, but otherwise fine.) The world’s plethora of tasty (and not-so-tasty) cuisines is due primarily to regional variations in what grows well where (not a lot of chili peppers growing up in Nunavut, Canada,) and cost (the rich can always afford fancier fare than the poor,) with a side dish of seemingly random cultural taboos like “don’t eat pork” or “don’t eat cows” or “don’t eat grasshoppers.”

But do people vary in their experience of taste? Does intelligence influence how you perceive your meal, driving smarter (or less-smart) people to seek out particular flavor profiles or combinations? Or could there be other psychological or neurological factors at play in people’s eating decisions?

This post was inspired by a meal my husband, an older relative, and I shared recently at McDonald’s. It had been a while since we’d last patronized McDonald’s, but the older relative likes their burgers, so we went and ordered some new-to-us variety of meat-on-a-bun. As my husband and I sat there, deconstructing the novel taste experience and comparing it to other burgers, the older relative gave us this look of “Jeez, the idiots are discussing the flavor of a burger! Just eat it already!”

As we dined later that evening at my nemesis, Olive Garden, I began wondering whether we actually experienced the food the same way. Perhaps there is something in people that makes them prefer bland, predictable food. Perhaps some people are better at discerning different flavors, and the people who cannot discern them end up with worse food because they can’t tell?

Unfortunately, it appears that not a lot of people have studied whether there is any sort of correlation between IQ and taste (or smell.) There’s a fair amount of research on taste (and smell,) like “do relatives of schizophrenics have impaired senses of smell?” (More on Schizophrenics and their decreased ability to smell) or “can we get fat kids to eat more vegetables?” Oh, and apparently the nature of auditory hallucinations in epileptics varies with IQ (IIRC.) But not much that directly addresses the question.

I did find two references that, somewhat in passing, noted that they found no relationship between taste and IQ, but these weren’t studies designed to test for that. For example, in A Food Study of Monotony, published in 1958 (you know I am really looking for sources when I have to go back to 1958,) researchers restricted the diets of military personnel employed at an army hospital to only 4 menus to see how quickly and badly they’d get bored of the food. They found no correlation between boredom and IQ, but people employed at an army hospital are probably pre-selected for being pretty bright (and having certain personality traits in common, including ability to stand army food.)

Interestingly, three traits did correlate with (or against) boredom:

Fatter people got bored fastest (the authors speculate that they care the most about their food,) while depressed and feminine men (all subjects in the study were men) got bored the least. Depressed people are already disinterested in food, so it is hard to get less-interested, but no explanation was given of what they meant by “femininity” or how this might affect food preferences. (Also, the hypochondriacs got bored quickly.)

Some foods inspire boredom (or even disgust) quickly, while others are virtually immune. Milk and bread, for example, can be eaten every day without complaint (though you might get bored if bread were your only food.) Potted meat, by contrast, gets old fast.

Likewise, Personality Traits and Eating Habits (warning PDF) notes that:

Although self-reported eating practices were not associated with educational level, intelligence, nor various indices of psychopathology, they were related to the demographic variables of gender and age: older participants reported eating more fiber in their diets than did younger ones, and women reported more avoidance of fats from meats than did men.

Self-reported eating habits may not be all that reliable, though.

Autistic children do seem to be worse at distinguishing flavors (and smells) than non-autistic children, eg Olfaction and Taste Processing in Autism:

Participants with autism were significantly less accurate than control participants in identifying sour tastes and marginally less accurate for bitter tastes, but they were not different in identifying sweet and salty stimuli. … Olfactory identification was significantly worse among participants with autism. … True differences exist in taste and olfactory identification in autism. Impairment in taste identification with normal detection thresholds suggests cortical, rather than brainstem dysfunction.

(Another study of the eating habits of autistic kids found that the pickier ones were rated by their parents as more severely impaired than the less picky ones, but then severe food aversions are a form of life impairment. By the way, do not tell the parents of an autistic kid, “oh, he’ll eat when he’s hungry.” They will probably respond politely, but mentally they are stabbing you.)

On brainstem vs. cortical function–it appears that we do some of our basic flavor identification way down in the most instinctual part of the brain, as Facial Expressions in Response to Taste and Smell Stimulation explores. The authors found that pretty much everyone makes the same faces in response to sweet, sour, and bitter flavors–whites and blacks, old people and newborns, retarded people and blind people, even premature infants, blind infants, and infants born missing most of their brains. All of which is another point in favor of my theory that disgust is real. (And if that is not enough science of taste for you, I recommend Place and Taste Aversion Learning, in which animals with brain lesions lost their fear of new foods.)

Genetics obviously plays a role in taste. If you are one of the 14% or so of people who think cilantro tastes like soap (and I sympathize, because cilantro definitely tastes like soap,) then you’ve already discovered this in a very practical way. Genetics also obviously determine whether you continue producing the enzyme for milk digestion after infancy (lactase persistence). According to Why are you a picky eater? Blame genes, brains, and breastmilk:

In many cases, mom and dad have only themselves to blame for unwittingly passing on the genes that can govern finicky tastes. Studies show that genes play a major role in determining who becomes a picky eater, including recent research on a group of 4- to 7-year-old twins. Part of the pickiness can be attributed to specific genes that govern taste. Variants of the TAS2R38 gene, for example, have been found to encode for taste receptors that determine how strongly someone tastes bitter flavors.

Researchers at Philadelphia’s Monell Chemical Senses Center, a scientific institute dedicated to the study of smell and taste, have found that this same gene also predicts the strength of sweet-tooth cravings among children. Kids who were more sensitive to bitterness preferred sugary foods and drinks. However, adults with the bitter receptor genes remained picky about bitter foods but did not prefer more sweets, the Monell study found. This suggests that sometimes age and experience can override genetics.

I suspect that there is actually a sound biological, evolutionary reason why kids crave sweets more than grownups, and this desire for sweets is somewhat “turned off” as we age.


From a review of Why some like it hot: Food, Genetics, and Cultural Diversity:

Ethnobotanist Gary Paul Nabhan suggests that diet had a key role in human evolution, specifically, that human genetic diversity is predominately a product of regional differences in ancestral diets. Chemical compounds found within animals and plants varied depending on climate. These compounds induced changes in gene expression, which can vary depending on the amount within the particular food and its availability. The Agricultural Age led to further diet-based genetic diversity. Cultivation of foods led to the development of novel plants and animals that were not available in the ancestral environment. …

There are other fascinating examples of gene-diet interaction. Culturally specific recipes, semi-quantitative blending of locally available foods and herbs, and cooking directions needed in order to reduce toxins present in plants, emerged over time through a process of trial-and error and were transmitted through the ages. The effects on genes by foods can be extremely complex given the range of plant-derived compounds available within a given region. The advent of agriculture is suggested to have overridden natural selection by random changes in the environment. The results of human-driven selection can be highly unexpected. …

In sedentary herding societies, drinking water was frequently contaminated by livestock waste. The author suggests in order to avoid contaminated water, beverages made with fermented grains or fruit were drunk instead. Thus, alcohol resistance was selected for in populations that herded animals, such as Europeans. By contrast, those groups which did not practice herding, such as East Asians and Native Americans, did not need to utilize alcohol as a water substitute and are highly sensitive to the effects of alcohol.

Speaking of genetics:

From Eating Green could be in your Genes

Indians and Africans are much more likely than Europeans and native South Americans to have an allele that lets them eat a vegetarian diet:

The vegetarian allele evolved in populations that have eaten a plant-based diet over hundreds of generations. The adaptation allows these people to efficiently process omega-3 and omega-6 fatty acids and convert them into compounds essential for early brain development and controlling inflammation. In populations that live on plant-based diets, this genetic variation provided an advantage and was positively selected in those groups.

In Inuit populations of Greenland, the researchers uncovered that a previously identified adaptation is opposite to the one found in long-standing vegetarian populations: While the vegetarian allele has an insertion of 22 bases (a base is a building block of DNA) within the gene, this insertion was found to be deleted in the seafood allele.

Of course, this sort of thing inspires a wealth of pop-psych investigations like Dr. Hirsch’s What Flavor is your Personality?  (from a review:

Dr. Hirsh, neurological director of the Smell and Taste Research and Treatment Foundation in Chicago, stands by his book that is based on over 24 years of scientific study and tests on more than 18,000 people’s food choices and personalities.)

that nonetheless may have some basis in fact, eg: Personality may predict if you like spicy foods:

Byrnes assessed the group using the Arnett Inventory of Sensation Seeking (AISS), a test for the personality trait of sensation-seeking, defined as desiring novel and intense stimulation and presumed to contribute to risk preferences. Those in the group who score above the mean AISS score are considered more open to risks and new experiences, while those scoring below the mean are considered less open to those things.

The subjects were given 25 micrometers of capsaicin, the active component of chili peppers, and asked to rate how much they liked a spicy meal as the burn from the capsaicin increased in intensity. Those in the group who fell below the mean AISS rapidly disliked the meal as the burn increased. People who were above the mean AISS had a consistently high liking of the meal even as the burn increased. Those in the mean group liked the meal less as the burn increased, but not nearly as rapidly as those below the mean.

And then there are the roughly 25% of us who are “supertasters“:

A supertaster is a person who experiences the sense of taste with far greater intensity than average. Women are more likely to be supertasters, as are those from Asia, South America and Africa.[1] The cause of this heightened response is unknown, although it is thought to be related to the presence of the TAS2R38 gene, the ability to taste PROP and PTC, and at least in part, due to an increased number of fungiform papillae.[2]

Perhaps the global distribution of supertasters is related to the distribution of vegetarian-friendly alleles. It’s not surprising that women are more likely to be supertasters, as they have a better sense of smell than men. What may be surprising is that supertasters tend not to be foodies who delight in flavoring their foods with all sorts of new spices, but instead tend toward more restricted, bland diets. Because their sense of taste is essentially on overdrive, flavors that taste “mild” to most people taste “overwhelming” on their tongues. As a result, they tend to prefer a much more subdued palate–which is, of course, perfectly tasty to them.

A French study, Changes in Food Preferences and Food Neophobia during a Weight Reduction Session, measured kids’ ability to taste flavors, then the rate at which they became accustomed to new foods. The more sensitive the kids were to flavors, the less likely they were to adopt a new food; the less adept they were at tasting flavors, the more likely they were to start eating vegetables.

Speaking of pickiness again:

“During research back in the 1980s, we discovered that people are more reluctant to try new foods of animal origin than those of plant origin,” Pelchat says. “That’s ironic in two ways. As far as taste is concerned, the range of flavors in animal meat isn’t that large compared to plants, so there isn’t as much of a difference. And, of course, people are much more likely to be poisoned by eating plants than by animals, as long as the meat is properly cooked.” …

It’s also possible that reward mechanisms in our brain can drive changes in taste. Pelchat’s team once had test subjects sample tiny bits of unfamiliar food with no substantial nutritional value, and accompanied them with pills that contained either nothing or a potent cocktail of caloric sugar and fat. Subjects had no idea what was in the pills they swallowed. They learned to like the unfamiliar flavors more quickly when they were paired with a big caloric impact—suggesting that body and brain combined can alter tastes more easily when unappetizing foods deliver big benefits.

So trying to get people to adopt new foods while losing weight may not be the best idea.

(For all that people complain about kids’ pickiness, parents are much pickier. Kids will happily eat playdoh and crayons, but one stray chicken heart in your parents’ soup and suddenly it’s “no more eating at your house.”)

Of course, you can’t talk about food without encountering meddlers who are convinced that people should eat whatever they’re convinced is the perfect diet, like these probably well-meaning folks trying to get Latinos to eat fewer snacks:

Latinos are the largest racial and ethnic minority group in the United States and bear a disproportionate burden of obesity related chronic disease. Despite national efforts to improve dietary habits and prevent obesity among Latinos, obesity rates remain high. …

there is a need for more targeted health promotion and nutrition education efforts on the risks associated with soda and energy-dense food consumption to help improve dietary habits and obesity levels in low-income Latino communities.

Never mind that Latinos are one of the healthiest groups in the country, with longer life expectancies than whites! We’d better make sure they know that their food ways are not approved of!

I have been saving this graph for just such an occasion.
Only now I feel bad because I forgot to write down who made this graph so I can properly credit them. If you know, please tell me!

(Just in case it is not clear already: different people are adapted to and will be healthy on different diets. There is no magical, one-size-fits-all diet.)

And finally, to bring this full circle, it’s hard to miss the folks claiming that Kids Who Eat Fast Food Have Lower IQs:

4,000 Scottish children aged 3-5 years old were examined to compare the intelligence-dampening effects of fast food consumption versus “from scratch” fare prepared with only fresh ingredients.

Higher fast food consumption by the children was linked with lower intelligence and this was even after adjustments for wealth and social status were taken into account.

It’d be better if they controlled for parental IQ.

The conclusions of this study confirm previous research which shows long lasting effects on IQ from a child’s diet. An Australian study from the University of Adelaide published in August 2012 showed that toddlers who consume junk food grow less smart as they get older. In that study, 7000 children were examined at the age of 6 months, 15 months, 2 years to examine their diet.

When the children were examined again at age 8, children who were consuming the most unhealthy food had IQs up to 2 points lower than children eating a wholesome diet.



Adulterations in the Feed

It’s no secret that sperm counts have been dropping like rocks over the past 70 years or so (though the trend may have recently leveled out.)

“Sperm counts in the 1940s were typically well above 100m sperm cells per millilitre, but Professor Skakkebaek found they have dropped to an average of about 60m per ml. Other studies found that between 15 and 20 per cent of young men now find themselves with sperm counts of less than 20m per ml, which is technically defined as abnormal.” — from The Independent, “Out for the count: Why levels of sperm in men are falling”
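To make the size of that quoted decline concrete, here is a trivial back-of-the-envelope calculation; the numbers are the article’s, not new data:

```python
# Figures quoted from The Independent article above.
avg_1940s_m_per_ml = 100  # typical 1940s count, million sperm cells per ml
avg_recent_m_per_ml = 60  # Skakkebaek's reported recent average

# Relative decline between the two averages.
decline = (avg_1940s_m_per_ml - avg_recent_m_per_ml) / avg_1940s_m_per_ml
print(f"Decline: {decline:.0%}")  # prints "Decline: 40%"
```

In other words, the average count fell by roughly 40 per cent over the period the article covers.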

While environmental factors (like smoking) have effects on sperm counts in adults, these effects appear to be small or short-lived. The biggest, longest-lasting effect on sperm counts appears to be the uterine environment in which the future-low-sperm-count-male’s testicles were developing. Improper fetal testicle development => low sperm count for life. Eg,

“A man who smokes typically reduces his sperm count by a modest 15 per cent or so, which is probably reversible if he quits. However, a man whose mother smoked during pregnancy has a fairly dramatic decrease in sperm counts of up to 40 per cent – which also tends to be irreversible.”

What else could make a uterine environment hostile to testicular development?

How about too much estrogen?

I’ve posted before about Diethylstilbestrol (or DES), a synthetic nonsteroidal estrogen. Between 1940 and 1971, DES was given in large quantities to pregnant women to prevent miscarriages. Unfortunately, it turns out that pumping babies full of unnaturally high levels of estrogen might be bad for them: DES was discontinued as a medication for pregnant women because it gave their daughters cancer (an actual epigenetic effect), and the sons appear to have high rates of transgender, transsexual, and intersex conditions.

Quoting the Wikipedia:

“In the 1970s and early 1980s, studies published on prenatally DES-exposed males investigated increased risk of testicular cancer, infertility and urogenital abnormalities in development, such as cryptorchidism and hypospadias.[38][39]

“… The American Association of Clinical Endocrinologists (AACE) has documented that prenatal DES exposure in males is positively linked to a condition known as hypogonadism (low testosterone levels) that may require treatment with testosterone replacement therapy.[43]

“… Research on DES sons has explored the long-standing question of whether prenatal exposure to DES in males may include sexual and gender-related behavioral effects and also intersex conditions. Dr. Scott Kerlin, a major DES researcher and founder of the DES Sons International Research Network in 1999, has documented for the past 16 years a high prevalence of individuals with confirmed prenatal DES exposure who self-identify as male-to-female transsexual, transgender, or have intersex conditions, and many individuals who report a history of experiencing difficulties with gender dysphoria.[45][46][47][48]

“… Various neurological changes occur after prenatal exposure of embryonic males to DES and other estrogenic endocrine disrupters. Animals that exhibited these structural neurological changes were also shown to demonstrate various gender-related behavioral changes (so-called “feminization of males”). Several published studies in the medical literature on psychoneuroendocrinology have examined the hypothesis that prenatal exposure to estrogens (including DES) may cause significant developmental impact on sexual differentiation of the brain, and on subsequent behavioral and gender identity development in exposed males and females.”

Here is an excerpt from a paper, published in, I think, the early 40s.


Since the image quality is low, I’ve done my best to type it up for you:

“Experimental Intersexuality: The Effects of Combined Estrogens and Androgens on the Embryonic Sexual Development of the Rat

“R. R. Greene, M. W. Burrill and A. C. Ivy

“Department of Physiology and Pharmacology, Northwestern University Medical School, Chicago, Illinois

“In previous publications the authors have reported and described in detail the effects of large doses of sex hormones on the embryonic sexual development of the rat. Androgens, when administered to the pregnant female, cause a masculinization of the female embryos (Greene, Burrill and Ivy, ’38, ’39 a). The female type of differentiation of most sexual structures is inhibited and a male type of differentiation of those structures is stimulated. Administered estrogens cause a feminization of the male embryos (Greene, Burrill and Ivy, ’38, ’40) in that they inhibit the masculine type of differentiation of some sexual structures and, instead, cause a female type of differentiation.

“…The experimental demonstration that estrogens do have a profound effect…”

What are external sources of estrogens in modern life?

Birth control pills. I know FTM trans folks who take birth control pills for the hormones in them. (They are often cheaper and easier to get than hormones specifically prescribed for trans folks, especially if you have a female friend.)

Can those hormones stick around in a mother’s body even after she discontinues taking the pills?

Fat and estrogen appear to be correlated:

“Other conditions that cause low estrogen levels in younger women include excessive exercise, eating disorders and too little body fat.” (source)

“Excess estrogen in the body causes weight gain around the abdomen and upper thighs. … Weight gain caused by estrogen starts a vicious cycle. Excessive body fat produces the aromatase enzyme that synthesizes estrogen, thus creating more estrogen in the body, which then promotes additional weight gain, and so on, says Hofmekler.” (source)

“Researchers have found a correlation between estrogen and weight, particularly during menopause, when estrogen levels drop, but weight tends to rise. But since fat cells can produce estrogen, the issue facing researchers is how to target the estrogen receptors that will boost energy and manage hunger and not contribute to menopause-related weight gain.” (source)

“For postmenopausal women, estrogen levels increase with increasing BMI, presumably because conversion of androgens to estrogen in adipose tissue is a primary source of estrogen…” (source)

Since Americans have been getting fatter over the past century, I’d expect estrogen levels to be up, but I’ve found no studies on the subject so far. (Also, the Wikipedia claims there’s no evidence that birth control pills make people fat.)

However, I have found quite a bit of evidence that giving synthetic estrogen to animals makes them fatter:

(Image: Stilbosol cattle-feed advertisement)

(Stilbosol is another name for DES, as you may note in the ad’s upper right hand corner.)

Since the picture quality is bad, I’ll try to type it up for you:


“Ralph has been feeding cattle in New York state for 20 years. He runs 300 head a year through his feed lot, buying mountain (?) calves at 400 pounds and finishing them to about 1,000 pounds.  …

“”I lean very heavily on college tests and they’re in favor of Stilbosol. The first time we tried it, back in 1955, I noticed a very definite improvement in appetite.

“”Stilbosol is a ‘must’ in our feeding operations. It has added to our profit. If it didn’t, we wouldn’t be using it.””


“We bring our cattle into the lots around 600 pounds. Feed for about 150 days. … We feed to all weights (950 to 1150 pounds) and take a little chance from time to time and feed to heavier weights,” Dan stated.

“We get about 2.75 lbs. daily gain. And I figure Stilbosol accounts for (unreadable) to 1/2 lb. of that daily gain. …

“Does Stilbosol make us money? There’s no doubt about it! Stilbosol has revolutionized the cattle business. I guess it’s the only good break through in the last ten years.”


“”I tested Stilbosol. Took a bunch of 315 Montana yearlings and split them up. One group was actually lighter than the other. The only change I made in their rations was the addition of Stilbosol. The lighter group received Stilbosol. I figured that the lot fed Stilbosol gained over 1 1/2 lb. per day more than the lot which had no Stilbosol.

“”With all the competition, a man can’t afford to pass up anything that will lower his cost of grain. Stilbosol is one of them.””


“We were trying to find the cheapest, most efficient ration. One group of calves received a ration containing Stilbosol. Another received a similar ration without Stilbosol. The group receiving Stilbosol had a feed conversion of (I can’t tell the number, but it’s clearly a single digit followed by .4). The group receiving no Stilbosol had a feed conversion of 10.35. The Stilbosol group gained 2.49 pounds per day. The group that did not receive Stilbosol gained 2.13 pounds per day.

“With Stilbosol, we figure our cost of grain to be substantially lower than similar rations without Stilbosol.”
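Taking the figures in these testimonials at face value, the arithmetic is easy to check. A quick sketch (the numbers are the ones quoted above; the unreadable feed-conversion figure is left out):

```python
# Sanity check on the feedlot numbers quoted in the Stilbosol ad.

# Dan's lot: cattle enter around 600 lb and are fed for about 150 days
# at roughly 2.75 lb/day gain.
start_weight = 600                  # pounds
days_on_feed = 150
daily_gain = 2.75                   # pounds per day
finish_weight = start_weight + days_on_feed * daily_gain
print(finish_weight)                # 1012.5 -- squarely in the quoted 950-1150 lb range

# The ration trial: daily gain with vs. without Stilbosol.
gain_with = 2.49                    # lb/day, Stilbosol ration
gain_without = 2.13                 # lb/day, control ration
improvement = (gain_with - gain_without) / gain_without
print(round(improvement * 100, 1))  # about 16.9% faster gain
```

So the ad's internal numbers at least hang together: a roughly 17% faster daily gain on the same feed is exactly the kind of result that would sell a growth-promoting estrogen to a cost-conscious feeder.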

Four farmers wouldn’t lie to us, would they?

Interestingly, eating large quantities of beef while pregnant was one of the things that The Independent article (linked at the top) noted was correlated with low sperm counts years down the road in the all-grown-up-fetuses.

Of course, people who eat more beef may just weigh more, or have some other factors besides adulterations in the cattle feed.

DES was also put in chicken feed, for the exact same reasons as cattle feed, until it came out that DES causes cancer in humans. It was discontinued as a feed additive in the late 70s.

These days, I don’t know what–if anything–they’re using to finish cattle, but we may note that the vast majority of cattle are still finished in feedlots where they get much fatter than they would naturally. (That is, by wandering around eating grass like they normally do.) Feedlot cattle are, to put it bluntly, unnaturally fat.

Now I’m going to do a little math. The Independent article was published in 2010, and states that the article on falling sperm rates was published 19 years prior, or in 1991. The study therefore compared men in the 1940s to men in the 1980s and 1990. Men in the 1940s were fetuses before the age of feedlots, birth control pills, DES, or DES-fed cattle and chicken. Young(ish) men in 1990, by contrast, were born between 1950 and 1970–all within the era of feedlots, BCPs, DES, and DES-fed cattle and chicken.
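The timeline math above, spelled out (the 20-to-40 age range for the "young(ish) men" is my assumption, not a figure from the article):

```python
# The timeline behind the sperm-count comparison.
article_year = 2010
study_year = article_year - 19        # the sperm-count paper: 1991
print(study_year)                     # 1991

# Assuming the young(ish) men studied around 1990 were roughly 20-40 years old:
birth_earliest = 1990 - 40            # 1950
birth_latest = 1990 - 20              # 1970
print(birth_earliest, birth_latest)   # 1950 1970
# -- birth years falling entirely within the era of feedlots, BCPs,
#    DES prescriptions (1940-1971), and DES-fed cattle and chicken.
```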

If it is true that sperm counts have stabilized since the 90s, that is a point potentially in favor of my theory, since after the 70s, DES was basically gone.

This is all me speculating out loud, of course.




You Probably Aren’t Adapted to the Paleo Diet

Sorry, guys.

Look, I like the Paleo Diet as much as you do–maybe even more than you do. After all, I didn’t name this blog Evolutionist X because I haven’t been reading about paleolithic peoples.

The basic idea of the Paleo Diet–in case you’ve been living under a rock–is that you will be healthier if you eat only veggies, fruit, and meat (no grains or milk products)–the diet your Paleolithic ancestors evolved to eat.

The problem with the Paleo Diet is that evolution did not stop 10,000 years ago. Evolution is constant. It doesn’t stop. You are not a caveman in a suit. You are a modern person. Unless your grandparents were hunter-gatherers, chances are good that your ancestors have been under significant evolutionary pressure to adapt to agriculture for thousands of years.

For example, Lactase Persistence evolved in dairying populations entirely within the last 10,000 years. Today, 80% of Europeans and European-descended people have the gene for lactase persistence. Outside of traditionally dairying areas, this trait is rare. It has spread entirely in response to the development of dairying–which means that if your ancestors have been raising animals for their milk for the past few thousand years, there is a very good chance that you are adapted to drinking milk well after infancy.

Of course, you’re probably not going to hurt yourself drinking water instead.

Likewise for wheat; if your ancestors have been eating wheat for thousands of years, you can probably digest it okay. If your ancestors haven’t been eating wheat for thousands of years, then you might want to avoid it–a Vietnamese friend of mine gets stomach aches from eating wheat (especially whole wheat, which contains more of the irritating chemicals from the outer part of the grain, designed to inspire your stomach to pass the seed through undigested). Their ancestors ate rice, not wheat, so this is hardly surprising. (They are also lactose intolerant, since their ancestors did not keep dairy cows.) However, they have no difficulty digesting rice–a food they are adapted to eat.

If you aren’t adapted to wheat, wheat will give you a stomach ache. If wheat gives you a stomach ache, avoid it! But if your ancestors ate wheat and it doesn’t give you a stomach ache, you’ll probably be safe eating it.

It is reasonable to ask whether there are long-term bad effects from eating wheat or drinking milk–some disease that doesn’t kick in until you’re in your 70s, for example, would be difficult to develop adaptations to combat because it kills you after you’ve already had all of your kids. On this count, I would love to see more research.

Also, there may be some people who, like the 20% or so of Europeans who lack lactase persistence, are particularly sensitive to various foods. People with the ApoE4 gene (the “Alzheimer’s Gene”) may benefit from specific dietary modifications.

However, there’s no particular reason to believe that you are all that well-adapted to eating a diet your ancestors haven’t eaten in thousands of years.

Theory: Americans are fat because we don’t eat enough

I’ve long had a theory that dieting makes people gain weight. Just think about it for a second: at the very least, the correlation is tremendous.

Lots of studies have shown that diets are pretty useless–people tend, on average, to lose little to no weight on them. The whole diet industry, from diet sodas to lite beer to Weight Watchers, is, of course, basically a fraud.

The reasons are probably simple: One, humans have evolved no mechanisms to resist eating whenever possible. Your ancestors are people who ate when they could, not people who were indifferent to food, especially not tasty food*. And two, we live in a society with abundant, cheap, delicious food. Chances are good you’ve never even lived through a famine, much less had to go without for significant periods every few years of your life.

*Or have we?

I have watched people try to diet (mostly relatives.) The process goes something like this:

1. Relative declares, “I am going to lose weight for sure this time!”
2. Eats meager breakfast of oatmeal and apples.
3. Eats more apples for snack.
4. Comes over to my house, devours all my chips.
5. Weight-loss fails.

(A lot of people claim that you are supposed to feel “full” using various diet methods, but I’ve watched this happen enough times to enough different people to suspect that it’s a pretty common scenario.)

So tonight I was getting a bowl of icecream for a sick kiddo. Normally when getting icecream, I sneak a bite at the end. I can’t eat a full bowl of icecream, because hypoglycemia, but the taste is very tempting. But tonight, I looked at the icecream, and said, “No, I don’t want icecream.” What the hell was wrong with me? I’d just eaten a bowl of beans + cheese. I was full.

I suspect that our willpower, our ability to resist the kinds of foods that we can basically all agree aren’t really great to be eating, goes completely down the drain when we are hungry. And people are most likely to be hungry when they are dieting. So if you eat nothing but apples for breakfast, then somewhere along the way, you’re likely to eat nothing but cookies for dinner. But a solid breakfast of eggs, toast, and even a little bacon will probably leave you feeling full and happy, rendering temptation less, well, tempting.