Dwarf Wheat: is it good for us?

A friend recently suggested that dwarf grains might be a key component in the recent explosion of health conditions like obesity and gluten (or other wheat-related) sensitivities.

According to Wikipedia:

The Green Revolution, or Third Agricultural Revolution, is a set of research and technology transfer initiatives occurring between 1950 and the late 1960s that increased agricultural production worldwide, particularly in the developing world, beginning most markedly in the late 1960s.[1] The initiatives resulted in the adoption of new technologies, including high-yielding varieties (HYVs) of cereals, especially dwarf wheats and rices, in association with chemical fertilizers and agro-chemicals, and with controlled water-supply (usually involving irrigation) and new methods of cultivation, including mechanization.

Most people would say that this has been good because we now have a lot fewer people starving to death. We also have a lot more fat people. There’s an obvious link, inasmuch as it is much easier to be fat if there is more food around, but we’re investigating a less obvious link: does the nutritional/other content of new wheat varieties contribute to certain modern health problems?

Continuing with Wikipedia:

The novel technological development of the Green Revolution was the production of novel wheat cultivars. Agronomists bred cultivars of maize, wheat, and rice that are generally referred to as HYVs or “high-yielding varieties”. HYVs have higher nitrogen-absorbing potential than other varieties. Since cereals that absorbed extra nitrogen would typically lodge, or fall over before harvest, semi-dwarfing genes were bred into their genomes. …

Dr. Norman Borlaug, who is usually recognized as the “Father of the Green Revolution”, bred rust-resistant cultivars which have strong and firm stems, preventing them from falling over under extreme weather at high levels of fertilization. … These programs successfully led the harvest to double in these countries.[40]

Plant scientists figured out several parameters related to the high yield and identified the related genes which control the plant height and tiller number.[43] … Stem growth in the mutant background is significantly reduced, leading to the dwarf phenotype. Photosynthetic investment in the stem is reduced dramatically as the shorter plants are inherently more stable mechanically. Assimilates become redirected to grain production, amplifying in particular the effect of chemical fertilizers on commercial yield.

HYVs significantly outperform traditional varieties in the presence of adequate irrigation, pesticides, and fertilizers. In the absence of these inputs, traditional varieties may outperform HYVs.

In other words, if you breed a variety of wheat (or rice, or whatever) that takes up nutrients really fast and grows really fast, it tends to get top-heavy and fall over. Your wheat then lies on the ground and gets all soggy and rotten and is impossible to use. But if you make your fast-growing wheat shorter, by crossing it with some short (dwarf) varieties, it doesn’t fall over and it can devote even more of its energy to making nice, fat wheat berries instead of long, thin stems.

(I find it interesting that a lot of this research was done in Mexico. Incidentally, Mexico is also one of the fattest countries–on average–in the world.)

But we are talking about making the plant grow faster than it normally would, via the uptake of higher-than-usual levels of nutrients. This requires more fertilizer, as these varieties can’t grow properly otherwise.

I’ve just started researching this, so I’m just reading papers and posting some links/quotes/summaries.

Elevating optimal human nutrition to a central goal of plant breeding and production of plant-based foods:

…  However, deficiencies in certain amino acids, minerals, vitamins and fatty acids in staple crops, and animal diets derived from them, have aggravated the problem of malnutrition and the increasing incidence of certain chronic diseases in nominally well-nourished people (the so-called diseases of civilization). …

The inadequacy of cereal grains as a primary food for humans arises from the fundamentals of plant physiology. … Their carbohydrate, protein and lipid profiles reflect the specific requirements for seed and seedling survival. This nutrient profile, especially after selection during domestication [], is far from optimal for human or animal nutrition. For example, the seeds of most cultivated plants contain much higher concentrations of omega-6 fatty acids than omega-3 fatty acids than is desirable for human nutrition [], with few exceptions such as flax, camelina (Camelina sativa) and walnuts. …

The authors then describe what’s up with the fats–for plants to germinate in colder temperatures, they need more omega-3s, which are more liquid at colder temperatures. Plants in warmer climates don’t need omega-3s, so they have more omega-6s. (Presumably omega-6s are more heat tolerant, making them more stable during high-temperature cooking.)

Flax and walnut oils have low smoke points (that is, they start turning to smoke at low temperatures) and so are unsuited to high-temperature cooking. People prefer to cook with oils that can withstand higher temperatures, like peanut, soy, corn, and canola.

I think one of the issues with fast food (and perhaps restaurant food in general) is that it needs to be cooked fast, which means it needs to be cooked at high temperatures, which requires oils with high smoke points–and those are not necessarily the best for human health. The same food cooked more slowly at lower temperatures might be just fine, though.

There is a side issue that while oil smoking is unpleasant and bad, the high-temperature oils that don’t smoke aren’t necessarily any better, because I think they are undergoing other undesirable internal changes to prevent smoking.

Then there’s the downstream matter of the feed cattle and chickens are getting. My impression of cattle raising (from having walked around a cattle ranch a few times) is that most cattle eat naturally growing pasture grass most of the time, because buying feed and shipping it out to them is way too expensive. This grass is not human feed and is not fungible with human feed, because growing food for humans requires more effort (and water) than just letting cows wander around in the grass. Modern crops require a lot of water and fertilizer to grow properly (see the Wikipedia quote above.) This is why I am not convinced by the vegetarian argument that we could produce a lot more food for humans if we stopped producing cows–cattle feed and human feed are not energy/resource equivalent.

However, once the cows are grown, they are generally sent to feedlots to be fattened up before slaughter. Here they are given corn and other grains. The varieties of grains they are fed at this point may influence the nature of the fats they subsequently build:

Modern grain-fed meat and grain-rich diets are particularly abundant in omega-6 fatty acids, and it is thought that a deficiency of omega-3 fatty acids, especially the EPA and DHA found in fish oils, can be linked to many of the inflammatory diseases of the western diet, such as cardiovascular disease and arthritis (). DHA has been recognized as being vitally important for brain function, and a deficiency of this fatty acid has been linked to depression, cognitive disorders, and mental illness ().

Let’s get back to the article about plant breeding. I thought this was interesting:

The biological basis of protein limitation in seed-based foods appears to be the result of evolutionary strategies that plants use to build storage proteins. Seed storage proteins have evolved to store amino nitrogen polymerized in compact forms, i.e. in storage proteins such as zein in maize, gluten in wheat and hordein in barley. As the seed germinates, enzymes hydrolyze the storage proteins and the plant is able to use these stored amino acids as precursors to re-synthesize all of the twenty amino acids needed for de novo protein synthesis.

So if we make plants that absorb more nitrogen, and we dump a lot more nitrogen on them, do we get wheat with more gluten in it?

Another book I read, Nourishing Traditions, which is really a cookbook, claims that our ancestors generally ate their grains already sprouted. This was more accidental than on purpose–grains often sat around in storage, got wet, and sprouted. Sprouting (or germinating) makes the wheat use stored gluten to make amino acids. Between sprouting, fermentation (sourdough bread) and less nitrogen-loving wheat varieties, our ancestors’ breads and porridges may have had less gluten than ours.

Another issue:

In the laboratory of the first author we have taken two different approaches to improving the protein quality of crops. First, we successfully selected a series of high lysine wheat cultivars over a period of twenty years, by standard breeding methods []. …  Surviving embryos consistently had elevated levels of lysine relative to parental populations and the seed produced from these embryos also had increased levels of lysine. The increased nutritional value of these lines, however, carried a cost in terms of lower total yield. A striking result was that grasshoppers, aphids, rats and deer preferentially feasted on the foliage of these high lysine wheats in the field, rather than on neighboring conventional low lysine wheats. The highest lysine wheat had the highest predation and subsequently the lowest yield (D.C. Sands, unpublished field observations). … Thus, we are led to the hypothesis that selection for insect resistance may have inadvertently resulted in the selection for lower nutritional value…

Then the authors talk about peas, of Gregor Mendel fame. Pea seeds come in two varieties: wrinkled and smooth. The smooth, plump ones look nicer (and probably taste sweeter), but they store their sugar in a form that we digest more quickly, resulting in faster increases in blood sugar. They are thus more likely to get stored as fat.

Breeders and buyers are biased toward plump seeds and tubers, in peas and many other crops.

Incidentally, the outside of the wheat grain–the part we discard when producing white flours but keep when making “whole” wheat flour–contains phytates which interfere with iron absorption and other irritants designed by the plant to increase the chance of grazers passing the seed out the other end without digestion. (However, the creation of white flours may remove other nutrients.)

It’s getting late, so I’d better wrap up. The authors end by noting that fermentation is another way to potentially increase the nutritional content of foods and suggest a variety of ways scientists could make grains or yeasts that enhance fermentation.

A few more studies:

The nutritional value of crop residue components from several wheat cultivars grown at different fertilizer levels:

Nine wheat cultivars were grown at two test sites in Saskatoon, each at fertilizer levels of 0, 56, and 224 kg N ha⁻¹. Proportions of leaf, stem, chaff and grain were obtained for each level. Significant cultivar differences were observed at each site for plant component yields. A significant increase in the proportion of leaf components and a significant decrease in the proportion of the grain components was observed as soil nitrogen levels increased. Crude protein contents of plant components varied significantly with both cultivar and fertilizer level. Significant differences in digestibility in vitro also existed among cultivars. Increasing fertilizer levels significantly improved the digestibility in vitro of the leaf but not of the chaff.

Genetic differences in the copper nutrition of cereals:

Seven wheat genotypes, one of barley and one of oats were compared for their sensitivity to suboptimal supplies of copper, and their ability to recover from copper deficiency when copper was applied at defined stages of growth. Copper deficiency delayed maturity, reduced the straw yield and severely depressed the grain yield in all genotypes. …

Genotypes with relatively higher yield potential were less sensitive to copper deficiency than those with lower yield potential … There was no apparent association between dwarfness and sensitivity to copper deficiency in wheat.

An article suggesting we should eat emmer wheat instead of modern cultivars:

… The production and food-relevant use of domesticated modern-day wheat varieties face increasing challenges such as the decline in crop yield due to adverse fluctuating climatic trends, and a need to improve the nutritional and phytochemical content of the grain, both of which are a result of centuries of crop domestication and advancement of dietary calorie requirements demanding new high-yield dwarf varieties in the last five decades. The focus on improving phenotypic traits such as grain size and grain yield towards calorie-driven macronutrients has inadvertently led to a loss of allelic function and genetic diversity in modern-day wheat, which suffers from poor tolerance to biotic and abiotic stresses, as well as poor nutritional and phytochemical profiles against high-calorie-driven non-communicable chronic diseases (NCDs). The low baseline phytochemical profile of modern-day wheat varieties along with highly mechanized post-harvest processing have resulted in poor health-relevant nutritional qualities in end products against emerging NCDs. …

Ancient wheat, such as emmer with its large genetic diversity, high phytochemical content, and better nutritional and health-relevant bioactive profiles, is a suitable candidate to address these nutritional securities…

There’s a lot of information about emmer wheat nutrition in this article/book.

 


Book Club: The 10,000 Year Explosion, Part 4: Agriculture

Welcome back to EvX’s Book Club. Today we’re discussing Chapter 4 of The 10,000 Year Explosion: Consequences of Agriculture.

A big one, of course, was plague–on a related note, Evidence for the Plague in Neolithic Farmers’ Teeth:

When they compared the DNA of the strain recovered from this cemetery to all published Y. pestis genomes, they found that it was the oldest (most basal) strain of the bacterium ever recovered. Using the molecular clock, they were able to estimate a timeline for the divergence and radiation of Y. pestis strains and tie these events together to make a new, testable model for the emergence and spread of this deadly human pathogen.

These analyses indicate that plague was not first spread across Europe by the massive migrations by the Yamnaya peoples from the central Eurasian steppe (around 4800 years ago)… Rascovan et al. calculated the date of the divergence of Y. pestis strains at between 6,000 and 5,000 years ago. This date implicates the mega-settlements of the Trypillia Culture as a possible origin point of Y. pestis. These mega-settlements, home to an estimated 10,000-20,000 people, were dense concentrations of people during that time period in Europe, with conditions ideal for the development of a pandemic.

The Cucuteni–Trypillia Culture flourished between the Carpathian Mountains and the Black Sea from 4800–3000 BC. It was a neolithic–that is, stone age–farming society with many large cities. Wikipedia gives a confused account of its demise:

According to some proponents of the Kurgan hypothesis of the origin of Proto-Indo-Europeans … the Cucuteni–Trypillia culture was destroyed by force. Arguing from archaeological and linguistic evidence, Gimbutas concluded that the people of the Kurgan culture (a term grouping the Yamnaya culture and its predecessors) … effectively destroyed the Cucuteni–Trypillia culture in a series of invasions undertaken during their expansion to the west. Based on this archaeological evidence Gimbutas saw distinct cultural differences between the patriarchal, warlike Kurgan culture and the more peaceful egalitarian Cucuteni–Trypillia culture, … which finally met extinction in a process visible in the progressing appearance of fortified settlements, hillforts and the graves of warrior-chieftains, as well as in the religious transformation from the matriarchy to patriarchy, in a correlated east–west movement.[26] In this, “the process of Indo-Europeanization was a cultural, not a physical, transformation and must be understood as a military victory in terms of successfully imposing a new administrative system, language, and religion upon the indigenous groups.”[27]

How does it follow that the process was a cultural, not physical transformation? They got conquered.

In his 1989 book In Search of the Indo-Europeans, Irish-American archaeologist J. P. Mallory, summarising the three existing theories concerning the end of the Cucuteni–Trypillia culture, mentions that archaeological findings in the region indicate Kurgan (i.e. Yamnaya culture) settlements in the eastern part of the Cucuteni–Trypillia area, co-existing for some time with those of the Cucuteni–Trypillia.[4] Artifacts from both cultures found within each of their respective archaeological settlement sites attest to an open trade in goods for a period,[4] though he points out that the archaeological evidence clearly points to what he termed “a dark age,” its population seeking refuge in every direction except east. He cites evidence of the refugees having used caves, islands and hilltops (abandoning in the process 600–700 settlements) to argue for the possibility of a gradual transformation rather than an armed onslaught bringing about cultural extinction.[4]

How is “refugees hiding in caves” a “gradual transformation?” That sounds more like “people fleeing an invading army.”

The obvious issue with that theory is the limited common historical life-time between the Cucuteni–Trypillia (4800–3000 BC) and the Yamnaya culture (3300–2600 BC); given that the earliest archaeological findings of the Yamnaya culture are located in the Volga–Don basin, not in the Dniester and Dnieper area where the cultures came in touch, while the Yamnaya culture came to its full extension in the Pontic steppe at the earliest around 3000 BC, the time the Cucuteni–Trypillia culture ended, thus indicating an extremely short survival after coming in contact with the Yamnaya culture.

How is that an issue? How long does Wikipedia think it takes to slaughter a city? It takes a few days. 300 years of contact is plenty for both trade and conquering.

Another contradicting indication is that the kurgans that replaced the traditional horizontal graves in the area now contain human remains of a fairly diversified skeletal type approximately ten centimetres taller on average than the previous population.[4]

What are we even contradicting? Sounds like they got conquered, slaughtered, and replaced.

Then Wikipedia suggests that maybe it was all just caused by the weather (which isn’t a terrible idea.) Drought weakened the agriculturalists and prompted the pastoralists to look for new grasslands for their herds. They invaded the agriculturalists’ areas because they were lush and good for growing grain, which the pastoralists’ cattle love eating. The already weakened agriculturalists couldn’t fight back.

ANYWAY. Let’s get on with Greg and Henry’s account, The 10,000 Year Explosion:

The population expansion associated with farming increased crowding, while farming itself made people sedentary. Mountains of garbage and water supplies contaminated with human waste favored the spread of infectious disease. …

Most infectious diseases have a critical community size, a  number and concentration of people below which they cannot persist. The classic example is measles, which typically infects children and remains infectious for about ten days, after which the patient has lifelong immunity. In order for measles to survive, the virus that causes it, the paramyxovirus, must continually find unexposed victims–more children. Measles can only persist in a large, dense population: Populations that are too small or too spread out (under half a million in close proximity) fail to produce unexposed children fast enough, so the virus dies out.

Measles, bubonic plague, smallpox: all results of agriculture.

Chickenpox: not so much.

I wonder if people in the old Cucuteni–Trypillia area are particularly immune to bubonic plague, or if the successive waves of invading steppe nomads have done too much genetic replacement (slaughtering) for adaptations to stick around?

Harpending and Cochran then discuss malaria, which has had a big impact on human genomes (e.g., sickle cell) in the areas where malaria is common.

In general, the authors play it safe in the book–pointing to obvious cases of wide-scale genetic changes like sickle cell that are both undoubtable and have no obvious effect on personality or intelligence. It’s only in the chapter on Ashkenazi IQ that they touch on more controversial subjects, and then in a positive manner–it’s pleasant to think, “Why was Einstein so smart?” and less pleasant to think, “Why am I so dumb?”

However:

It’s time to address the old chestnut that biological differences among human populations are “superficial,” only skin-deep. It’s not true: We’re seeing genetically caused differences in all kinds of functions, and every such difference was important enough to cause a significant increase in fitness (number of offspring)–otherwise it wouldn’t have reached high frequency in just a few millennia.

As for skin color, Cochran and Harpending lean on the side of high-latitude lightening having been caused by agriculture, rather than mere sunlight levels:

Interestingly, the sets of changes driving light skin color in China are almost entirely different from those performing a similar function in Europe. …

Many of these changes seem to be quite recent. The mutation that appears to have the greatest effect on skin color among Europeans and neighboring peoples, a variant of SLC24A5, has spread with astonishing speed. Linkage disequilibrium… suggests that it came into existence about 5,800 years ago, but it has a frequency of 99 percent throughout Europe and is found at significant levels in North Africa, East Africa, and as far east as India and Ceylon. If it is indeed that recent, it must have had a huge selective advantage, perhaps as high as 20 percent. It would have spread so rapidly that, over a long lifetime a farmer could have noticed the change in appearance in his village.

Wow.
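To get a feel for why a 20 percent advantage counts as “huge,” here is a back-of-the-envelope sketch (my own, not from the book): a simple one-locus haploid selection model, assuming the allele starts at one copy per thousand, a selection coefficient s = 0.20, and ~25-year generations. The numbers are purely illustrative.

```python
# Minimal sketch (illustrative assumptions, not the book's calculation):
# under haploid selection, an allele's frequency updates each generation as
#   p' = p * (1 + s) / (1 + s * p)
# which multiplies the odds p/(1-p) by (1 + s) per generation.

def generations_to_frequency(p0, target, s):
    """Count generations for the allele frequency to climb from p0 to target."""
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (1 + s * p)
        gens += 1
    return gens

gens = generations_to_frequency(p0=0.001, target=0.99, s=0.20)
print(f"{gens} generations, roughly {gens * 25} years at 25 years/generation")
```

With s = 0.20 the allele blows past 99 percent in about 64 generations–roughly 1,600 years–comfortably inside the ~5,800-year window, whereas a more ordinary s of a few percent would take far longer. That is the sense in which the inferred advantage is enormous.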

In humans, OCA2 … is a gene involved in the melanin pathway… Species of fish trapped in caves… lose their eyesight and become albinos over many generations. … Since we see changes in OCA2 in each [fish] case, however, there must have been some advantage in knocking out OCA2, at least in that underground environment. The advantage cannot lie in increased UV absorption, since there’s no sunlight in those caves.

There are hints that knocking out OCA2, or at least reducing its activity, may be advantageous… in humans who can get away with it. We see a pattern that suggests that having one inactive copy of OCA2 is somehow favored even in some quite sunny regions. In southern Africa, a knocked-out version of OCA2 is fairly common: The gene frequency is over 1 percent.

And that’s an area with strong selection for dark skin.

A form of OCA2 albinism is common among the Navajo and other neighboring tribes, with gene frequencies as high as 4.5 percent. The same pattern appears in southern Mexico, eastern Panama, and southern Brazil. All of which suggests that heterozygotes… may have some advantage.

Here is an article on the possibility of sexual selection for albinism among the Hopi.

So why do Europeans have such variety in eye and hair color?

Skeletons

The skeletal record clearly supports the idea that there has been rapid evolutionary change in humans over the past 10,000 years. The human skeleton has become more gracile–more lightly built–though more so in some populations than others. Our jaws have shrunk, our long bones have become lighter, and brow ridges have disappeared in most populations (with the notable exception of Australian Aborigines, who have also changed, but not as much; they still have brow ridges, and their skulls are about twice as thick as those of other peoples.)

This could be related to the high rates of interpersonal violence common in Australia until recently (thicker skulls are harder to break) or a result of interbreeding with Neanderthals and Denisovans. We don’t know what Denisovans looked like, but Neanderthals certainly are noted for their robust skulls.

Skull volume has decreased, apparently in all populations: In Europeans, volume is down about 10 percent from the high point about 20,000 years ago.

This seems like a bad thing. Except for mothers.

Some changes can be seen even over the past 1,000 years. English researchers recently compared skulls from people who died in the Black Death ([approximately] 650 years ago), from the crew of the Mary Rose, a ship that sank in Tudor times ([approximately] 450 years ago), and from our contemporaries. The shape of the skull changed noticeably over that brief period–which is particularly interesting because we know there has been no massive population replacement in England over the past 700 years.

Hasn’t there been a general replacement of the lower classes by the upper classes? I think there was also a massive out-migration of English to other continents in the past five hundred years.

The height of the cranial vault of our contemporaries was about 15 percent larger than that of the earlier populations, and the part of the skull containing the frontal lobes was thus larger.

This is awkwardly phrased–I think the authors want the present tense–“the cranial vault of our contemporaries is…” Nevertheless, it’s an interesting study. (The frontal lobes control things like planning, language, and math.) 

We then proceed to the rather depressing Malthus section and the similar “elites massively out-breeding commoners due to war or taxation” section. You’re probably familiar with Genghis Khan by now. 

We’ve said that the top dogs usually had higher-than-average fertility, which is true, but there have been important exceptions… The most common mistake must have been living in cities, which have almost always been population sinks, mostly because of infectious disease. 

They’re still population sinks. Just look at Singapore. Or Tokyo. Or London. 

The case of silphium, a natural contraceptive and abortifacient eaten to extinction during the Classical era, bears an interesting parallel to our own society’s falling fertility rates. 

And of course, states domesticate their people: 

Farmers don’t benefit from competition between their domesticated animals or plants… Since the elites were in a very real sense raising peasants, just as peasants raised cows, there must have been a tendency for them to cull individuals who were more aggressive than average, which over time would have changed the frequencies of those alleles that induced such aggressiveness.

On the one hand, this is a very logical argument. On the other hand, it seems like people can turn on or off aggression to a certain degree–uber peaceful Japan was rampaging through China only 75 years ago, after all. 

Have humans been domesticated? 

(Note: the Indians captured by the Puritans during the Pequot War may have refused to endure the yoke, but they did practice agriculture–they raised corn, squash and beans, in typical style. Still, they probably had not endured under organized states for as long as the Puritans.)

There is then a fascinating discussion of the origins of the scientific revolution–an event I am rather fond of. 

Although we do not as yet fully understand the true causes of the scientific and industrial revolution, we must now consider the possibility that continuing human evolution contributed to that process. It could explain some of the odd historical patterns that we see.

Well, that’s enough for today. Let’s continue with Chapter 5 next week.

How about you? What are your thoughts on the book?

Book Club: The 10,000 Year Explosion, Part 3

The spread of agriculture

Welcome back to the Book Club. Today we’re reading Chapter 3: Agriculture, from Cochran and Harpending’s The 10,000 Year Explosion: How Civilization Accelerated Human Evolution.

One of my fine readers asked for “best of” recommendations for Cochran and Harpending’s blog, West Hunter. This is a good question, and as I have not yet found a suitable list, I thought I would make my own.

However, West Hunter is a long-running blog, so I’m only doing the first year for now:

My friend the Witch Doctor:

Only a handful of Herero shared my skepticism about witchcraft. People in the neighborhood as well as several other employees were concerned about Kozondo’s problem. They told me that he had to be taken to a well known local witch doctor. “Witch doctor” I said, “you all have been watching too many low budget movies. We call them traditional healers these days, not witch doctors”. They all, including Kozondo, would have none of it. “They are bad and very dangerous people, not healers” he said. It quickly became apparent that I was making a fool of myself trying to explain why “traditional healer” was a better way to talk than “witch doctor”. One of our group had some kind of anti-anxiety medicine. We convinced Kozondo to try one but it had no effect at all. Everyone agreed that he must consult the witch doctor so we took him. …

That evening we had something like a seminar with our employees and neighbors about witchcraft. Everyone except the Americans agreed that witchcraft was a terrible problem, that there was danger all around, and that it was vitally important to maintain amicable relations with others and to reject feelings of anger or jealousy in oneself. The way it works is like this: perhaps Greg falls and hurts himself, he knows it must be witchcraft, he discovers that I am seething with jealousy of his facility with words, so it was my witchcraft that made him fall. What is surprising is that I was completely unaware of having witched him so he bears me no ill will. I feel bad about his misfortune and do my best to get rid of my bad feelings because with them I am a danger to friends and family. Among Herero there is no such thing as an accident, there is no such thing as a natural death, witchcraft in some form is behind all of it. Did you have a gastrointestinal upset this morning? Clearly someone slipped some pink potion in the milk. Except for a few atheists there was no disagreement about this. Emotions get projected over vast distances so beware.

Even more interesting to us was the universal understanding that white people were not vulnerable to witchcraft and could neither feel it nor understand it. White people literally lack a crucial sense, or part of the brain. An upside, I was told, was that we did not face the dangers that locals faced. On the other hand our bad feelings could be projected so as good citizens we had to monitor carefully our own “hearts”.

Amish Paradise:

French Canadian researchers have shown that natural selection has noticeably sped up reproduction among the inhabitants of Île aux Coudres, an island in the St. Lawrence River –  in less than 150 years. Between 1799 and 1940, the age at which women had their first child dropped from 26 to 22, and analysis shows this is due to genetic change.

… Today the French of Quebec must  differ significantly (in those genes that influence this trait)  from people in France, which has had relatively slow population growth.  …

The same must be the case for old American types whose ancestors – Puritans, for example – arrived early and went through a number of high-fertility generations in colonial days.  It’s likely the case for the Mormons, who are largely descended from New Englanders. I’ve heard of odd allele frequencies  in CEU  (involving FSH) that may relate to this.

Something similar must be true of the Boers as well.

I would guess that a similar process operated among the first Amerindians that managed to get past  the ice in North America.  America south of the glaciers would have been a piece of cake for anyone tough enough to make a living as a hunter in Beringia – lush beyond belief, animals with no experience of humans.

Six Black Russians:

(Black Russians are, I think, an alcoholic beverage.)

Every now and then, I notice someone, often an anthropologist, saying that human cognitive capability just has to be the same in all populations. According to Loring Brace, “Human cognitive capacity, founded on the ability to learn a language, is of equal survival value to all human groups, and consequently there is no valid reason to expect that there should be average differences in intellectual ability among living human populations.”

There are a lot of ideas and assumptions in that quote, and as far as I can tell, all of them are wrong.  …

Populations vary tremendously in the fraction that contributes original work in science and technology – and that variation mostly agrees with the distribution of IQ.

And Your Little Dog, Too!

As I have mentioned before, the mtDNA of European hunter-gatherers seems to be very different from that of modern Europeans. The ancient European mtDNA pool was about 80% U5b – today that lineage is typically found at 10% frequency or lower, except in northern Scandinavia. Haplogroup H, currently the most common in Europe, has never been found in early Neolithic or pre-Neolithic Europeans. …

Interestingly, there is a very similar pattern in canine mtDNA. Today European dogs fall into four haplotypes: A (70%), B (16%), C (6%), and D (8%). But back in the day, it seems that the overwhelming majority of dogs (88%) were type C, 12% were in group A, while B and D have not been detected at all.

Lewontin’s Argument 

(always bears re-addressing)

Richard Lewontin argued that since most (> 85%) genetic variation in humans is within-group, rather than between groups, human populations can’t be very different. Of course, if this argument is valid, it should apply to any genetically determined trait. Thus the  variation in skin color within a population should be larger than the skin color differences between populations – except that it’s not. The difference in skin color between Europeans and Pygmies is large, so large that there is no overlap at all.
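A toy simulation (all numbers invented for illustration, not real data) makes the flaw concrete: even when ~99% of single-locus variance is within groups, a trait summed over many loci that all shift slightly in the same direction can pull the groups far apart.

```python
import random

# Toy model: a trait controlled by 100 diploid loci. In group A each "+"
# allele has frequency 0.45; in group B, 0.55. Per locus, almost all the
# variation is within groups -- yet the summed trait separates the groups
# by several standard deviations.
random.seed(42)

def trait(p, n_loci=100):
    # diploid: two allele draws per locus; trait = count of "+" alleles
    return sum(random.random() < p for _ in range(2 * n_loci))

group_a = [trait(0.45) for _ in range(1000)]
group_b = [trait(0.55) for _ in range(1000)]

# Fst-like share of single-locus variance that lies between groups:
p, q = 0.45, 0.55
pbar = (p + q) / 2
fst = ((p - pbar) ** 2 + (q - pbar) ** 2) / 2 / (pbar * (1 - pbar))
print(f"between-group share of single-locus variance: {fst:.1%}")  # ~1%

mean_a = sum(group_a) / len(group_a)
mean_b = sum(group_b) / len(group_b)
print(f"trait means: A={mean_a:.0f}, B={mean_b:.0f}")
```

The per-locus between-group share is about 1%, squarely in Lewontin's range, yet the trait means differ by roughly three within-group standard deviations.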

The Indo-European Advantage

There is a large region of homogeneity on European haplotypes with the mutation [for lactose tolerance], telling us that it has arisen to high frequency within the last few thousand years. …

In a dairy culture where fresh milk was readily available, children who could drink it obtained about 40% more calories from milk than children who were not LT.

Consider that 1 Liter of cow’s milk has

* 250 Cal from lactose
* 300 Cal from fat
* 170 Cal from protein

or 720 Calories per liter. But what if one is lactose intolerant? Then, flatulence or no, that person does not get the 250 Calories of lactose from the liter of milk, and only gets 470.
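The milk arithmetic above, as a quick check (using the post's per-liter figures):

```python
# Calories per liter of cow's milk, from the figures above
calories = {"lactose": 250, "fat": 300, "protein": 170}

total = sum(calories.values())                  # lactose-tolerant drinker
without_lactose = total - calories["lactose"]   # lactose-intolerant drinker

print(total, without_lactose)  # 720 470
```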

The Hyperborean Age

I was contemplating Conan the Barbarian, and remembered the essay that Robert E. Howard wrote about the  background of those stories – The Hyborian Age.  I think that the flavor of Howard’s pseudo-history is a lot more realistic than the picture of the human past academics preferred over the past few decades. …

Given the chance (sufficient lack of information), American anthropologists assumed that the Mayans were peaceful astronomers. Howard would have assumed that they were just another blood-drenched snake cult: who came closer? …

Most important, Conan, unlike the typical professor, knew what was best in life.

Class, Caste, and Genes: 

If there is any substantial heritability of merit, where merit is whatever leads to class mobility, then mobility ought to turn classes into hereditary castes surprisingly rapidly.

A start at looking into genetic consequences of meritocracy is to create the simplest possible model and follow its implications. Consider free meritocracy in a two class system, meaning that each generation anyone in the lower class who has greater merit than someone in the upper class immediately swaps class with them. …
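A minimal sketch of that two-class model (the parameters here are my own illustrative choices, not the authors'): merit is part heritable and part luck, the top half by merit becomes the upper class each generation, and mating is within-class. Class mobility collapses within a few generations:

```python
import random

random.seed(1)
N, GENS = 2000, 10

# heritable component of merit; starting class assignment is arbitrary
genes = [random.gauss(0, 1) for _ in range(N)]
classes = [0] * (N // 2) + [1] * (N // 2)
mobility = []  # fraction of people changing class each generation

for gen in range(GENS):
    # phenotypic merit = genes + equal-variance environmental luck
    merit = [g + random.gauss(0, 1) for g in genes]
    # free meritocracy: the top half by merit becomes the upper class
    order = sorted(range(N), key=lambda i: merit[i])
    new_classes = [0] * N
    for i in order[N // 2:]:
        new_classes[i] = 1
    mobility.append(sum(a != b for a, b in zip(classes, new_classes)) / N)
    classes = new_classes
    # within-class mating: child = midparent genes + segregation noise
    next_genes, next_classes = [], []
    for cls in (0, 1):
        pool = [genes[i] for i in range(N) if classes[i] == cls]
        for _ in range(N // 2):
            mom, dad = random.sample(pool, 2)
            next_genes.append((mom + dad) / 2 + random.gauss(0, 0.5))
            next_classes.append(cls)
    genes, classes = next_genes, next_classes

print(f"mobility gen 0: {mobility[0]:.2f}, gen {GENS - 1}: {mobility[-1]:.2f}")
```

Because sorting on merit concentrates the heritable component in the upper class, and within-class mating keeps it there, the fraction of people swapping class falls from about one-half toward a floor set by the environmental noise: classes start to look like castes.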

Back to the book. Chapter 3: Agriculture: The Big Change

This chapter’s thesis is the crux of the book: agriculture simultaneously exposed humans to new selective pressures and allowed the human population to grow, creating a greater quantity of novel mutations for natural selection to work on.

Sixty thousand years ago, before the expansion out of Africa, there were something like a quarter of a million modern humans. By the Bronze Age, 3,000 years ago, that number was roughly 60 million.

Most random mutations fall somewhere between “useless” and “kill you instantly,” but a few, like lactase persistence, are good. I’m just making up numbers, but suppose 1 in 100 people has a good, novel mutation. If your group has 100 people in it (per generation), then you get one good mutation. If your group has 1,000 people, then you get 10 good mutations.

Evolution isn’t like getting bitten by a radioactive spider–it can only work on the genetic variation people actually have. More genetic variation=more chances at getting a good gene that helps people survive.

Or to put it another way, we can look at a population and use “time” as one of our dimensions. Imagine a rectangle of people–all of the people in a community, over time–100 people in the first generation, 100 in the second, etc. After enough time, (10 generations or about 200 years,) you will have had 1,000 people and, on average, 10 favorable mutations.

Increasing the population per generation simply increases the speed with which you get those 10 good mutations.
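That scaling, as a one-line calculation (using the post’s made-up 1-in-100 rate):

```python
def expected_good_mutations(pop_per_generation, generations):
    # 1-in-100 per person per generation is the post's invented
    # illustrative rate, not an empirical figure
    return pop_per_generation * generations / 100

# 100 people for 10 generations vs. 1,000 people for 1 generation:
print(expected_good_mutations(100, 10))   # 10.0
print(expected_good_mutations(1000, 1))   # 10.0
```

Mutation supply depends only on the product of population size and time, so a 10× larger population reaches the same count of new favorable mutations 10× faster.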

Interestingly:

One might think that it would take much longer for a favorable mutation to spread through such a large population than it would for one to spread through a population as small as the one that existed in the Old Stone Age. But since the frequency of an advantageous allele increases exponentially with time in a well-mixed population, rather like the flu, it takes only twice as long to spread through a population of 100 million as it does to spread through a population of 10,000.
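A rough check of the “only twice as long” claim: one common approximation for the duration of a selective sweep is t ≈ 2 ln(N)/s generations, which grows with the logarithm of population size (the 5% advantage below is an arbitrary choice of mine):

```python
import math

def sweep_time(N, s=0.05):
    # approximate generations for an exponentially spreading allele
    # with advantage s to sweep a well-mixed population of size N
    return 2 * math.log(N) / s

t_small = sweep_time(10_000)
t_large = sweep_time(100_000_000)
print(f"{t_small:.0f} vs {t_large:.0f} generations; ratio {t_large / t_small:.1f}")
```

Since 100 million = (10,000)², the logarithm exactly doubles, and so does the sweep time.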

The authors note that larger populations can generate more good, creative ideas, not just genes.

Agriculture–and its attendant high population densities–brought about massive cultural changes to human life, from the simple fact of sedentism (for non-pastoralists) to the ability to store crops for the winter, build long-term housing, and fund governments, which in turn created and enforced laws which further changed how humans lived and interacted. 

(Note: “government” pre-dates agriculture, but was rather different when people had no surplus grain to take as taxes.)

Agriculture also triggered the spread of plagues, as people now lived in groups large (and often squalid) enough to breed and transmit them. Related: Evidence for the Plague in Neolithic Farmers’ Teeth.

Plagues have been kind of a big deal in the history of civilization.

On government:

Combined with sedentism, these developments eventually led to the birth of governments, which limited local violence. Presumably, governments did this because it let them extract more resources from their subjects…

Peasants fighting among themselves interferes with the economy. Governments don’t like it and will tend to hang the people involved.

Some people call it self-domestication.

Recent studies have found hundreds of ongoing [genetic] sweeps–sweeps begun thousands of years ago that are still in progress today. Some alleles have gone to fixation, more have intermediate frequencies, and most are regional. Many are very recent: the rate of origination peaks at around 5,000 years ago in the European and Chinese samples, and about 8,500 years ago in the African sample.

I assume that these genes originating about 5,000 years ago are mostly capturing the Indo-European (pastoralist) and Anatolian (farming) expansions. I don’t know what happened in China around 5,000 years ago, but I wouldn’t be surprised if whatever triggered the Indo-Europeans to start moving in central Asia were connected with events further to the east.

IIRC, 8,500 years ago is too early for the Bantu expansion in Africa, so it must be related to something else.

There is every reason to think that early farmers developed serious health problems from this low-protein, vitamin-short, high-carbohydrate diet. Infant mortality increased, and the poor diet was likely one of the causes. You can see the mismatch between the genes and the environment in the skeletal evidence. Humans who adopted agriculture shrank: average height dropped by almost five inches.

I have seen this claim many times, and still find it incredible. I am still open to the possibility of it having been caused by a third, underlying factor, like “more people surviving diseases that had formerly killed them.”

There are numerous signs of pathology in the bones of early agriculturalists. In the Americas, the introduction of maize led to widespread tooth decay and anemia due to iron deficiency…

Of course, over time, people adapted to their new diets. You are not a hunter-gatherer. (Probably. If you are, hello!)

…Similarly, vitamin D shortages in the new diet may have driven the evolution of light skin in Europe and northern Asia. Vitamin D is produced by ultraviolet radiation from the sun acting on our skin… Since there is plenty of vitamin D in fresh meat, hunter-gatherers in Europe may not have suffered from vitamin D shortages and thus may have been able to get by with fairly dark skin. In fact, this must have been the case, since several of the major mutations causing light skin color appear to have originated after the birth of agriculture. Vitamin D was not abundant in the new cereal-based diet, and any resulting shortages would have been serious, since they could lead to bone malformations (rickets,) decreased resistance to infectious diseases, and even cancer. …

I have read that of the dark-skinned peoples who have recently moved to Britain, the vegetarians among them have been the hardest-hit by vitamin D deficiency. Meat is protective. 

Peoples who have farmed since shortly after the end of the Ice Age (such as the inhabitants of the Middle East) must have adapted most thoroughly to agriculture. In areas where agriculture is younger, such as Europe or China, we’d expect to see fewer adaptive changes… In groups that had remained foragers, there would presumably be no such adaptive changes…

Populations that have never farmed or that haven’t farmed for long, such as the Australian Aborigines and many Amerindians, have characteristic health problems today when exposed to Western diets.

EG, Type 2 diabetes.

Dr. (of dentistry) Weston Price has an interesting book, Nutrition and Physical Degeneration, that describes people Price met around the world, their dental health, and their relationship to Western or traditional diets. (Written/published back in the 1930s.) I’m a fan of the book; I am not a fan of the kind of weird organization that publishes it. That organization promotes fringe stuff like drinking raw milk, but as far as I can recall, I didn’t see anything about drinking raw milk in the entirety of Dr. Price’s tome; Dr. Price wasn’t pushing anything fringe, but found uncontroversial things like “poverty-stricken children during the Great Depression did better in school when given nutritious lunches.” Price was big on improper nutrition as the cause of tooth decay and was concerned about the effects of industrialization and Western diets on people’s bones and teeth.

So we’ve reached the end of Chapter 3. What did you think? Do you agree with Greg and Henry’s model of how Type 2 Diabetes arises, or with the “thrifty genotype” promulgated by James Neel? And why do metabolic syndromes seem to affect poor whites more than affluent ones?

What about the higher rates of FAS among African Americans than the French (despite the French love of alcohol) or the straight up ban on alcohol in many Islamic (ancient farming) cultures? What’s going on there?

And any other thoughts you want to add.

See you next Wednesday for chapter 4.

800 Posts! Open Thread + a graph on farming around the world

HT Pseudoerasmus

Hello and welcome! Today I realized that the blog has just reached 800 posts (slightly more than 800 by the time you read this.)

Here’s the full article the graph to the right hails from–Productivity Growth in Global Agriculture Shifting to Developing Countries. (PDF). The horizontal axis shows agricultural output per worker–most countries in most parts of the world have seen gains in output per worker over the past almost-60 years. The vertical axis shows output per hectare of land–the sort of improvements you get by adding fertilizer.

If one farmer on one hectare doubled his output, (again, suppose fertilizer) he and his land would move up at a 45 degree angle. If one farmer doubled his output by using a tractor to farm twice as much land, he would move directly to the right on this graph. If the land became twice as productive, and so each individual farmer cut back and farmed half as much land, then you’d see a line heading straight up.
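Those three movements can be verified by computing the two coordinates directly (for a hypothetical farmer starting with one unit of output from one hectare):

```python
# Map a scenario onto the graph's two axes:
# (output per worker, output per hectare)
def point(output, workers, hectares):
    return (output / workers, output / hectares)

base = point(1, 1, 1)
fertilizer = point(2, 1, 1)   # double output, same land & labor -> up-right 45 deg
tractor = point(2, 1, 2)      # double output by farming twice the land -> right
subdivide = point(1, 1, 0.5)  # same output from half the land -> straight up

print(base, fertilizer, tractor, subdivide)
# (1.0, 1.0) (2.0, 2.0) (2.0, 1.0) (1.0, 2.0)
```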

So what do we see? North America and Oceania are producing the most food per farmer. Oceania gets very little food per hectare, though (“Oceania” here means New Zealand and Australia, which have some rather large sheep ranches.)

Northeast Asia–that is, Japan, Korea, and Taiwan, even though Taiwan isn’t really in the north–gets the most food per hectare. These are very densely populated countries. Europe hovers in the middle, perhaps having already achieved rather good productivity per hectare before the study began and having recently improved more in productivity per farmer.

Africa and South Asia (India and Pakistan?) are notable for trending upward more than rightward–in these areas, improved agricultural production has allowed existing fields to be sub-divided. This suggests that, while population growth is being accommodated, farmers lack the ability to benefit from selling excess produce (hence why they do not bother to farm more than their own families eat) and people are not moving into other, non-subsistence occupations.

Anyway, how are you, my faithful readers? As we celebrate 800 posts, what would you like to see more of in the future? Less of? Any books you’d like to see reviewed or blog features expanded (or contracted)?

I am thinking of collecting and editing some of my best posts into a book; which posts have you enjoyed?

I’d like to thank you all for all of the great and interesting comments over the years; after all, if it weren’t for readers, this blog would just be me shouting into the void. Readers make all of this effort fun.

Have a wonderful day.

Re: Eurozine’s How to Change Human History

Some of you have asked for my opinions on David Graeber and David Wengrow’s recently published article, How to change the course of human history (at least, the part that’s already happened):

The story we have been telling ourselves about our origins is wrong, and perpetuates the idea of inevitable social inequality. David Graeber and David Wengrow ask why the myth of ‘agricultural revolution’ remains so persistent, and argue that there is a whole lot more we can learn from our ancestors.

The article is long and difficult to excerpt, so I’m going to summarize:

The traditional tale of how our idyllic, peaceful, egalitarian, small-group hunter-gatherer past gave way to our warlike, sexist, racist, violent, large-city agrarian present gives people the impression that hierarchy and violence are inevitable parts of our economic system. However, the traditional tale is wrong–the past was actually a lot more complicated than you’ve been told. Therefore, there is no historical pattern and the real source of all bad things is actually the family.

The final conclusion is pulled out of nowhere:

Egalitarian cities, even regional confederacies, are historically quite commonplace. Egalitarian families and households are not. Once the historical verdict is in, we will see that the most painful loss of human freedoms began at the small scale – the level of gender relations, age groups, and domestic servitude – the kind of relationships that contain at once the greatest intimacy and the deepest forms of structural violence. If we really want to understand how it first became acceptable for some to turn wealth into power, and for others to end up being told their needs and lives don’t count, it is here that we should look. Here too, we predict, is where the most difficult work of creating a free society will have to take place.

Since “inequality begins in the family” is supported nowhere in the text, we will ignore it.

  1. What about the “traditional narrative”? Did hunter-gatherers live in small, peaceful, egalitarian, idyllic communities? Or are the Davids correct that this is a myth?

It’s a myth. Mostly.

While we have almost no information about people’s opinions on anything before the advent of writing, there’s no evidence from any hunter-gatherer society we have actually been able to observe that hunter-gathering leads naturally to egalitarianism or peacefulness.

For example, among the Inuit (Eskimo), hunter-gatherers of the arctic, polyandry (the marriage of one woman to multiple men) existed not because they had particularly enlightened views about women and marriage, but because they had a habit of killing female babies. Too much female infanticide => not enough adult women to go around => men making do.

Why do some groups have high rates of female infanticide? Among other reasons, because in the Arctic, the men do the hunting (seal, fish, caribou, etc.) and the women gather… not a whole lot. (Note: I’m pretty sure the modern Inuit do not practice sex-selective infanticide.)

Polyandry can also be caused by polygamy and simple lack of resources–men who cannot afford to support a wife and raise their own children may content themselves with sharing a wife and contributing what they can to the raising of offspring who might be theirs.

I have yet to encounter in all of my reading any hunter-gatherer or “primitive” society that has anything like our notion of “gender equality” in which women participate equally in the hunting and men do 50% of the child-rearing and gathering, (though some Pygmies are reported to be excellent fathers.) There are simple physical limits here: first, hunter-gatherers don’t have baby formula and men don’t lactate, so the duties of caring for small children fall heavily on their mothers. Many hunter-gatherers don’t even have good weaning foods, and so nurse their children for years longer than most Westerners. Second, hunting tends to require great physical strength, both in killing the animals (stronger arms will get better and more accurate draws on bows and spears) and in hauling the kills back to the tribe (you try carrying a caribou.)

In many horticultural societies, women do a large share of the physical labor of building houses and producing food, but the men do not make up for this by tending the babies. A similar division of labor exists in modern, lower-class African American society, where the women provide for their families and raise the children and the men are largely absent. Modern Rwanda, which suffers a dearth of men due to war and mass genocide, also has a “highly equitable” division of labor; not exactly an egalitarian paradise.

Hunter-gatherers, horticulturalists, and other folks living outside formal states have very high rates of violence. The Yanomami/o, for example, (who combine horticulture and hunting/foraging) are famous for their extremely high rates of murder and constant warfare. The Aborigines of Australia, when first encountered by outsiders, also had very high rates of interpersonal violence and warfare.

Graph from the Wikipedia
See also my post, “No, Hunter Gatherers were not Peaceful Paragons of Gender Egalitarianism.”

The Jivaro are an Amazonian group similar to the Yanomamo; the Mae Enga, Dugum Dani, Huli, and Gebusi are horticulturalists/hunters from PNG; Murngin are Australian hunter-gatherers.

I know, I know, horticulturalists are not pure hunter-gatherers, even if they do a lot of hunting and gathering. As we’ll discuss below, the transition from hunter-gathering to agriculture is complicated and these are groups that we might describe as “in between”. The real question isn’t whether they bury a few coconuts if they happen to sprout before getting eaten, but whether they have developed large-scale social organization, cities, and/or formal states.

The article protests against using data from any contemporary forager societies, because they are by definition not ancient hunter-gatherers and have been contaminated by contact with non-foraging neighbors (I propose that the Australian Aborigines, however, at first contact were pretty uncontaminated,) but then the article goes on to use data from contemporary forager societies to bolster its own points… so I feel perfectly entitled to do the same thing.

However, we do have some data on ancient violence, eg:

According to this article, 12-14% of skeletons from most (but not all) ancient, pre-agricultural hunter-gatherer groups show signs of violence. Here’s a case of a band of hunter-gatherers–including 6 small children–who were slaughtered by another band of hunter-gatherers 10,000 years ago.

Warfare appears to have been part of the human experience as far back as we look–even chimps wage war against each other, as Jane Goodall documented in her work in the Gombe.

Then there’s the cannibalism. Fijians, for example, who practiced a mixed horticulture/hunter-gathering lifestyle (fishing is a form of hunting that looks a lot like gathering,) were notorious cannibals when first encountered by outsiders. (Though they did have something resembling a state at the time.)

Neanderthals butchered each other; 14,700 years ago, hunter-gatherers were butchering and eating each other in Cheddar Gorge, England. (This is the same Cheddar Gorge as the famous Cheddar Man hails from, but CM is 5,000 years younger than these cannibals and probably no relation, as an intervening glacier had forced everyone out of the area for a while. CM also died a violent death, though.)

Or as reported by Real Anthropology:

An increasing amount of archaeological evidence, such as fortifications of territories and pits containing dead humans killed by axes, indicates that warfare originated in prehistoric times, long before the establishment of state societies. Recently, researchers studying the animal bones in the Mesolithic layer of Coves de Santa Maira accidentally discovered thirty human bone remains of pre-Neolithic hunter-gatherers with anthropic marks, indicating human cannibalism.

The article would like to emphasize, however, that we don’t really know why these people engaged in cannibalism. Starvation? Funeral rituals? Dismemberment of an enemy they really hated? Like I said, it’s hard to know what people were really thinking without written records.

There was a while in anthropology/archaeology when people were arguing that the spread of pots didn’t necessarily involve the spread of people, as a new pottery style could just spread because people liked it and decided to adopt it; it turns out that sometimes the spread is indeed of pots, and sometimes it’s of people. Similarly, certain anthropologists took to describing hunter-gatherers as “harmless“, but this didn’t involve any actual analysis of violence rates among hunter-gatherers (yes, I’ve read the book.)

In sum: The narrative that our ancestors were peaceful egalitarians is, in most cases, probably nonsense.

  2. The Davids also argue that the transition from hunter-gathering to agriculture was more complex than the “traditional narrative” claims.

This is also true. As we’ve already touched on above, there are many economic systems that fall somewhere in between exclusive hunter-gathering and pure agriculture. Nomadic hunters who followed and exploited herds of animals gradually began protecting them from other predators (like wolves) and guiding the animals to areas with food and shelter. The domestication of goats pre-dates the beginning of agriculture (and dogs pre-date goats;) the domestication of reindeer was much more recent, (I reviewed a book on reindeer economies here, here, here, and here.) Again, there is no absolute line between hunters like the Eskimo who annually exploit migrating wild caribou and Lapp (Sami) ranchers who occasionally round up their herds of “domestic” reindeer. The reindeer appreciate that we humans kill off their natural predators (ie wolves) and provide a source of valuable salts (ie urine.) The origin of domestic goats and sheep probably looked similar, though the domestication of cattle was probably a more conscious decision given the bovines’ size.

The hunting of fish also looks a lot more like gathering or even farming, as a single resource area (eg, a bend in the river or a comfortable ocean bay) may be regularly exploited via nets, traps, rakes, weirs, etc.

Horticulture is a form of low-intensity agriculture (literally, gardening.) Some horticulturalists get most of their food from their gardens; others plant a few sprouted coconuts and otherwise get most of their food by hunting and fishing. Horticulture doesn’t require much technology (no plows needed) and typically doesn’t produce that many calories.

It is likely that many “hunter gatherers” understood the principle of “seeds sprout and turn into plants” and strategically planted seeds or left them in places where they wanted plants to grow for centuries or millennia before they began actively tending the resulting plants.

Many hunter-gatherer groups also practice active land management techniques. For example, a group of Melanesians in PNG that hunts crocodiles periodically burns the swamp in which the crocodiles live in order to prevent woody trees from taking over and making the swamp less swampy. By preserving the crocodiles’ habitat, they ensure there are plenty of crocodiles around for them to hunt. (I apologize for the lack of a link to a description of the group, but I saw it in a documentary about hunter-gatherers available on Netflix.)

Large-scale environment management probably also predates the adoption of formal agriculture by thousands of years.

Where the article goes wrong:

  1. Just because something is more complicated than the “simplified” version you commonly hear doesn’t mean, “There is no pattern, all is unknowable, nihilism now.”

Any simplified version of things is, by definition, simplified.

The idea that hunter-gatherers were uniquely peaceful and egalitarian is nonsense; if anything, the opposite may be true. Once you leave behind your preconceptions, you realize that the pattern isn’t “random noise” but that all forms of violence and oppression appear to be decreasing over time. Economies where you can get ahead by murdering your neighbors and stealing their wives have been largely replaced by economies where murdering your neighbors lands you in prison and women go to college. There’s still noise in the data–times we humans kill a lot of each other–but that doesn’t mean there is no pattern.

  2. Most hunter-gatherers did, in fact, spend most of their time in small communities

The Davids make a big deal out of the fact that hunter-gatherers who exploit seasonally migrating herds sometimes gather in large groups in order to exploit those herds.  They cite, for example:

Another example were the indigenous hunter-gatherers of Canada’s Northwest Coast, for whom winter – not summer – was the time when society crystallised into its most unequal form, and spectacularly so. Plank-built palaces sprang to life along the coastlines of British Columbia, with hereditary nobles holding court over commoners and slaves, and hosting the great banquets known as potlatch. Yet these aristocratic courts broke apart for the summer work of the fishing season, reverting to smaller clan formations, still ranked, but with an entirely different and less formal structure. In this case, people actually adopted different names in summer and winter, literally becoming someone else, depending on the time of year.

Aside from the fact that they are here citing a modern people as an argument about prehistoric ones (!), the Pacific North West is one of the world’s lushest environments with an amazing natural abundance of huntable (fishable) food. If I had to pick somewhere to ride out the end of civilization, the PNW (and New Zealand) would be high on my list. The material abundance available in the PNW is available almost nowhere else in the world–and wasn’t available to anyone before the First Nations arrived in the area around 13,000 years ago. Our stone-age ancestors 100,000 years ago in Africa certainly weren’t exploiting salmon in British Columbia.

Hunter-gatherers who exploit migrating resources sometimes get all of their year’s food in only 3 or 4 massive hunts. These hunts certainly can involve lots of people, as whole clans will want to work together to round up, kill, and process thousands of animals within the space of a few days.

Even the most massive of these gatherings, however, did not compare in size and scope to our modern cities. A few hundred Inuit might gather for the short arctic summer before scattering back to their igloos; the Mongol capital of Ulan Bator was oft described as nearly deserted as the nomadic herdsmen had little reason to remain in the capital when court was not in session.

(Also, the Davids’ description of Inuit life is completely backwards from the actual anthropology I have read; I’m wondering if they accidentally mixed up the Yupik Eskimo, who don’t go by the term “Inuit,” with the Canadian Eskimo, who do; I have not read about the Yupik, but if their lifestyles are different from the Inuit’s, this would explain the confusion.)

The Davids also cite the behavior of the 19th century Plains Indians, but this is waaay disconnected from any “primitive” lifestyle. Most of the Plains Indians had formerly been farmers before disease, guns, and horses, brought by the Spaniards, disrupted their lives. Without horses (or plows) the great plains and their bison herds were difficult to exploit, and people preferred to live in towns along local riverbanks, growing corn, squash, and beans.

We might generously call these towns “cities,” but none of them were the size of modern cities.

  3. Production of material wealth

Hunter-gathering, horticulture, fishing, and herding–even at their best–do not produce that much extra wealth. They are basically subsistence strategies; most people in these societies are directly engaged in food production and so can’t spend their time producing other goods. Nomads, of course, have the additional constraint that they can’t carry much with them under any circumstances.

A society can only have as much hierarchy as it can support. A nomadic tribe can have one person who tells everyone when to pack up and move to the next pasture, but it won’t produce enough food to support an entire class of young adults who do things other than produce food.

By contrast, in our modern, industrial society, less than 2% of people are farmers/ranchers. The other 98% of us are employed in food processing of some sort, careers not related to food at all, or unemployed.

This is why our society can produce parking lots that are bigger and more complex than the most impressive buildings ever constructed by hunter-gatherers.

The fact that, on a few occasions, hunter-gatherers managed to construct large buildings (and Stonehenge was not built by hunter-gatherers but by farmers; the impressive, large stones of Stonehenge were not part of the original layout but erected by a later wave of invaders who killed off 90% of Stonehenge’s original builders) does not mean the average hunter-gatherer lived in complex societies most of the time. They did not, because hunter-gathering could not support complex society, massive building projects, nor aristocracies most of the time.

It is only with the advent of agriculture that people started accumulating enough food that there was enough left over for any sort of formal, long-term state to start taxing. True, this doesn’t necessarily mean that agriculture has to result in formal states with taxes; it just means that it’s very hard to get those without agriculture. (The one exception is if a nomadic herding society like the Mongols conquers an agricultural state and takes over its taxes.)

In sum, yes, the “traditional story” is wrong–but not completely. History was more complicated, violent, and unequal than portrayed, but the broad outline of “smaller, simpler” hunter-gatherer societies giving way to “bigger, more complex” agricultural societies is basically correct. If anything, the lesson is that civilization has the potential to be a great force for good.

Anthropology Friday: Numbers and the Making of Us, by Caleb Everett pt. 4

Yes, but which 25% of us is grape?

Welcome to our final post on Numbers and the Making of Us: Counting and the Course of Human Cultures, by Caleb Everett. Today I just want to highlight a few interesting passages.

On DNA:

For example, there is about 25% overlap between the human genome and that of grapes. (And we have fewer genes than grapes!) So some caution should be exercised before reading too much into percentages of genomic correspondence across species. I doubt, after all, that you consider yourself one-quarter grape. … canine and bovine species generally exhibit about an 85% rate of genomic correspondence with humans. … small changes in genetic makeup can, among other influences, lead to large changes in brain size.

On the development of numbers:

Babylonian math homework

After all, for the vast majority of our species’ existence, we lived as hunters and gatherers in Africa … A reasonable interpretation of the contemporary distribution of cultural and number-system types, then, is that humans did not rely on complex number system for the bulk of their history. We can also reasonably conclude that transitions to larger, more sedentary, and more trade-based cultures helped pressure various groups to develop more involved numerical technologies. … Written numerals, and writing more generally, were developed first in the Fertile Crescent after the agricultural revolution began there. … These pressures ultimately resulted in numerals and other written symbols, such as the clay-token based numerals … The numerals then enabled new forms of agriculture and trade that required the exact discrimination and representation of quantities. The ancient Mesopotamian case is suggestive, then, of the motivation for the present-day correlation between subsistence and number types: larger agricultural and trade-based economies require numerical elaboration to function. …

Intriguingly, though, the same may be true of Chinese writing, the earliest samples of which date to the Shang Dynasty and are 3,000 years old. The most ancient of these samples are oracle bones. These bones were inscribed with numerals quantifying such items as enemy prisoners, birds and animals hunted, and sacrificed animals. … Ancient writing around the world is numerically focused.

Changes in the Jungle as population growth makes competition for resources more intense and forces people out of their traditional livelihoods:

Consider the case of one of my good friends, a member of an indigenous group known as the Karitiana. … Paulo spent the majority of his childhood, in the 1980s and 1990s in the largest village of his people’s reservation. … While some Karitiana sought to make a living in nearby Porto Velho, many strived to maintain their traditional way of life on their reservation. At the time this was feasible, and their traditional subsistence strategies of hunting, gathering, and horticulture could be realistically practiced. Recently, however, maintaining their conventional way of life has become a less tenable proposition. … many Karitiana feel they have little choice but to seek employment in the local Brazilian economy… This is certainly true of Paulo. He has been enrolled in Brazilian schools for some time, has received some higher education, and is currently employed by a governmental organization. To do these things, of course, Paulo had to learn Portuguese grammar and writing. And he had to learn numbers and math, also. In short, the socioeconomic pressures he has felt to acquire the numbers of another culture are intense.

Everett cites a statistic that >90% of the world’s approximately 7,000 languages are endangered.

They are endangered primarily because people like Paulo are being conscripted into larger nation-states, gaining fluency in more economically viable languages. … From New Guinea to Australia to Amazonia and elsewhere, the mathematizing of people is happening.

On the advantages of different number systems:

Recent research also suggests that the complexity of some non-linguistic number systems has been underappreciated. Many counting boards and abaci that have been used, and are still in use across the world’s cultures, present clear advantages to those using them … the abacus presents some cognitive advantages. That is because, research now suggests, children who are raised using the abacus develop a “mental abacus” with time. … According to recent cross-cultural findings, practitioners of abacus-based mathematical strategies outperform those unfamiliar with such strategies, at least in some mathematical tasks. The use of the Soroban abacus has, not coincidentally, now been adopted in many schools throughout Asia.
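As a concrete illustration of why the soroban maps so cleanly onto mental arithmetic (my own sketch, not from Everett): each rod carries one “heaven” bead worth five and four “earth” beads worth one, so every decimal digit decomposes uniquely into a quotient and remainder mod five.

```python
def soroban_digit(d: int) -> tuple[int, int]:
    """Decompose one decimal digit into soroban bead positions:
    (heaven beads lowered, earth beads raised), where the single
    heaven bead is worth 5 and each of the four earth beads is worth 1."""
    if not 0 <= d <= 9:
        raise ValueError("a soroban rod holds a single decimal digit")
    return d // 5, d % 5

def soroban_number(n: int) -> list[tuple[int, int]]:
    """Represent a whole number as one (heaven, earth) pair per rod."""
    return [soroban_digit(int(c)) for c in str(n)]

print(soroban_digit(7))    # → (1, 2): one five-bead plus two one-beads
print(soroban_number(605)) # → [(1, 1), (0, 0), (1, 0)]
```

The mental-abacus studies reportedly suggest that practiced users manipulate exactly this bead representation in visual imagery, rather than digit symbols, which may be where the speed advantage comes from.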

The zero is a dot in the middle of the photo–earliest known zero, Cambodia

I suspect these higher math scores are more due to the mental abilities of the people using the abacus than the abacus itself. I have also just ordered an abacus.

… in 2015 the world’s oldest known unambiguous inscription of a circular zero was rediscovered in Cambodia. The zero in question, really a large dot, serves as a placeholder in the ancient Khmer numeral for 605. It is inscribed on a stone tablet, dating to 683 CE, that was found only kilometers from the faces of Bayon and other ruins of Angkor Wat and Angkor Thom. … the Maya also developed a written form for zero, and the Inca encoded the concept in their Quipu.

In 1202, Fibonacci wrote the Book of Calculation, which promoted the use of the superior Arabic (yes, Hindu) numerals (zero included) over the old Roman ones. Just as the introduction of writing jump-started the Cherokee publishing industry, so the introduction of superior numerals probably helped jump-start the Renaissance.

Cities and the rise of organized religion:

…although creation myths, animistic practices, and other forms of spiritualism are universal or nearly universal, large-scale hierarchical religions are restricted to relatively few cultural lineages. Furthermore, these religions… developed only after people began living in larger groups and settlements because of their agricultural lifestyles. … A phalanx of scholars has recently suggested that the development of major hierarchical religions, like the development of hierarchical governments, resulted from the agglomeration of people in such places. …

Organized religious beliefs, with moral-enforcing deities and a priest caste, were a by-product of the need for large groups of people to cooperate via shared morals and altruism. As the populations of cultures grew after the advent of agricultural centers… individuals were forced to rely on shared trust with many more individuals, including non-kin, than was or is the case in smaller groups like bands or tribes. … Since natural selection is predicated on the protection of one’s genes, in-group altruism and sacrifice are easier to make sense of in bands and tribes. But why would humans in much larger populations–humans who have no discernible genetic relationship… cooperate with these other individuals in their own culture? … some social mechanism had to evolve so that larger cultures would not disintegrate due to competition among individuals and so that many people would not freeload off the work of others. One social mechanism that fosters prosocial and cooperative behavior is an organized religion based on shared morals and omniscient deities capable of keeping track of the violation of such morals. …

When Moses descended from Mt. Sinai with his stone tablets, they were inscribed with ten divine moral imperatives. … Why ten? … Here is an eleventh commandment that could likely be uncontroversially adopted by many people: “thou shalt not torture.” … But then the list would appear to lose some of its rhetorical heft. “Eleven commandments” almost hints of a satirical deity.

Technically there are 613 commandments, but that’s not nearly as catchy as the Ten Commandments–inadvertently proving Everett’s point.

Overall, I found this book frustrating and repetitive, but there were some good parts. I’ve left out most of the discussion of the Piraha and similar cultures, and the rather fascinating case of Nicaraguan homesigners (“homesigners” are deaf people who were never taught a formal sign language but made up their own.) If you’d like to learn more about them, you might want to look up the book at your local library.

Anthropology Friday: Numbers and the Making of Us, by Caleb Everett, pt 3

Welcome back to our discussion of Numbers and the Making of Us: Counting and the Course of Human Cultures, by Caleb Everett.

The Piraha are a small tribe (about 420) of Amazonian hunter-gatherers whose language is nearly unique: it has no numbers, and you can whistle it. Everett spent much of his childhood among the Piraha because his parents were missionaries, which probably makes him one of the world’s foremost non-Piraha experts on the Piraha.

Occasionally as a child I would wake up in the jungle to the cacophony of people sharing their dreams with one another–impromptu monologues followed by spurts of intense feedback. The people in question, a fascinating (to me anyhow) group known as the Piraha, are known to wake up and speak to their immediate neighbors at all hours of the night. … the voices suggested the people in the village were relaxed and completely unconcerned with my own preoccupations. …

The Piraha village my family lived in was reachable via a one-week sinuous trip along a series of Amazonian tributaries, or alternatively by a one-hour flight in a Cessna single-engine airplane.

Piraha culture is, to say the least, very different from ours. Everett cites studies of Piraha counting ability in support of his idea that our ability to count past 3 is a culturally acquired skill–that is, we can only count because we grew up in a numeric society where people taught us numbers, and the Piraha can’t count because they grew up in an anumeric society that not only lacks numbers, but lacks various other abstractions necessary for making sense of numbers. Our innate, genetic numerical abilities (the ability to count to three and distinguish between small and large amounts), he insists, are the same.

You see, the Piraha really can’t count. Line up 3 spools of thread and ask them to make an identical line, and they can do it. Line up 4 spools of thread, and they start getting the wrong number of spools. Line up 10 spools of thread, and it’s obvious that they’re just guessing and you’re wasting your time. Put five nuts in a can, then take two out and ask how many nuts are left: you get a response on the order of “some.”*

And this is not for lack of trying. The Piraha know other people have these things called “numbers.” They once asked Everett’s parents, the missionaries, to teach them numbers so they wouldn’t get cheated in trade deals. The missionaries tried for 8 months to teach them to count to ten and add small sums like 1 + 1. It didn’t work and the Piraha gave up.

Despite these difficulties, Everett insists that the Piraha are not dumb. After all, they survive in a very complex and demanding environment. He grew up with them; many of them are his personal friends, and he regards them as mentally normal people with the exact same genetic abilities as everyone else, who just lack the culturally acquired skill of counting.

After all, on a standard IQ scale, someone who cannot even count to 4 would be severely if not profoundly retarded, institutionalized and cared for by others. The Piraha obviously live independently, hunt, raise, and gather their own food, navigate through the rainforest, raise their own children, build houses, etc. They aren’t building aqueducts, but they are surviving perfectly well outside of an institution.

Everett neglects the possibility that the Piraha are otherwise normal people who are innately bad at math.

Normally, yes, different mental abilities correlate because they depend highly on things like “how fast is your brain overall” or “were you neglected as a child?” But people also vary in their mental abilities. I have a friend who is above average in reading and writing abilities, but is almost completely unable to do math. This is despite being raised in a completely numerate culture, going to school, etc.

This is a really obvious and life-impairing problem in a society like ours, where you have to use math to function; my friend has been marked since childhood as “not cognitively normal.” It would be a completely invisible non-problem in a society like the Piraha, who use no math at all; in Piraha society, my friend would be “a totally normal guy” (or at least close.)

Everett states, explicitly, that not only are the Piraha’s abilities constrained solely by culture, but that other people’s abilities are also directly determined by their cultures:

What is probably more remarkable about the relevant studies, though, is that they suggest that climbing any rungs of the arithmetic ladder requires numbers. How high we climb the ladder is not the result of our own inherent intelligence, but a result of the language we speak and of the culture we are born into. (page 136)

This is an absurd claim. Even my own children, raised in identically numerate environments and possessing, on the global scale, nearly identical genetics, vary in math abilities. You are probably not identical in abilities to your relatives, childhood classmates, next-door neighbors, spouse, or office mates. We observe variations in mathematical abilities within cultures, families, cities, towns, schools, and virtually any group you choose that isn’t selected for math ability. We can’t all do calculus just because we happen to live in a culture with calculus textbooks.

In fact, there is an extensive literature (which Everett ignores) on the genetics of intelligence:

Various studies have found the heritability of IQ to be between 0.7 and 0.8 in adults and 0.45 in childhood in the United States.[6][18][19] It may seem reasonable to expect that genetic influences on traits like IQ should become less important as one gains experiences with age. However, that the opposite occurs is well documented. Heritability measures in infancy are as low as 0.2, around 0.4 in middle childhood, and as high as 0.8 in adulthood.[7] One proposed explanation is that people with different genes tend to seek out different environments that reinforce the effects of those genes.[6] The brain undergoes morphological changes in development which suggests that age-related physical changes could also contribute to this effect.[20]

A 1994 article in Behavior Genetics based on a study of Swedish monozygotic and dizygotic twins found the heritability of the sample to be as high as 0.80 in general cognitive ability; however, it also varies by trait, with 0.60 for verbal tests, 0.50 for spatial and speed-of-processing tests, and 0.40 for memory tests. In contrast, studies of other populations estimate an average heritability of 0.50 for general cognitive ability.[18]

In 2006, The New York Times Magazine listed about three quarters as a figure held by the majority of studies.[21]

Thanks to Jayman

In plain speak, this means that intelligence in healthy adults is about 70-80% genetic and the rest seems to be random chance (like whether you were dropped on your head as a child or had enough iodine). So far, no one has proven that things like whole language vs. phonics instruction or two parents vs. one in the household have any effect on IQ, though they might affect how happy you are.

(Childhood IQ is much more amenable to environmental changes like “good teachers,” but these effects wear off as soon as children aren’t being forced to go to school every day.)
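For readers wondering how twin studies arrive at numbers like these, a classic back-of-the-envelope estimator is Falconer’s formula, h² = 2(r_MZ − r_DZ): identical twins share roughly twice the segregating genes of fraternal twins, so doubling the gap between their trait correlations isolates the genetic share. The correlations below are illustrative stand-ins, not figures from the studies quoted above.

```python
def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Falconer's formula: estimate heritability from the trait
    correlation of identical (MZ) vs. fraternal (DZ) twin pairs.
    MZ pairs share ~100% of segregating genes, DZ pairs ~50%, so
    doubling the difference in correlations estimates h^2."""
    return 2 * (r_mz - r_dz)

# Hypothetical adult-sample correlations, chosen for illustration:
print(round(falconer_h2(0.85, 0.45), 2))  # → 0.8, in the adult range cited
```

The formula is deliberately crude (it ignores shared-environment and assortative-mating corrections that real behavior-genetic models include), but it shows why larger MZ-DZ gaps imply higher heritability.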

A full discussion of the scientific literature is beyond our current scope, but if you aren’t convinced about the heritability of IQ–including math abilities–I urge you to go explore the literature yourself–you might want to start with some of Jayman’s relevant FAQs on the subject.

Everett uses experiments done with the Piraha to support his claim that mathematical ability is culturally dependent, but this depends on his claim that the Piraha are cognitively identical to the rest of us in innate mathematical ability. Given that normal people are not cognitively identical in innate mathematical abilities, and mathematical abilities vary, on average, between groups (this is why people buy “Singapore Math” books and not “Congolese Math”), there is no particular reason to assume Piraha and non-Piraha are cognitively identical. Further, there’s no reason to assume that any two groups are cognitively identical.

Mathematics only really got started when people invented agriculture, as they needed to keep track of things like “How many goats do I have?” or “Have the peasants paid their taxes?” A world in which mathematical ability is useful will select for mathematical ability; a world where it is useless cannot select for it.

Everett may still be correct that you wouldn’t be able to count if you hadn’t been taught how, but the Piraha can’t prove that one way or another. He would first have to show that Piraha who are raised in numerate cultures (say, by adoption) are just as good at calculus as people from Singapore or Japan, but he cites no adoption studies nor anything else to this end. (And adoption studies don’t even show that for the groups we have studied, like whites, blacks, or Asians.)

Let me offer a cognitive contrast:

The Piraha are an anumeric, illiterate culture. They have encountered both letters and numbers, but not adopted them.

The Cherokee were once illiterate: they had no written language. Around 1809, an illiterate Cherokee man, Sequoyah, observed whites reading and writing letters. In a flash of insight, Sequoyah understood the concept of “use a symbol to encode a sound” even without being taught to read English. He developed his own alphabet (really a syllabary) for writing Cherokee sounds and began teaching it to others. Within 5 years of the syllabary’s completion, a majority of the Cherokee were literate; they soon had their own publishing industry producing Cherokee-language books and newspapers.

The Cherokee, though illiterate, possessed the innate ability to be literate, if only exposed to the cultural idea of letters. Once exposed, literacy spread rapidly–instantly, in human cultural evolution terms.

By contrast, the Piraha, despite their desire to adopt numbers, have not been able to do so.

(Yet. With enough effort, the Piraha probably can learn to count–after all, there are trained parrots who can count to 8. It would be strange if they permanently underperformed parrots. But it’s a difficult journey.)

That all said, I would like to make an anthropological defense of anumeracy: numeracy, as in ascribing exact values to specific items, is more productive in some contexts than others.

Do you keep track of the exact values of things you give your spouse, children, or close friends? If you invite a neighbor over for a meal, do you mark down what it cost to feed them and then expect them to feed you the same amount in return? Do you count the exact value of gifts and give the same value in return?

In Kabloona, de Poncins discusses the quasi-communist nature of the Eskimo economic system. For the Eskimo, hunter-gatherers living in the world’s harshest environment, the unit of exchange isn’t the item, but survival. A man whom you keep alive by giving him fish today is a man who can keep you alive by giving you fish tomorrow. Declaring that you will only give a starving man five fish because he previously gave you five fish will do you no good at all if he starves from not enough fish and can no longer give you some of his fish when he has an excess. The fish have, in this context, no innate, immutable value–they are as valuable as the life they preserve. To think otherwise would kill them.

It’s only when people have goods to trade, regularly, with strangers, that they begin thinking of objects as having defined values that hold steady over different transactions. A chicken is more valuable if I am starving than if I am not, but it has an identical value whether I am trading it for nuts or cows.

So it is not surprising that most agricultural societies have more complicated number systems than most hunter-gatherer societies. As Everett explains:

Led by Patience Epps of the University of Texas, a team of linguists recently documented the complexity of the number systems in many of the world’s languages. In particular, the researchers were concerned with the languages’ upper numerical limit–the highest quantity with a specific name. …

We are fond of coining new names for numbers in English, but the largest commonly used number name is googol (googolplex I define as an operation), though there are bigger ones, like Graham’s number.

The linguistic team in question found the upper numerical limits in 193 languages of hunter-gatherer cultures in Australia, Amazonia, Africa, and North America. Additionally, they examined the upper limits of 204 languages spoken by agriculturalists and pastoralists in these regions. They discovered that the languages of hunter-gatherer groups generally have low upper limits. This is particularly true in Australia and Amazonia, the regions with so-called pure hunter-gatherer subsistence strategies.

In the case of the Australian languages, the study in question observed that more than 80 percent are limited numerically, with the highest quantity represented in such cases being only 3 or 4. Only one Australian language, Gamilaraay, was found to have an upper limit above 10, and its highest number is for 20. … The association [between hunter-gathering and limited numbers] is also robust in South America and Amazonia more specifically. The languages of hunter-gatherer cultures in this region generally have upper limits below ten. Only one surveyed language … Huaorani, has numbers for quantities greater than 20. Approximately two-thirds of the languages of such groups in the region have upper limits of five or less, while one-third have an upper limit of 10. Similarly, about two-thirds of African hunter-gatherer languages have upper limits of 10 or less.

There are a few exceptions–agricultural societies with very few numbers, and hunter-gatherers with relatively large numbers of numbers, but:

…there are no large agricultural states without elaborate number systems, now or in recorded history.

So how did the first people develop numbers? Of course we don’t know, but Everett suggests that at some point we began associating collections of things, like shells, with the cluster of fingers found on our hands. One finger, one shell; five fingers, five shells–easy correspondences. Once we mastered five, we skipped forward to 10 and 20 rather quickly.
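The finger-and-shell pairing Everett describes is one-to-one correspondence, and it is worth seeing how far it gets on its own: two collections can be compared, without any number words at all, simply by pairing items off until one side runs out. A sketch of my own, not Everett’s:

```python
def compare_by_pairing(first, second) -> str:
    """Compare two collections without counting: repeatedly pair one
    item from each and discard the pair; whichever runs out first had less."""
    a, b = list(first), list(second)
    while a and b:
        a.pop()
        b.pop()
    if not a and not b:
        return "same"
    return "first has more" if a else "second has more"

print(compare_by_pairing(["shell"] * 5, ["finger"] * 5))  # → same
print(compare_by_pairing(["goat"] * 4, ["pebble"] * 3))   # → first has more
```

Note that pairing yields only more/less/same; attaching a stable name to the quantity itself is the separate, culturally transmitted step that number words provide.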

Everett proposes that some numeracy was a necessary prerequisite for agriculture, as agricultural people would need to keep track of things like seasons and equinoxes in order to know when to plant and harvest. I question this on the grounds that I myself don’t look at the calendar and say, “Oh look, it’s the equinox, I’d better plant my garden!” but instead look outside and say, “Oh, it’s getting warm and the grass is growing again, I’d better get busy.” The harvest is even more obvious: I harvest when the plants are ripe.

Of course, I live in a society with calendars, so I can’t claim that I don’t look at the calendar. I look at the calendar almost every day to make sure I have the date correct. So perhaps I am using my calendrical knowledge to plan my planting schedule without even realizing it because I am just so used to looking at the calendar.

“What man among you, if he has 100 sheep and has lost 1 of them, does not leave the 99 in the open pasture and go after the one which is lost until he finds it? When he has found it, he lays it on his shoulders, rejoicing.” Luke 15:3-5

Rather than develop numbers and then start planting barley and millet, I propose that humans first domesticated animals, like pigs and goats. At first people were content to have “a few,” “some,” or “many” animals, but soon they were inspired to keep better track of their flocks.

By the time we started planting millet and wheat (a couple thousand years later,) we were probably already pretty good at counting sheep.

Our fondness for tracking astronomical cycles, I suspect, began for less utilitarian reasons: they were there. The cycles of the sun, moon, and other planets were obvious and easy to track, and we wanted to figure out what they meant. We put a ton of work into tracking equinoxes and eclipses and the epicycles of Jupiter and Mars (before we figured out heliocentrism.) People ascribed all sorts of import to these cycles (“Communicator Mercury is retrograde in outspoken Sagittarius from December 3-22, mixing up messages and disrupting pre-holiday plans.”) that turned out to be completely wrong. Unless you’re a fisherman or sailor, the moon’s phases don’t make any difference in your life; the other planets’ cycles turned out to be completely useless unless you’re trying to send a space probe to visit them. Eclipses are interesting, but don’t have any real effects. For all of the effort we’ve put into astronomy, the most important results have been good calendars to keep track of dates and allow us to plan multiple years into the future.

Speaking of dates, let’s continue this discussion in a week–on the next Anthropology Friday.

*Footnote: Even though I don’t think the Piraha prove as much as Everett thinks they do, that doesn’t mean Everett is completely wrong. Maybe already having number words is (in the vast majority of cases) a necessary precondition for learning to count.

One potentially illuminating case Everett didn’t explore is how young children in numerate culture acquire numbers. Obviously they grow up in an environment with numbers, but below a certain age can’t really use them. Can children at these ages duplicate lines of objects or patterns? Or do they master that behavior only after learning to count?

Back in October I commented on Schiller and Peterson’s claim in Count on Math (a book of math curriculum ideas for toddlers and preschoolers) that young children must learn mathematical “foundation” concepts in a particular order, ie:

Developmental sequence is fundamental to children’s ability to build conceptual understanding. … The chapters in this book present math in a developmental sequence that provides children a natural transition from one concept to the next, preventing gaps in their understanding. …

When children are allowed to explore many objects, they begin to recognize similarities and differences of objects.

When children can determine similarities and differences, they can classify objects.

When children can classify objects, they can see similarities and difference well enough to recognize patterns.

When children can recognize, copy, extend and create patterns, they can arrange sets in a one-to-one relationship.

When children can match objects one to one, they can compare sets to determine which have more and which have less.

When children can compare sets, they can begin to look at the “manyness” of one set and develop number concepts.

This developmental sequence provides a conceptual framework that serves as a springboard to developing higher level math skills.

The Count on Math curriculum doesn’t even introduce the numbers 1-5 until week 39 for 4 year olds (3 year olds are never introduced to numbers) and numbers 6-10 aren’t introduced until week 37 for the 5 year olds!

Note that Schiller and Everett are arguing diametrical opposites–Everett says the ability to count to three and distinguish the “manyness” of sets is instinctual, present even in infants, but that the ability to copy patterns and match items one-to-one only comes after long acquaintance and practice with counting, specifically number words.

Schiller claims that children only develop the ability to distinguish manyness and count to three after learning to copy patterns and match items one-to-one.

As I said back in October, I think Count on Math’s claim is pure bollocks. If you miss the “comparing sets” day at preschool, you aren’t going to end up unable to multiply. The Piraha may not prove as much as Everett wants them to, but the neuroscience and animal studies he cites aren’t worthless. In general, I distrust anyone who claims that you must introduce this long a set of concepts in this strict an order just to develop a basic competency that the vast majority of people seem to acquire without difficulty.

Of course, Lynne Peterson is a real teacher with a real teacher’s certificate and a BA in … it doesn’t say, and Pam Schiller was Vice President of Professional Development for the Early Childhood Division at McGraw Hill publishers and president of the Southern Early Childhood Association. She has a PhD in… it doesn’t say. Here’s some more on Dr. Schiller’s many awards. So maybe they know better than Everett, who’s just an anthropologist. But Everett has some actual evidence on his side.

But I’m a parent who has watched several children learn to count… and Schiller and Peterson are wrong.

Anthropology Friday: Indian Warriors and their Weapons, (4/4) the Blackfeet, Apache, and Navajo

Map of Algonquian Language Family distribution

Hey everyone, today we’re wrapping up our look at Hofsinde Gray-Wolf’s account of Native American cultures in Indian Warriors and their Weapons, with a look at the Blackfeet, Apache, and Navajo.

The Blackfeet live primarily in Canada and partly in the northern United States, and speak an Algonquin language–Algonquin languages are (were) otherwise dominant primarily in eastern Canada and the US. The Apache and Navajo are related peoples from the American southwest who speak an Athabaskan language. The rest of the Athabaskan speakers, oddly, live primarily in northern Canada and inland Alaska (Inuit/Eskimo/Aleut cultures live on the Alaskan coasts.)

Map of Athabaskan Language Distribution

According to Wikipedia:

Historically, the member peoples of the [Blackfeet] Confederacy were nomadic bison hunters and trout fishermen, who ranged across large areas of the northern Great Plains of Western North America, specifically the semi-arid shortgrass prairie ecological region. They followed the bison herds as they migrated between what are now the United States and Canada, as far north as the Bow River. In the first half of the 18th century, they acquired horses and firearms from white traders and their Cree and Assiniboine go-betweens. The Blackfoot used these to expand their territory at the expense of neighboring tribes. Now riding horses, the Blackfoot and other Plains tribes could also extend the range of their buffalo hunts.

The systematic commercial bison hunting by white hunters in the 19th century nearly ended the bison herds and permanently changed Native American life on the Great Plains, since their primary food source was no longer abundant. Periods of starvation and deprivation followed, and the Blackfoot tribe was forced to adopt ranching and farming, settling in permanent reservations. In the 1870s, they signed treaties with both the United States and Canada, ceding most of their lands in exchange for annuities of food and medical aid, as well as help in learning to farm. Nevertheless, the Blackfoot have worked to maintain their traditional language and culture in the face of assimilationist policies of both the U.S. and Canada.

“Historically” as Wikipedia uses it here merely refers to “in the 1700s and 1800s.” The Blackfeet’s linguistic cousins on the eastern coast of the US, such as Pocahontas of the Tsenacommacah or Squanto of the Patuxet, were settled, agriculturalist people who raised corn, squash, and beans. It seems likely that the Blackfeet were originally similarly agricultural, only moving out into the Great Plains and adopting their nomadic, buffalo-based lifestyle after European colonists introduced horses to the New World. Without horses, following the herds on foot would have been very difficult–though perhaps they managed it.

Alfred Jacob Miller, Hunting Buffalo

According to Hofsinde Gray-Wolf:

“The traditional enemies of the Blackfeet were the Shoshoni, the Assiniboine, the Cree, and especially the Crow. Hostilities between these tribes were kept alive by continued raids upon each other, usually for revenge or to steal horses.

“The Blackfeet gave their highest tribal honor to the brave who captured an enemy’s horse, weapons, or ceremonial gear. … Parents asked him to perform the naming ceremony for their newborn baby boy. He was elected to perform special services at rituals and social affairs. These services added to the man’s wealth.”

EvX: I wonder if anyone has attempted to replicate Napoleon Chagnon’s quantitative work on reproductive success among the Yanomamo with other tribal societies. I’d love to know if warriors were similarly successful among the Blackfeet, for example. Back to Hofsinde Gray-Wolf:

“In the early 1800s the Missouri Fur Company started to construct a post at the mouth of the Bighorn River in Crow country. The Blackfeet thought these white people had allied themselves with the Crow. That alone was enough to set the Blackfeet on the war trail against them. … Time and time again the white men were killed, and their guns and personal belongings were taken. The Indians traded the furs to the British posts.

“After a few of these raids, most of the trappers gave up and were ready to seek their furs in less dangerous parts of the country. For years thereafter, few white men dared enter the Blackfeet country.”

According to Wikipedia:

Up until around 1730, the Blackfoot traveled by foot and used dogs to carry and pull some of their goods. They had not seen horses in their previous lands, but were introduced to them on the Plains, as other tribes, such as the Shoshone, had already adopted their use.[17]

Horses revolutionised life on the Great Plains and soon came to be regarded as a measure of wealth. Warriors regularly raided other tribes for their best horses. Horses were generally used as universal standards of barter. … An individual’s wealth rose with the number of horses accumulated, but a man did not keep an abundance of them. The individual’s prestige and status was judged by the number of horses that he could give away. …

After having driven the hostile Shoshone and Arapaho from the Northwestern Plains, the Niitsitapi began in 1800 a long phase of keen competition in the fur trade with their former Cree allies, which often escalated militarily. … by mid-century an adequate supply of horses became a question of survival. Horse theft was at this stage not only a proof of courage, but often a desperate contribution to survival, for many ethnic groups competed for hunting in the grasslands.

The Cree and Assiniboine continued horse raiding against the Gros Ventre … They had to withstand attacks of enemies with guns. In retaliation for Hudson’s Bay Company (HBC) supplying their enemies with weapons, the Gros Ventre attacked and burned in 1793 South Branch House of the HBC on the South Saskatchewan River near the present village of St. Louis, Saskatchewan.

Meanwhile, further south:

“Long ago the Apache and Navaho tribes of the Southwest were one people. Between the years 1200 and 1400, these Indians came down from the far north of Canada and Alaska, following a route along the eastern slopes of the Rocky Mountains. The tribes lived in small family camps instead of permanent villages, and their personal belongings were meager. A little over 400 years ago the Navajo separated from their Apache brothers. …

“The Apache were raiders. They raided for food, clothing, horses, guns, and slaves. To them raiding was a business, and a dangerous business, but the Apache raider was a past master at commando tactics, and he did not take risks. … He tried not to kill those he raided. In Apache wars it was considered far better to take the enemy as slaves, and thereby enlarge the tribe.”

EvX: It appears that the constant warfare had such a debilitating effect on tribal numbers that many tribes ended up relying on captives to keep their own numbers steady–though we must keep in mind that these tribes had also suffered unimaginable losses due to Western diseases. I have seen estimates that as much as 90% of the Indian population had already died before whites arrived in significant numbers in America, simply because their diseases spread much faster than they did.

Here is Wikipedia’s account of early Navajo history:

The Navajos are speakers of a Na-Dené Southern Athabaskan language … It is closely related to the Apache language, as the Navajos and Apaches are believed to have migrated from northwestern Canada and eastern Alaska, where the majority of Athabaskan speakers reside.[4] Speakers of various other Athabaskan languages located in Canada may still comprehend the Navajo language despite the geographic and linguistic deviation of the languages.[5]

Archaeological and historical evidence suggests the Athabaskan ancestors of the Navajos and Apaches entered the Southwest around 1400 CE.[7][8] The Navajo oral tradition is said to retain references of this migration.[citation needed]

Until contact with Pueblos and the Spanish, the Navajos were largely hunters and gatherers. The tribe adopted crop-farming techniques from the Pueblo peoples, growing mainly corn, beans, and squash. When the Spanish arrived, the Navajos began herding sheep and goats* as a main source of trade and food, with meat becoming an essential component of the Navajo diet. Sheep also became a form of currency and status symbols among the Navajos based on the overall quantity of herds a family maintained.[9][10] In addition, the practice of spinning and weaving wool into blankets and clothing became common and eventually developed into a form of highly valued artistic expression.

*Note that sheep and goats are not native to the Americas.

Geronimo, chief of the Apache

I find this progression of economic systems fascinating. Here we have three groups. First, a group of Athabaskan hunter-gatherers decided, for unknown reasons, to leave their frigid, far northern homeland and migrate to the baking heat of the American Southwest. (Perhaps they were driven out of their original homes by the arrival of the Inuit/Eskimo?) Here they encountered the already established Pueblo peoples, who IIRC are related to the Aztecs of Mexico, an advanced civilization. The Pueblo people built cities and raised crops, a lifestyle the Athabaskan newcomers started adopting, or at least trading with.

Then the Spaniards arrived, with their domesticated animals. One group of Athabaskans, the Navajo, decided to adopt sheep and goats, becoming pastoralist/agriculturalists. Another group, the Apache, decided to adopt the horse and fully realize their hunter-gatherer potential.

But back to Hofsinde Gray-Wolf:

“Although the Apache method of attack was devious, it was not cowardly. Cochise, with less than two hundred warriors, held off the United States army for more than ten years. He was a great leader and did not risk the life of any of his warriors in attacks on wagon trains or supply trains. He did not even attack small caravan patrols outright; instead he literally wore them down.

“A typical attack followed this pattern: from high on the rocks and cliffs an Apache band followed a group of white travelers, showing themselves from time to time, then silently vanishing again. Ahead and behind them the travelers saw smoke rising from signal fires, never knowing what it might mean. With the Apaches trailing them night and day, the nerves of the white men became frayed. They had little time for rest and even less for sleep. Water holes were few and far between, and when they finally reached one, it was usually occupied by hostile Apache. … When at long last nerves had been strained to the breaking point… it was time to expect a raid. …

“The Apache were excellent horsemen, and small groups of them were able to raid and terrorize large areas. These raids, thefts, and captures lasted for two hundred years. Only after the Americans arrived around 1850 was any attempt made to stop them, and this effort took forty years.

“When the Apache first migrated into the Southwest, one weapon they possessed was the arctic-type bow. It was of Asiatic origin, and far superior to any bow then made in their new homeland. …

“The sign of the cross existed in much of the Apache symbolism, but it held no Christian meaning for them. It represented the four cardinal points and the four winds. Thus a warrior painted a cross on the foot of his moccasins before he went into strange country, in hopes that it would keep him from becoming lost. …

“As early as 1538 a Spanish priest wrote about the Navaho and called them Apache del Navahu. …

“Even Navaho women went to war, and thereby gained high positions within the tribe. War usually meant a raid on one of the peaceful Pueblo tribes or on a Mexican village. …

“Raids on other tribes were conducted primarily to capture slaves. … Unlike the Apache, they did not torture their captives, though at times they did take scalps.”

EvX: This brings us to the end of this series; I hope you have enjoyed it, not just for the glances back at the history of the peoples of America (and Canada,) but also for a look at the sort of books children in the 50s were reading.

 

Anthropology Friday: Indian Warriors and their Weapons: Iroquois Confederacy (2/4)

Welcome back to Anthropology Friday. Today we’re continuing Hofsinde Gray-Wolf’s series about Native American culture with selections from Indian Warriors and their Weapons. We’ll specifically be reading about the Iroquois Confederacy, also known as the Six Nations (née Five Nations).

As usual, I’ll be using “” instead of blockquotes for Hofsinde’s portions.

“The confederacy of the Iroquois, called the Five Nations, was formed, in part, to keep peace among the member tribes. … Around 1722 the Tuscarora from the Carolinas joined the Longhouse, after having been driven out of their own land by the white men. As the Tuscarora were of Iroquois linguistic stock, they were readily admitted by the original members, and the name of the league was changed to the Six Nations.

Map of the New York tribes before European arrival, Iroquois in pink, Algonquin in orange (a great many also lived in Canada.)

“The Iroquois lived in northern New York. As warriors, they were so fierce that by the end of the seventeenth century they controlled the land and many of the tribes, from the Ottawa River in Ohio south to the Cumberland River in Tennessee, and westward from Maine to Lake Michigan. They made friends with the early Dutch, from whom they obtained firearms, and with these new weapons of war they became even bolder. Iroquois moccasins left imprints as far west as the Black Hills of South Dakota. The warriors fought the Catawbas in South Carolina, and they invaded the villages of the Creeks in Florida. …

“Most Indians usually formed small war parties under a leader, but the Iroquois often mustered large armies. In 1654, for example, a party of 1800 Iroquois attacked a village of the Erie, a Pennsylvania tribe of Iroquois blood, which had between 3000 and 4000 warriors. So fiercely did the New York Iroquois fight that even against such odds they were victorious. At another time in their bloody history, a party of Mohawk and Seneca Indians numbering close to 1000 invaded the Huron north of Toronto, Canada. In two days of fighting they burned two Huron towns, took untold captives, and returned home with much loot.

“Captives, including men, women, and children, were always taken on such raids. The captive men replaced Iroquois husbands or sons lost in battle. The children were adopted into families, and the captive women often married into the tribe. Those not so fortunate became slaves… Captives served to keep the tribe large and strong.”

EvX: The Wikipedia page on the Iroquois Confederacy is pretty interesting. In the debate over etymology section, this historical bit stood out:

Peter Bakker has proposed a Basque origin for “Iroquois”. Basque fishermen and whalers are known to have frequented the waters of the Northeast in the 1500s, so much so that a Basque-based pidgin developed for communication with the Algonquian tribes of the region. Bakker claims that it is unlikely that “-quois” derives from a root specifically used to refer to the Iroquois, citing as evidence that several other Indian tribes of the region were known to the French by names terminating in the same element, e.g. “Armouchiquois”, “Charioquois”, “Excomminquois”, and “Souriquois”. He proposes instead that the word derives from hilokoa (via the intermediate form irokoa), from the Basque roots hil “to kill”, ko (the locative genitive suffix), and a (the definite article suffix). In favor of an original form beginning with /h/, Bakker cites alternate spellings such as “hyroquois” sometimes found in documents from the period, and the fact that in the Southern dialect of Basque the word hil is pronounced il. He also argues that the /l/ was rendered as /r/ since the former is not attested in the phonemic inventory of any language in the region (including Maliseet, which developed an /l/ later). Thus the word according to Bakker is translatable as “the killer people,” and is similar to other terms used by Eastern Algonquian tribes to refer to the Iroquois which translate as “murderers”.[12][13]

*Adds this to her list of speculations about Basque and Portuguese fishing routes*

With the formation of the League, the impact of internal conflicts was minimized, the council of fifty thereafter ruled on disputes,[36] displacing raiding traditions and most of the impulsive actions by hotheaded warriors onto surrounding peoples. This allowed the Iroquois to increase in numbers while pushing down rival nations’ numbers.[36] The political cohesion of the Iroquois rapidly became one of the strongest forces in 17th- and 18th-century northeastern North America; though only occasionally used as representations of all five tribes until about 1678,[36] when negotiations between the governments of Pennsylvania and New York seemed to awake the power.[36] Thereafter, the editors of American Heritage write the Iroquois became very adroit at playing the French off against the British,[36] as individual tribes had played the Swedes, Dutch, and English.[36]

Iroquoisball

Anyway, since the Iroquois Confederacy predates the arrival of written records in the area, it’s not clear exactly when it formed. Some people claim 1142 AD; others claim around 1450. I’m sure these claims are fraught with personal/political ideologies and biases, but someone has to be correct.

The Iroquois are a mix of horticulturalists, farmers, fishers, gatherers and hunters, though their main diet traditionally has come from farming. The main crops they cultivated are corn, beans and squash, which were called the three sisters (De-oh-há-ko) and are considered special gifts from the Creator. These crops are grown strategically. The cornstalks grow, the bean plants climb the stalks, and the squash grow beneath, inhibiting weeds and keeping the soil moist under the shade of their broad leaves. In this combination, the soil remained fertile for several decades. The food was stored during the winter, and it lasted for two to three years. When the soil in one area eventually lost its fertility, the Haudenosaunee moved their village.

Gathering is the traditional job of the women and children. Wild roots, greens, berries and nuts were gathered in the summer. During spring, sap is tapped from the maple trees and boiled into maple syrup, and herbs are gathered for medicine. The Iroquois hunted mostly deer but also other game such as wild turkey and migratory birds. Muskrat and beaver were hunted during the winter. Fishing was also a significant source of food because the Iroquois had villages mostly in the St. Lawrence area. They fished salmon, trout, bass, perch and whitefish until the St. Lawrence became too polluted by industry. In the spring the Iroquois netted, and in the winter fishing holes were made in the ice.[112] Allium tricoccum is also a part of traditional Iroquois cuisine.[113]

Apparently the Cherokee are also an Iroquoian-speaking people (not all Iroquoian-language-speaking peoples were part of the Confederacy.) I’ll be writing more about the Cherokee later, but I find this rather significant–the Cherokee are notable for having developed their own writing system after simply observing Europeans reading letters, and soon had their own printing presses, newspapers, books, etc. The Iroquois had a stable, long-term political organization based on mutual agreement rather than conquest. The Cherokee sent aid to the Irish during the Great Potato Famine; the Iroquois declared war on Germany in 1917 and again in 1942.

When Europeans first arrived in North America, the Haudenosaunee were based in what is now the northeastern United States, primarily in what is referred to today as Central New York west of the Hudson River and through the Finger Lakes region, and upstate New York along the St. Lawrence River area downstream to today’s Montreal.[26]

French, Dutch and British colonists in both Canada and the Thirteen Colonies recognized a need to gain favor with the Iroquois people, who occupied a significant portion of lands west of colonial settlements. In addition, these peoples established lucrative fur trading with the Iroquois, which was favorable to both sides. The colonists also sought to establish positive relations to secure their borders.

For nearly 200 years the Iroquois were a powerful factor in North American colonial policy-making decisions. Alignment with Iroquois offered political and strategic advantages to the colonies but the Iroquois preserved considerable independence. Some of their people settled in mission villages along the St. Lawrence River, becoming more closely tied to the French. While they participated in French raids on Dutch and later English settlements, where some Mohawk and other Iroquois settled, in general the Iroquois resisted attacking their own peoples.

The Iroquois remained a politically unique, undivided, large Native American polity up until the American Revolution. The League kept its treaty promises to the British Crown. But when the British were defeated, they ceded the Iroquois territory without consultation; many Iroquois had to abandon their lands in the Mohawk Valley and elsewhere and relocate in the northern lands retained by the British. …

The explorer Robert La Salle in the 17th century identified the Mosopelea as among the Ohio Valley peoples defeated by the Iroquois[47] in the early 1670s, whereas the Erie and peoples of the upper Allegheny valley were known to have fallen earlier during the Beaver Wars, while by 1676 the Susquehannock[e] were known to be broken as a power between three years of epidemic disease, war with the Iroquois, and frontier battles as settlers took advantage of the weakened tribe.[36]

According to one theory of early Iroquois history, after becoming united in the League, the Iroquois invaded the Ohio River Valley in the territories that would become the eastern Ohio Country down as far as present-day Kentucky to seek additional hunting grounds. They displaced about 1200 Siouan-speaking tribepeople of the Ohio River valley, such as the Quapaw (Akansea), Ofo (Mosopelea), and Tutelo and other closely related tribes out of the region. These tribes migrated to regions around the Mississippi River and the piedmont regions of the east coast.[48] …

Beginning in 1609, the League engaged in a decades-long series of wars, the so-called Beaver Wars, against the French, their Huron allies, and other neighboring tribes, including the Petun, Erie, and Susquehannock. Trying to control access to game for the lucrative fur trade, they put great pressure on the Algonquian peoples of the Atlantic coast (the Lenape or Delaware), the Anishinaabe peoples of the boreal Canadian Shield region, and not infrequently fought the English colonies as well. During the Beaver Wars, they were said to have defeated and assimilated the Huron (1649), Petun (1650), the Neutral Nation (1651),[53][54]Erie Tribe (1657), and Susquehannock (1680).[55] The traditional view is that these wars were a way to control the lucrative fur trade in order to access European goods on which they had become dependent.[56][page needed][57][page needed]

Recent scholarship has elaborated on this view, arguing that the Beaver Wars were an escalation of the “Mourning Wars”, which were an integral part of early Iroquoian culture.[58] This view suggests that the Iroquois launched large-scale attacks against neighboring tribes in order to avenge or replace the massive number of deaths resulting from battles or smallpox epidemics.

According to Wikipedia, “Total population for the five nations has been estimated at 20,000 before 1634. After 1635 the population dropped to around 6,800, chiefly due to the epidemic of smallpox introduced by contact with European settlers.[109]”

By the time of the American Revolution, their small numbers compared to the settlers, combined with the loss of their alliance with Britain, spelled the end of the Confederacy as a significant strategic force in the area. Today, though, their population has increased to 125,000 people, 45k in Canada and 80k in the US.

Finally:

Although the Iroquois are sometimes mentioned as examples of groups who practiced cannibalism, the evidence is mixed as to whether such a practice could be said to be widespread among the Six Nations, and to whether it was a notable cultural feature. Some anthropologists have found evidence of ritual torture and cannibalism at Iroquois sites, for example, among the Onondaga in the sixteenth century.[133][134] However, other scholars, most notably anthropologist William Arens in his controversial book, The Man-Eating Myth, have challenged the evidence, suggesting the human bones found at sites point to funerary practices, asserting that if cannibalism was practiced among the Iroquois, it was not widespread.[135] Modern anthropologists seem to accept the probability that cannibalism did exist among the Iroquois,[136] with Thomas Abler describing the evidence from the Jesuit Relations and archaeology as making a “case for cannibalism in early historic times…so strong that it cannot be doubted.”[137] Scholars are also urged to remember the context for a practice that now shocks the modern Western society. Sanday reminds us that the ferocity of the Iroquois’ rituals “cannot be separated from the severity of conditions … where death from hunger, disease, and warfare became a way of life”.[138]

The missionaries Johannes Megapolensis and François-Joseph Bressani, and the fur trader Pierre-Esprit Radisson present first-hand accounts of cannibalism among the Mohawk. A common theme is ritualistic roasting and eating the heart of a captive who has been tortured and killed.[110] “To eat your enemy is to perform an extreme form of physical dominance.”[139]

 

Entropy, Life, and Welfare (pt 1)

340px-dna_structurekeylabelled-pn_nobb

(This is Part 1. Part 2 and Part 3 are here.)

All living things are basically just homeostatic entropy reduction machines. The most basic cell, floating in the ocean, uses energy from sunlight to order its individual molecules, creating, repairing, and building copies of itself, which continue the cycle. As one article on the work of MIT physicist Jeremy England explains:

From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. Jeremy England … has derived a mathematical formula that he believes explains this capacity. The formula, based on established physics, indicates that when a group of atoms is driven by an external source of energy (like the sun or chemical fuel) and surrounded by a heat bath (like the ocean or atmosphere), it will often gradually restructure itself in order to dissipate increasingly more energy. …

This class of systems includes all living things. England then determined how such systems tend to evolve over time as they increase their irreversibility. “We can show very simply from the formula that the more likely evolutionary outcomes are going to be the ones that absorbed and dissipated more energy from the environment’s external drives on the way to getting there,” he said. …

“This means clumps of atoms surrounded by a bath at some temperature, like the atmosphere or the ocean, should tend over time to arrange themselves to resonate better and better with the sources of mechanical, electromagnetic or chemical work in their environments,” England explained.

Self-replication (or reproduction, in biological terms), the process that drives the evolution of life on Earth, is one such mechanism by which a system might dissipate an increasing amount of energy over time. As England put it, “A great way of dissipating more is to make more copies of yourself.” In a September paper in the Journal of Chemical Physics, he reported the theoretical minimum amount of dissipation that can occur during the self-replication of RNA molecules and bacterial cells, and showed that it is very close to the actual amounts these systems dissipate when replicating.
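As I understand it, England’s bound (from that 2013 Journal of Chemical Physics paper) takes the form of a generalized second law. A rough sketch, in my own notation for the two macrostates:

$$
\beta \langle \Delta Q \rangle_{I \to II} \;+\; \ln\frac{\pi(II \to I)}{\pi(I \to II)} \;+\; \Delta S_{\mathrm{int}} \;\geq\; 0
$$

where $\beta$ is the inverse temperature of the surrounding heat bath, $\langle \Delta Q \rangle$ is the expected heat released into the bath during the transition from macrostate $I$ to macrostate $II$, the $\pi$ terms are the forward and reverse transition probabilities, and $\Delta S_{\mathrm{int}}$ is the change in internal entropy. The smaller the probability of the reverse transition relative to the forward one, i.e., the more irreversible the process (and replication is highly irreversible), the more heat must be dissipated along the way.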

Energy isn’t just important to plants, animals, and mitochondria. Everything from molecules to sand dunes, cities and even countries absorbs and dissipates energy. And like living things, cities and countries use energy to grow, construct buildings, roads, water systems, and even sewers to dispose of waste. Just as finding food and not being eaten are an animal’s first priorities, so energy policy and not being conquered are vital to a nation’s well-being.

Hunter-gatherer societies are, in most environments, the most energy-efficient–hunter gatherers expend relatively little energy to obtain food and build almost no infrastructure, resulting in a fair amount of time left over for leisure activities like singing, dancing, and visiting with friends.

But as the number of people in a group increases, hunter-gathering cannot scale. Putting in more hours hunting or gathering can only increase the food supply so much before you simply run out.

Horticulture and animal herding require more energy inputs–hoeing the soil, planting, harvesting, building fences, managing large animals–but create enough food output to support more people per square mile than hunter-gathering.

Agriculture requires still more energy, and modern industrial agriculture more energy still, but supports billions of people. Agricultural societies produced history’s first cities–civilizations–and (as far as I know) its first major collapses. Where the land is over-fished, over-farmed, or otherwise over-extracted, it stops producing, and complex systems dependent on that production collapse.

Senenu, an Egyptian scribe, grinding grain by hand, ca. 1352-1336 B.C.

I’ve made a graph to illustrate the relationship between energy input (work put into food production) and energy output (food, which of course translates into more people.) Note how changes in energy sources have driven our major “revolutions”–the first, not in the graph, was the taming and use of fire to cook our food, releasing more nutrients than mere chewing ever could. Switching from jaw power to fire power unlocked the calories necessary to fund the jump in brain size that differentiates humans from our primate cousins, chimps and gorillas.

That said, hunter gatherers (and horticulturalists) still rely primarily on their own power–foot power–to obtain their food.

Scheme of the Roman Hierapolis sawmill, the earliest known machine to incorporate a crank and connecting rod mechanism. Note the use of falling water to perform the work, rather than human muscles.

The Agricultural Revolution harnessed the power of animals–mainly horses and oxen–to drag plows and grind grain. The Industrial Revolution created engines and machines that released the power of falling water, wind, steam, coal, and oil, replacing draft animals with grist mills, tractors, combines, and trains.

Modern industrial societies have achieved their amazing energy outputs–allowing us to put a man on the moon and light up highways at night–via a massive infusion of energy, principally fossil fuels, vital to the production of synthetic fertilizers:

Nitrogen fertilizers are made from ammonia (NH3), which is sometimes injected into the ground directly. The ammonia is produced by the Haber-Bosch process.[5] In this energy-intensive process, natural gas (CH4) supplies the hydrogen, and the nitrogen (N2) is derived from the air. …

Deposits of sodium nitrate (NaNO3) (Chilean saltpeter) are also found in the Atacama desert in Chile and was one of the original (1830) nitrogen-rich fertilizers used.[12] It is still mined for fertilizer.[13]
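To get a sense of why natural gas is the key input, here is a back-of-the-envelope stoichiometry sketch. This is my own illustrative calculation, not from the source: it assumes ideal steam reforming (CH4 + H2O → CO + 3 H2) followed by ideal Haber-Bosch synthesis (N2 + 3 H2 → 2 NH3); real plants lose a substantial fraction of this to inefficiencies.

```python
# Idealized mass balance: how much ammonia per kg of methane?
# Steam reforming: CH4 + H2O -> CO + 3 H2   (3 mol H2 per mol CH4)
# Haber-Bosch:     N2 + 3 H2  -> 2 NH3      (2 mol NH3 per 3 mol H2)
M_CH4 = 16.04   # g/mol, molar mass of methane
M_NH3 = 17.03   # g/mol, molar mass of ammonia

mol_ch4 = 1000 / M_CH4      # moles of CH4 in 1 kg of methane
mol_h2 = 3 * mol_ch4        # H2 produced by reforming
mol_nh3 = mol_h2 * 2 / 3    # NH3 produced by Haber-Bosch
kg_nh3 = mol_nh3 * M_NH3 / 1000

print(f"{kg_nh3:.2f} kg NH3 per kg CH4 (ideal stoichiometry)")
```

Roughly two kilograms of ammonia per kilogram of methane is the theoretical ceiling; the point is simply that every ton of nitrogen fertilizer has a fossil-fuel input baked into it.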

Actual mountain of corn, because industrial agriculture is just that awesome

Other fertilizers are made of stone, mined from the earth, shipped, and spread on fields, all courtesy of modern industrial equipment, run on gasoline.

Without the constant application of fertilizer, we wouldn’t have these amazing crop yields:

In 2014, average yield in the United States was 171 bushels per acre. (And the world record is an astonishing 503 bushels, set by a farmer in Valdosta, Ga.) Each bushel weighs 56 pounds and each pound of corn yields about 1,566 calories. That means corn averages roughly 15 million calories per acre. (Again, I’m talking about field corn, a.k.a. dent corn, which is dried before processing. Sweet corn and popcorn are different varieties, grown for much more limited uses, and have lower yields.)
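That arithmetic is easy to check; a quick sketch using the figures quoted above:

```python
# Calories per acre of field corn, from the quoted figures
bushels_per_acre = 171
lbs_per_bushel = 56
cal_per_lb = 1566

cal_per_acre = bushels_per_acre * lbs_per_bushel * cal_per_lb
print(f"{cal_per_acre:,} calories per acre")  # 14,996,016, i.e. roughly 15 million
```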

As anyone who has grown corn will tell you, corn is a nutrient hog; all of those calories aren’t free. Corn must be heavily fertilized or the soil will run out and your farm will be worthless.

We currently have enough energy sources that the specific source–fossil fuels, hydroelectric, wind, solar, even animal–is not particularly important, at least for this discussion. Much more important is how society uses and distributes its resources. For, like all living things, a society that misuses its resources will collapse.

To be continued… Go on to Part 2 and Part 3.