Is Racism an Instinct?

Everyone is a little bit racist. –Hillary Clinton

If everyone in the world exhibits a particular behavior, chances are it’s innate. But I have been informed–by Harvard-educated people, no less–that humans do not have instincts. We are so smart, you see, that we don’t need instincts anymore.

This is nonsense, of course.

One amusing and well-documented human instinct is the nesting instinct, experienced by pregnant women shortly before going into labor. (As my father put it, “When she starts rearranging the furniture, get ready to head to the hospital.”) Having personally experienced this sudden, overwhelming urge to CLEAN ALL THE THINGS multiple times, I can testify that it is a real phenomenon.

Humans have other instincts–babies will not only pick up and try to eat pretty much anything they run across, to every parent’s consternation, but they will also crawl right up to puddles and attempt to drink out of them.

But we’re getting ahead of ourselves: What, exactly, is an instinct? According to Wikipedia:

Instinct or innate behavior is the inherent inclination of a living organism towards a particular complex behavior. The simplest example of an instinctive behavior is a fixed action pattern (FAP), in which a very short to medium length sequence of actions, without variation, are carried out in response to a clearly defined stimulus.

Any behavior is instinctive if it is performed without being based upon prior experience (that is, in the absence of learning), and is therefore an expression of innate biological factors. …

Instincts are inborn complex patterns of behavior that exist in most members of the species, and should be distinguished from reflexes, which are simple responses of an organism to a specific stimulus, such as the contraction of the pupil in response to bright light or the spasmodic movement of the lower leg when the knee is tapped.

The go-to example of an instinct is the gosling’s imprinting instinct. Typically, goslings imprint on their mothers, but a baby gosling doesn’t actually know what its mother is supposed to look like, and can accidentally imprint on other random objects, provided they are moving slowly near the nest around the time the gosling hatches.

Stray dog nursing kittens

Here we come to something I think may be useful for distinguishing an instinct from other behaviors: an instinct, once triggered, tends to keep going even if it has been accidentally or incorrectly triggered. Goslings look like they have an instinct to follow their mothers, but they actually have an instinct to imprint on the first large, slowly moving object near their nest when they hatch.

So if you find people strangely compelled to do something that makes no sense but which everyone else seems to think makes perfect sense, you may be dealing with an instinct. For example, women enjoy celebrity gossip because humans have an instinct to keep track of social ranks and dynamics within their own tribe; men enjoy watching other men play sports because it conveys the vicarious feeling of defeating a neighboring tribe at war.

So what about racism? Is it an instinct?

Strictly speaking–and I know I have to define racism, just a moment–I don’t see how we could have evolved such an instinct. Races exist because major human groups were geographically separated for thousands of years–prior to 1492, the average person never even met a person of another race in their entire life. So how could we evolve an instinct in response to something our ancestors never encountered?

Unfortunately, “racism” is a chimera, always changing whenever we attempt to pin it down, but the Urban Dictionary gives a reasonable definition:

An irrational bias towards members of a racial background. The bias can be positive (e.g. one race can prefer the company of its own race or even another) or it can be negative (e.g. one race can hate another). To qualify as racism, the bias must be irrational. That is, it cannot have a factual basis for preference.

Of course, instincts exist because they ensured our ancestors’ survival, so if racism is an instinct, it can’t exactly be “irrational.” We might call a gosling who follows a scientist instead of its mother “irrational,” but this is a misunderstanding of the gosling’s motivation. Since “racist” is a term of moral judgment, people are prone to defending their actions/beliefs towards others on the grounds that it can’t possibly be immoral to believe something that is actually true.

The claim that people are “racist” against members of other races implies, in converse, that they exhibit no similar behaviors toward members of their own race. But even the most perfunctory overview of history reveals people acting in extremely “racist” ways toward members of their own race. During the Anglo-Boer wars, the English committed genocide against the Dutch South Africans (Afrikaners). During WWII, Germans allied with the Japanese and slaughtered their neighbors, Poles and Jews. (Ashkenazim are genetically Caucasian and half Italian.) If Hitler were really racist, he’d have teamed up with Stalin and Einstein–his fellow whites–and dropped atomic bombs on Hiroshima. (And for their part, the Japanese would have allied with the Chinese against the Germans.)

Some quotes from a New Scientist article on lethal violence among chimpanzees:

The murder victim, a West African chimpanzee called Foudouko, had been beaten with rocks and sticks, stomped on and then cannibalised by his own community. …

“When you reverse that and have almost two males per every female — that really intensifies the competition for reproduction. That seems to be a key factor here,” says Wilson.

Jill Pruetz at Iowa State University, who has been studying this group of chimpanzees in south-eastern Senegal since 2001, agrees. She suggests that human influence may have caused this skewed gender ratio that is likely to have been behind this attack. In Senegal, female chimpanzees are poached to provide infants for the pet trade. …

Early one morning, Pruetz and her team heard loud screams and hoots from the chimps’ nearby sleep nest. At dawn, they found Foudouko dead, bleeding profusely from a bite to his right foot. He also had a large gash in his back and a ripped anus. Later he was found to have cracked ribs. Pruetz says Foudouko probably died of internal injuries or bled out from his foot wound.

Foudouko also had wounds on his fingers. These were likely to have been caused by chimps clamping them in their teeth to stretch his arms out and hold him down during the attack, says Pruetz.

After his death, the gang continued to abuse Foudouko’s body, throwing rocks and poking it with sticks, breaking its limbs, biting it and eventually eating some of the flesh.

“It was striking. The female that cannibalised the body the most, she’s the mother of the top two high-ranking males. Her sons were the only ones that really didn’t attack the body aggressively,” Pruetz says …

Historically, the vast majority of wars and genocides were waged by one group of people against their neighbors–people they were likely to be closely related to in the grand scheme of things–not against distant peoples they’d never met. If you’re a chimp, the chimp most likely to steal your banana is the one standing right in front of you, not some strange chimp you’ve never met before who lives in another forest.

Indeed, in Jane Goodall’s account of the Gombe Chimpanzee War, the combatants were not members of two unrelated communities that had recently encountered each other, but members of a single community that had split in two. Chimps who had formerly lived peacefully together, groomed each other, shared bananas, etc., now bashed each other’s brains out and cannibalized their young. Poor Jane was traumatized.

I think there is an instinct to form in-groups and out-groups. People often have multiple defined in-groups (“I am a progressive, a Christian, a baker, and a Swede”), but one of these identities generally trumps the others in importance. Ethnicity and gender are major groups most people seem to have, but I don’t see a lot of evidence suggesting that the grouping of “race” is uniquely special, globally, in people’s ideas of in-group and out-group.

For example, as I am writing today, people are concerned that Donald Trump is enacting racist policies toward Muslims, even though “Muslim” is not a race and most of the countries targeted by Trump’s travel/immigration ban are filled with fellow Caucasians, not Sub-Saharan Africans or Asians.

Race is a largely American obsession, because our nation (like the other North and South American nations) has always had whites, blacks, and Asians (Native Americans). But many countries don’t have this arrangement. Certainly Ireland didn’t have a historical black community, nor Japan a white one. Irish identity was formed in contrast to English identity; Japanese in contrast to Chinese and Korean.

Only in the context where different races live in close proximity to each other does it seem that people develop strong racial identities; otherwise people don’t think much about race.

Napoleon Chagnon, a white man, has spent years living among the Yanomamo, one of the world’s most murderous tribes, folks who go and slaughter their neighbors and neighbors’ children all the time, and they still haven’t murdered him.

Why do people insist on claiming that Trump’s “Muslim ban” is racist when Muslims aren’t a race? Because Islam is an identity group that appears to function similarly to race, even though Muslims may be white, black, or Asian.

If you’ve read any of the comments on my old post about Turkic DNA, Turkey: Not very Turkic, you’ll have noted that Turks are quite passionate about their Turkic identity, even though “Turkic” clearly doesn’t correspond to any particular ethnic group. (It’s even more mixed up than Jewish, and that’s a pretty mixed-up one after thousands of years of inter-breeding with non-Jews.)

Group identities are fluid. When threatened, groups merge; when resources are abundant and times are good, they split.

What about evidence that infants identify–stare longer at–faces of people of different races than their parents? This may be true, but all it really tells us is that babies are attuned to novelty. It certainly doesn’t tell us that babies are racist just because they find people interesting who look different from the people they’re used to.

What happens when people encounter others of a different race for the first time?

We have many accounts of “first contacts” between different races during the Age of Exploration. For example, when escaped English convict William Buckley wandered into an uncontacted Aborigine tribe, they assumed he was a ghost, adopted him, taught him to survive, and protected him for 30 years. By contrast, the last guy who landed on North Sentinel Island and tried to chat with the natives there got a spear to the chest and a shallow grave for his efforts. (But I am not certain the North Sentinelese haven’t encountered outsiders at some point.)

But what about the lunchroom seating habits of the wild American teenager?

If people have an instinct to form in-groups and out-groups, then races (or religions?) may represent the furthest bounds of this, at least until we encounter aliens. All else held equal, perhaps we are most inclined to like the people most like ourselves, and least inclined to like the people least like ourselves–racism would thus be the strongest manifestation of this broader instinct. But what about people who have a great dislike for one race, but seem just fine with another, e.g., a white person who likes Asians but not blacks, or a black person who likes Asians but not whites? And can we say–per our definition above–that these preferences are irrational, or are they born of some lived experience of positive or negative interactions?

Again, we are only likely to have strong opinions about members of other races if we are in direct conflict or competition with them. Most of the time, people are in competition with their neighbors, not people on the other side of the world. I certainly don’t sit here thinking negative thoughts about Pygmies or Aborigines, even though we are very genetically distant from each other, and I doubt they spend their free time thinking negatively about me.

Just because flamingos prefer to flock with other flamingos doesn’t mean they dislike horses; for the most part, I think people are largely indifferent to folks outside their own lives.

Antagonistic Selection and Invading Armies

We don't naturally look like this

Evolution is a fabulous principle, but it can only do so much. It has yet to give us titanium bones or x-ray vision, nor has it solved the problem of death. It even gives us creatures like praying mantises, who eat their mates.

Genetically speaking, men and women are actually quite similar, at least compared to, say, trees. There’s a great deal of overlap between male and female instincts–we both get hungry, we both fall in love, we both think the Ghostbusters remake was an abomination.

While evolution would like* to code for perfect men and perfect women, since we are the same species and every male has a mom and every female has a dad, genetics ultimately can’t code for perfect men and perfect women. *yes I am anthropomorphizing

Remember, there are only two sex chromosomes, X and Y; females are typically XX and males XY. Both men and women have at least one X, but women have no Y.

This doesn’t mean that men express half female genes and half male genes, since the Y chromosome blocks the expression of some of the female genes. However, men still carry those genes.

Sexual antagonism or “sexual conflict” occurs when a genetic trait that makes one sex better at reproducing makes the opposite sex worse at reproducing:

Interlocus sexual conflict is the interaction of a set of antagonistic alleles at one or more loci in males and females.[6] An example is conflict over mating rates. Males frequently have a higher optimal mating rate than females because in most animal species, they invest fewer resources in offspring than their female counterparts. Therefore, males have numerous adaptations to induce females to mate with them. Another well-documented example of inter-locus sexual conflict is the seminal fluid of Drosophila melanogaster, which up-regulates females’ egg-laying rate and reduces her desire to re-mate with another male (serving the male’s interests), but also shortens the female’s lifespan reducing her fitness.

From: A recent bottleneck of Y chromosome diversity coincides with a global change in culture

In humans, for example, women benefit from being thin and short, while men benefit from being tall and bulky. But a short, thin woman is more likely to have a short, thin son, which is not beneficial, and a tall, bulky man is likely to have a tall, bulky daughter–also not beneficial.

Whatever instincts we see in one gender, we likely see–in some form–in at least some members of the opposite gender. So if there is–as some folks around these parts allege–an instinct which makes women submissive to invading armies, then it likely affects some men, too.

For the few men who do survive an invasion, not protesting as your wife is gang-raped might keep you alive to later reproduce, too.

Hence the recent rise of cuckoldry fetishes.

Cannibalism, Abortion, and R/K Selection.

Reindeer herder, from “Quarter of a Million Reindeers to be Butchered… after Anthrax Outbreak”: “Russian officials have demanded a huge cull of 250,000 reindeer by Christmas over the risk of an anthrax outbreak. Currently 730,000 animals are being kept in the Yamal Peninsula and the rest of the Yamalo-Nenets region.”

In Hunters, Pastoralists, and Ranchers: Reindeer Economies and their Transformations [PDF], Ingold describes the social distribution of food among hunter-gatherers. In normal times, when food is neither super-abundant nor scarce, each family basically consumes what it brings in, without feeling any particular compulsion to share with their neighbors. In times of super-abundance, food is distributed throughout the tribe, often quite freely:

Since harvested animals, unlike a plant crop, will not reproduce, the multiplicative accumulation of material wealth is not possible within the framework of hunting relations of production. Indeed, what is most characteristic of hunting societies everywhere is the emphasis not on accumulation but on its obverse: the sharing of the kill, to varying degrees, amongst all those associated with the hunter. …

The fortunate hunter, when he returns to camp with his kill, is expected to play host to the rest of the community, in bouts of extravagant consumption.

The other two ethnographies I have read of hunter-gatherers (The Harmless People, about the Bushmen of the Kalahari, and Kabloona, about the Eskimo aka Inuit) both support this: large kills are communal feasts. Hunter-gatherers often have quite strict rules about how exactly a kill is to be divided, but the most important thing is that everyone gets some.

And this is eminently sensible–you try eating an entire giraffe by yourself, in the desert, before it rots.

Even in the arctic, where men can (in part of the year) freeze food for the future, your neighbor’s belly is as good as a freezer, because the neighbor you feed today will feed you tomorrow. Hunting is an activity that can be wildly successful one day and fail completely the next, so if hunters did not share with each other, soon each one would starve.

Whilst the successful hunter is required to distribute his spoils freely amongst his camp fellows, he does so with the assurance that in any future eventuality, when through bad luck he fails to find game, or through illness or old age he can no longer provide for himself and his family, he will receive in his turn. Were each hunter to produce only for his own domestic needs, everyone would eventually perish from hunger (Jochelson 1926:124). Thus, through its contribution to the survival and reproduction of potential producers, sharing ensures the perpetuation of society as a whole. …

Yet he is also concerned to set aside stocks of food to see his household through at least a part of the coming winter. The meat that remains after the obligatory festive redistribution is therefore placed in the household’s cache, on which the housewife can draw specifically for the provision of her own domestic group (Spencer 1959:149). After the herds have passed by, domestic autonomy is re-established, and each household draws on its own reserves of stored food.

But what happens at the opposite extreme, not under conditions of abundance, but when everyone’s stocks run out? Ingold claims that in times of famine, the obligation to share what little food one has with one’s neighbors is also invoked:

We find, therefore, that the incidence of generalized reciprocity tends to peak towards the two extremes of scarcity and abundance… The communal feast that follows a successful hunting drive involves the same heightening of band solidarity, and calls into play the same functions of leadership in the apportionment of food, as does the consumption of famine rations.

I am reminded here of a scene in The Harmless People in which there was not enough food to go around, but the rules of distribution were still followed, each person just cutting their piece smaller. Thomas described one of the small children, hungry, trying to grab the food bowl–not the food itself–to stop their mother from giving away their food to the next person in the chain of obligation.

Here Ingold pauses to discuss a claim by Sahlins that such social order will (or should) break down under conditions of extreme hunger:

Probably every primitive organization has its breaking-point, or at least its turning-point. Every one might see the time when co-operation is overwhelmed by the scale of disaster and chicanery becomes the order of the day. The range of assistance contracts progressively to the family level; perhaps even these bonds dissolve and, washed away, reveal an inhuman, yet most human, self-interest. Moreover, by the same measure that the circle of charity is compressed, that of ‘negative reciprocity’ is potentially expanded. People who helped each other in normal times and through the first stages of disaster display now an indifference to each others’ plight, if they do not exacerbate a mutual downfall by guile, haggle and theft.

Ingold responds:

I can find no evidence, either in my reading of circumpolar ethnography, or in the material cited by Sahlins, for the existence of such a ‘turning-point’ in hunting societies. On the contrary, as the crisis deepens, generalized reciprocity proceeds to the point of dissolution of domestic group boundaries. ‘Negative reciprocity’, rather than closing in from beyond the frontiers of the household, will be expelled altogether from the wider social field, only to make its appearance within the heart of the domestic group itself.

Thus the women of the household, who are allowed to eat only after the appetites of their menfolk have been satisfied, may be left in times of want with the merest scraps of food. Among the Chipewyan, ‘when real distress approaches, many of them are permitted to starve, when the males are amply provided for’…

In situations of economic collapse, negative reciprocity afflicts not only the domestic relations between husband and wife, but those between mother and child, and between parent and grandparent. If the suckling of children is the purest expression of generalized reciprocity, in the form of a sustained one-way flow, then infanticide must surely represent the negative extreme. Likewise, old or sick members of the household will be the first to be abandoned when provisions run short. Even in normal times, individuals who are past labour have to scavenge the left-overs of food and skins (Hearne 1911:326). In the most dire circumstances of all, men will consume their starving wives and children before turning upon one another.

Drawing on Eskimo material, Hoebel derives the following precepts of cannibal conduct: Not unusually . . . parents kill their own children to be eaten. This act is no different from infanticide. A man may kill and eat his wife; it is his privilege. Killing and eating a relative will produce no legal consequences. It is to be presumed, however, that killing a non-relative for food is murder. (1941:672, cited in Eidlitz 1969:132)

In short, the ‘circle of charity’ is not compressed but inverted: as the threat of starvation becomes a reality, the legitimacy of killing increases towards the centre. The act is ‘inhuman’ since it strips the humanity of the victim to its organic, corporeal substance. If altruism is an index of sociability, then its absolute negation annuls the sodality of the recipient: persons, be they human or animal, become things.

This is gruesome, but let us assume it is true. (I have not read the accounts Ingold cites, so I must take him at his word; I do not always trust him, but for now we will.)

The cold, hard logic of infanticide is that a mother can produce more children if she loses one, but a child who has lost its mother will likely die as well, along with all of its siblings. One of my great-great grandmothers suffered the loss of half her children in infancy and still managed to raise 5+ to adulthood. Look around: even with abortion and birth control widely available, humanity is not suffering a lack of children. ETA: As BaruchK correctly noted, today’s children are largely coming from people who don’t use birth control or have legal access to abortion; fertility rates are below replacement throughout the West, with the one exception AFAIK of Israel.

Furthermore, children starve faster and are easier to kill than parents; women are easier to kill than men; people who live with you are easier to kill than people who don’t.

Before we condemn these people, let us remember that famine is a truly awful, torturous way to die, and that people who are on the brink of starving to death are not right in their minds. As “They’re not human”: How 19th-century Inuit coped with a real-life invasion of the Walking Dead recounts:

“Finally, as the footsteps stopped just outside the igloo, it was the old man who went out to investigate.

“He emerged to see a disoriented figure seemingly unaware of his presence. The being was touching the outside of the igloo with curiosity, and raised no protest when the old man reached his hand out to touch its cheek.

“His skin was cold. …

The figures, of course, were the last survivors of the Franklin Expedition. They had buried their captain. They had seen their ship entombed by ice. They had eaten the dead to survive. …

Inuit nomads had come across streams of men that “didn’t seem to be right.” Maddened by scurvy, botulism or desperation, they were raving in a language the Inuit couldn’t understand. In one case, hunters came across two Franklin Expedition survivors who had been sleeping for days in the hollowed-out corpses of seals. …

The figures were too weak to be dangerous, so Inuit women tried to comfort the strangers by inviting them into their igloo. …

The men spit out pieces of cooked seal offered to them. They rejected offers of soup. They grabbed jealous hold of their belongings when the Inuit offered to trade.

When the Inuit men returned to the camp from their hunt, they constructed an igloo for the strangers, built them a fire and even outfitted the shelter with three whole seals. …

When a small party went back to the camp to retrieve [some items], they found an igloo filled with corpses.

The seals were untouched. Instead, the men had eaten each other. …

In 1854, Rae had just come back from a return trip to the Arctic, where he had been horrified to discover that many of his original Inuit sources had fallen to the same fates they had witnessed in the Franklin Expedition.

An outbreak of influenza had swept the area, likely sparked by the wave of Franklin searchers combing the Arctic. As social mores broke down, food ran short.

Inuit men that Rae had known personally had chosen suicide over watching the slow death of their children. Families had starved for days before eating their dog teams. Some women, who had seen their families die around them, had needed to turn to the “last resource” to survive the winter.

Infanticide, cannibalism, and human sacrifice were far more common prior to 1980 or so than we like to think; God forbid we should ever know such fates.

According to Wikipedia:

“Many Neolithic groups routinely resorted to infanticide … Joseph Birdsell believed that infanticide rates in prehistoric times were between 15% and 50% of the total number of births,[10] while Laila Williamson estimated a lower rate ranging from 15% to 20%.[6]:66 Comparative anthropologists have calculated that 50% of female newborn babies were killed by their parents during the Paleolithic era.[12] Decapitated skeletons of hominid children have been found with evidence of cannibalism.[13]

“Three thousand bones of young children, with evidence of sacrificial rituals, have been found in Sardinia. Pelasgians offered a sacrifice of every tenth child during difficult times. Syrians sacrificed children to Jupiter and Juno. Many remains of children have been found in Gezer excavations with signs of sacrifice. Child skeletons with the marks of sacrifice have been found also in Egypt dating 950-720 BCE. In Carthage “[child] sacrifice in the ancient world reached its infamous zenith.”[11]:324  …

“According to Shelby Brown, Carthaginians, descendants of the Phoenicians, sacrificed infants to their gods.[25] Charred bones of hundreds of infants have been found in Carthaginian archaeological sites. One such area harbored as many as 20,000 burial urns.[25]

“Plutarch (c. 46–120 AD) mentions the practice, as do Tertullian, Orosius, Diodorus Siculus and Philo. The Hebrew Bible also mentions what appears to be child sacrifice practiced at a place called the Tophet (from the Hebrew taph or toph, to burn) by the Canaanites. Writing in the 3rd century BCE, Kleitarchos, one of the historians of Alexander the Great, described that the infants rolled into the flaming pit. Diodorus Siculus wrote that babies were roasted to death inside the burning pit of the god Baal Hamon, a bronze statue.

“… the exposure of newborns was widely practiced in ancient Greece, it was even advocated by Aristotle in the case of congenital deformity — “As to the exposure of children, let there be a law that no deformed child shall live.”[30]

“The practice was prevalent in ancient Rome, as well. … A letter from a Roman citizen to his sister, or a pregnant wife from her husband,[35] dating from 1 BC, demonstrates the casual nature with which infanticide was often viewed:

“I am still in Alexandria. … I beg and plead with you to take care of our little child, and as soon as we receive wages, I will send them to you. In the meantime, if (good fortune to you!) you give birth, if it is a boy, let it live; if it is a girl, expose it.” [36][37]

“In some periods of Roman history it was traditional for a newborn to be brought to the pater familias, the family patriarch, who would then decide whether the child was to be kept and raised, or left to die by exposure.[39] The Twelve Tables of Roman law obliged him to put to death a child that was visibly deformed. …

“According to William L. Langer, exposure in the Middle Ages “was practiced on gigantic scale with absolute impunity, noticed by writers with most frigid indifference”.[47]:355–356 At the end of the 12th century, notes Richard Trexler, Roman women threw their newborns into the Tiber river in daylight.[48]” …

“Philosopher Han Fei Tzu, a member of the ruling aristocracy of the 3rd century BC, who developed a school of law, wrote: “As to children, a father and mother when they produce a boy congratulate one another, but when they produce a girl they put it to death.”[63]

“Buddhist belief in transmigration allowed poor residents of the country to kill their newborn children if they felt unable to care for them, hoping that they would be reborn in better circumstances. Furthermore, some Chinese did not consider newborn children fully “human”, and saw “life” beginning at some point after the sixth month after birth.[65]

“Contemporary writers from the Song dynasty note that, in Hubei and Fujian provinces, residents would only keep three sons and two daughters (among poor farmers, two sons and one daughter), and kill all babies beyond that number at birth.[66]”

Sex Ratio at birth in the People's Republic of China

“It was not uncommon that parents threw a child to the sharks in the Ganges River as a sacrificial offering. The British colonists were unable to outlaw the custom until the beginnings of the 19th century.[82]:78

“According to social activists, female infanticide has remained a problem in India into the 21st century, with both NGOs and the government conducting awareness campaigns to combat it.[83] …

“In the Eastern Shoshone there was a scarcity of Indian women as a result of female infanticide.[100] For the Maidu Native Americans twins were so dangerous that they not only killed them, but the mother as well.[101] In the region known today as southern Texas, the Mariame Indians practiced infanticide of females on a large scale. Wives had to be obtained from neighboring groups.[102]

Meanwhile in the Americas:

In 2005 a mass grave of one- to two-year-old sacrificed children was found in the Maya region of Comalcalco. The sacrifices were apparently performed for consecration purposes when building temples at the Comalcalco acropolis.[2] …

Archaeologists have found the remains of 42 children sacrificed to Tlaloc (and a few to Ehecátl Quetzalcóatl) in the offerings of the Great Pyramid of Tenochtitlan. In every case, the 42 children, mostly males aged around six, were suffering from serious cavities, abscesses or bone infections that would have been painful enough to make them cry continually. Tlaloc required the tears of the young so their tears would wet the earth. As a result, if children did not cry, the priests would sometimes tear off the children’s nails before the ritual sacrifice.[7]

And don’t get me started on cannibalism.

James Cook witnessing human sacrifice in Tahiti

It is perhaps more profitable to ask which cultures didn’t practice some form of infanticide/infant sacrifice/cannibalism than which ones did. The major cases Wikipedia notes are Ancient Egypt, Judaism, Christianity, and Islam (we may note that Judaism in many ways derived from ancient Egypt, and Christianity and Islam from Judaism.) Ancient Egypt stands out as unique among the major pre-modern, pre-monotheistic societies in showing no signs of regular infanticide–and even in the most infamous case, where the Egyptian pharaoh went so far as to order the shocking act, we find direct disobedience in his own household:

3 And when she [Jochebed] could not longer hide him [the baby], she took for him an ark of bulrushes, and daubed it with slime and with pitch, and put the child therein; and she laid it in the flags by the river’s brink.

4 And his sister stood afar off, to wit what would be done to him.

5 And the daughter of Pharaoh came down to wash herself at the river; and her maidens walked along by the river’s side; and when she saw the ark among the flags, she sent her maid to fetch it.

6 And when she had opened it, she saw the child: and, behold, the babe wept. And she had compassion on him, and said, “This is one of the Hebrews’ children.”

7 Then said his sister to Pharaoh’s daughter, “Shall I go and call to thee a nurse of the Hebrew women, that she may nurse the child for thee?”

8 And Pharaoh’s daughter said to her, “Go.” And the maid went and called the child’s mother.

9 And Pharaoh’s daughter said unto her, “Take this child away, and nurse it for me, and I will give thee thy wages.” And the woman took the child, and nursed it.

10 And the child grew, and she brought him unto Pharaoh’s daughter, and he became her son. And she called his name Moses: and she said, “Because I drew him out of the water.”

–Exodus 2:3-10

I don’t know the actual infanticide numbers in modern Muslim countries (le wik notes that poverty in places like Pakistan still drives infanticide) but it is officially forbidden by Islam.

According to Abortions in America:
• Black women are five times more likely to abort than white women.
• 69% of pregnancies among Blacks are unintended, while that number is 54% among Hispanics and 40% of pregnancies among Whites.
• Planned Parenthood, … has located 80% of its abortion clinics in minority neighborhoods

Today, between the spread of Abrahamic religions, Western Values, and general prosperity, the infanticide rate has been cut and human sacrifice and cannibalism have been all but eliminated. Abortion, though, is legal–if highly controversial–throughout the West and Israel.

According to the CDC, the abortion rate for 2013 was 200 abortions per 1,000 live births, or about 15% of pregnancies. (The CDC also notes that the abortion rate has been falling since at least 2004.) Of these, “91.6% of abortions were performed at ≤13 weeks’ gestation; … In 2013, 22.2% of all abortions were early medical abortions.”

To what can we attribute this anti-infanticide sentiment of modern monotheistic societies? Is it just a cultural accident, a result of inheritance from ancient Egypt, or perhaps the lucky effects of some random early theologian? Or as the religious would suggest, due to God’s divine decree? Or is it an effect of the efforts parents must expend on their few children in societies where children must attend years of school in order to succeed?

According to Wikipedia:

In ecology, r/K selection theory relates to the selection of combinations of traits in an organism that trade off between quantity and quality of offspring. The focus upon either increased quantity of offspring at the expense of individual parental investment of r-strategists, or reduced quantity of offspring with a corresponding increased parental investment of K-strategists, varies widely, seemingly to promote success in particular environments. …

In r/K selection theory, selective pressures are hypothesised to drive evolution in one of two generalized directions: r– or K-selection.[1] These terms, r and K, are drawn from standard ecological algebra as illustrated in the simplified Verhulst model of population dynamics:[7]

dN/dt = rN(1 − N/K)

where r is the maximum growth rate of the population (N), K is the carrying capacity of its local environmental setting, and the notation dN/dt stands for the derivative of N with respect to t (time). Thus, the equation relates the rate of change of the population N to the current population size and expresses the effect of the two parameters. …

As the name implies, r-selected species are those that place an emphasis on a high growth rate, and, typically exploit less-crowded ecological niches, and produce many offspring, each of which has a relatively low probability of surviving to adulthood (i.e., high r, low K).[8] A typical r species is the dandelion Taraxacum genus.

In unstable or unpredictable environments, r-selection predominates due to the ability to reproduce quickly. There is little advantage in adaptations that permit successful competition with other organisms, because the environment is likely to change again. Among the traits that are thought to characterize r-selection are high fecundity, small body size, early maturity onset, short generation time, and the ability to disperse offspring widely. …

By contrast, K-selected species display traits associated with living at densities close to carrying capacity, and typically are strong competitors in such crowded niches that invest more heavily in fewer offspring, each of which has a relatively high probability of surviving to adulthood (i.e., low r, high K). In scientific literature, r-selected species are occasionally referred to as “opportunistic” whereas K-selected species are described as “equilibrium”.[8]

In stable or predictable environments, K-selection predominates as the ability to compete successfully for limited resources is crucial and populations of K-selected organisms typically are very constant in number and close to the maximum that the environment can bear (unlike r-selected populations, where population sizes can change much more rapidly).

Traits that are thought to be characteristic of K-selection include large body size, long life expectancy, and the production of fewer offspring, which often require extensive parental care until they mature.
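If you want to play with the Verhulst model quoted above, here is a quick Python sketch of dN/dt = rN(1 − N/K), integrated with simple Euler steps. (The r, K, and starting-population values are purely illustrative, not taken from any real species.)

```python
def logistic_growth(n0, r, k, dt=0.01, steps=3000):
    """Integrate dN/dt = r*N*(1 - N/K) with simple Euler steps."""
    n = n0
    for _ in range(steps):
        n += r * n * (1 - n / k) * dt
    return n

# A small population grows quickly at first (the r part),
# then levels off near the carrying capacity (the K part).
print(logistic_growth(n0=10, r=0.5, k=1000))  # approaches 1000
```

Note that r only matters much while the population is far below K; near carrying capacity, the (1 − N/K) term dominates, which is the r-strategist vs. K-strategist distinction in miniature.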

Of course you are probably already aware of Rushton’s r/K theory of human cultures:

Rushton’s book Race, Evolution, and Behavior (1995) uses r/K selection theory to explain how East Asians consistently average high, blacks low, and whites in the middle on an evolutionary scale of characteristics indicative of nurturing behavior. He first published this theory in 1984. Rushton argues that East Asians and their descendants average a larger brain size, greater intelligence, more sexual restraint, slower rates of maturation, and greater law abidingness and social organization than do Europeans and their descendants, who average higher scores on these dimensions than Africans and their descendants. He theorizes that r/K selection theory explains these differences.

I’d be remiss if I didn’t also mention that the article states, “Rushton’s application of r/K selection theory to explain differences among racial groups has been widely criticised. One of his many critics is the evolutionary biologist Joseph L. Graves, who has done extensive testing of the r/K selection theory with species of Drosophila flies. …”

Whether the cause is genetics or culture, in dense human societies people must devote a great deal of energy to the small number of children they can successfully raise, leading to the notion that parents are morally required to put this effort into their children. But this system is at odds with the fact that, without some form of intervention, the average married couple will produce far more than two offspring.

Ultimately, I don’t have answers, only theories.

Source: CDC data, I believe

Creationism, Evolutionism, and Categories

I’ve been thinking about the progression of ideas about natural categories, such as “men” and “women,” “cows” and “mules,” “English” and “Polynesian.” Not exactly our high philosophical progression, but a somewhat commoner one.

It seems that 100 years or so ago, most people would have explained the differences between things with a simple, “Because God wanted them to be that way.” And if God wants it that way, then the way they are is good and you should leave them alone.

I have heard my [sibling] wax practically poetic about the way God made mules and horses for farm work, and why you should not yoke together an ox and a donkey. (One of the interesting parts of meeting my siblings for the first time as an adult was realizing that dorkiness is genetic.)

The evolutionary perspective is that evolution created things (or, as we like to call it around here, GNON, the God of Nature and Nature’s God.) Gnon and God are functionally rather similar, for Gnon also made things in natural categories, and while we may refrain from deeming them “good” in quite the same way as religious people, we certainly believe that each group’s features serve purposes that have helped members of that group survive where others did not.

The conservative creationist denies the role of evolution, but he does not deny that categories exist. He merely disputes their method of creation. To quote Answers in Genesis:

So, a good rule of thumb is that if two things can breed together, then they are of the same created kind. …

As an example, dogs can easily breed with one another, whether wolves, dingoes, coyotes, or domestic dogs. When dogs breed together, you get dogs; so there is a dog kind. It works the same with chickens. There are several breeds of chickens, but chickens breed with each other and you still get chickens. So, there is a chicken kind. The concept is fairly easy to understand.

But in today’s culture, where evolution and millions of years are taught as fact, many have been led to believe that animals and plants (that are classed as a specific “species”) have been like this for tens of thousands of years and perhaps millions of years. So, when they see things like lions or zebras, they think they have been like this for an extremely long time.

From a biblical perspective, though, land animals like wolves, zebras, sheep, lions, and so on have at least two ancestors that lived on Noah’s Ark, only about 4,300 years ago. These animals have undergone many changes since that time. But dogs are still part of the dog kind, cats are still part of the cat kind, and so on. God placed variety within the original kinds, and other variation has occurred since the Fall due to genetic alterations.

For all that people accuse the Answers in Genesis folks of being crazy, and for all that they are trying awfully hard to re-invent the wheel, this is an unobjectionable approach to species and hybridization.

By contrast, the liberal creationist, since she cannot fall back on God in her rejection of Gnon, asserts that the categories themselves do not exist. “Race is a social construct. Gender is a social construct.” etc. Dr. Zuleyka Zevallos, who is definitely not a crazy Creationist with no respect for science, writes:

When people talk about the differences between men and women they are often drawing on sex – on rigid ideas of biology – rather than gender, which is an understanding of how society shapes our understanding of those biological categories.

Gender is more fluid – it may or may not depend upon biological traits. [bold mine.] More specifically, it is a concept that describes how societies determine and manage sex categories; the cultural meanings attached to men and women’s roles; and how individuals understand their identities including, but not limited to, being a man, woman, transgender, intersex, gender queer and other gender positions. …

The sociology of gender examines how society influences our understandings and perception of differences between masculinity (what society deems appropriate behaviour for a “man”) and femininity (what society deems appropriate behaviour for a “woman”). We examine how this, in turn, influences identity and social practices. We pay special focus on the power relationships that follow from the established gender order in a given society, as well as how this changes over time.

And the New York Times writes:

Race is not biological. It is a social construct. There is no gene or cluster of genes common to all blacks or all whites. Were race “real” in the genetic sense, racial classifications for individuals would remain constant across boundaries. Yet, a person who could be categorized as black in the United States might be considered white in Brazil or colored in South Africa.

Answers in Genesis understands genetics better than the New York Times or people with doctorates from actual universities. That is pretty damn pathetic.

crayon map of racial distribution. Not guaranteed correct

Of course, some of our ideas about what “men” and “women” or “blacks” and “whites” are like are cultural (especially any that involve technology, since technology has changed radically over the past 100 years.) As an amateur anthropologist, I am quite aware that different cultures have different ideas on these subjects. This does not negate the fact that “maleness” and “femaleness” are basically biologically driven. Female interest in babies and male interest in violence have their roots in biology, not culture. Genetics have a huge effect on personality. Likewise, races are absolutely real, biological categories, which no doctor attempting an organ transplant can afford to ignore.

The idea that races don’t exist in some kind of genetic way is absurd. Let’s just take the EDAR gene:

Ectodysplasin A receptor (EDAR) is a protein that in humans is encoded by the EDAR gene. EDAR is a cell surface receptor for ectodysplasin A which plays an important role in the development of ectodermal tissues such as the skin.[3][4][5] …

A derived G-allele point mutation (SNP) with pleiotropic effects in EDAR, 370A or rs3827760, found in most modern East Asians and Native Americans but not common in African or European populations, is thought to be one of the key genes responsible for a number of differences between these populations, including the thicker hair, more numerous sweat glands, smaller breasts, and dentition characteristic of East Asians.[7]… The 370A mutation arose in humans approximately 30,000 years ago, and now is found in 93% of Han Chinese and in the majority of people in nearby Asian populations. The derived G-allele is a mutation of the ancestral A-allele, the version found in most modern non-East Asian and non-Native American populations.

Most East Asians and Native Americans (that is, the greater Asian Race,) have the G-allele of EDAR. Most non-Asians have the A-allele.

World map of Y-DNA Haplotypes

If you don’t have some form of causality to explain how the world’s variation came to exist, I guess you fall back on “it’s totally random and meaningless.”

Noah’s Twitter Deluge

To be alive today is to drown in data…

Noah’s Ark by Edward Hicks, 1846

Now the earth was corrupt in GNON’s sight and was full of violence. So GNON said to Noah, “I am going to put an end to all people, for the earth is filled with violence because of them. So make yourself an ark of cypress wood; make rooms in it and coat it with pitch inside and out. Make a roof for it, leaving below the roof an opening one cubit high all around. Put a door in the side of the ark and make lower, middle and upper decks, and make it immune to Twitter, Facebook, and cable TV.  I am going to bring a deluge of information, unending news, tweets, and endless status updates on the earth to distract all life under the heavens, every creature that has the breath of life in it, until they fade from existence.

…modernity is selecting for those who resist modernity.

Why are Mammals Brown? (pt. 2)

Rainbow leaf beetle

As I was saying in part 1, compared to colorful fish, lizards, birds, and even ladybugs, we mammals are downright drab. Blue and purple fur are non-existent because these colors are difficult to produce as pigments, and so most animals with these colors produce them structurally rather than chemically, but hair is not a good medium for structural color. We are limited to pigments.

But this only explains blue and purple. Why are there so few mammals with bright red, pink, orange, or green fur? Wouldn’t green offer convenient camouflage for tree-dwelling sloths or lemurs? So on to the second reason we’re drab:

2. Compared to other animals, mammals have bad color perception.

For example, according to the guy who writes The Oatmeal, which is totally a reputable scientific source, dogs can only see two colors, blue and green. Humans can see three colors–green, blue, and red–which we combine to make the rest of the colors we see. Butterflies, non-mammals, can perceive 5 colors–we have no idea what that actually means, since we can’t even imagine the colors they see. And the mantis shrimp perceives an incredible 16 different colors.

The majority of mammals run closer to dogs than humans in color-perception.

But this only inspires a new question: why do we have bad eyesight?

The original mammals were small, shrew-like creatures that tried to avoid being eaten by dinosaurs back in the Triassic, about 200 million years ago.

Read the full comic over at The Oatmeal

Lizards, being mostly cold-blooded, are forced to be active primarily during the day, when it’s warm. Our warm-blooded ancestors therefore probably found it easy to avoid reptilian predators by doing their hunting and foraging at night.

According to Wikipedia:

The nocturnal bottleneck hypothesis is a hypothesis to explain several mammal traits. The hypothesis states that mammals were mainly or even exclusively nocturnal through most of their evolutionary history, starting with their origin 225 million years ago, and only ending with the demise of the dinosaurs 65 million years ago. While some mammal groups have later evolved to fill diurnal niches, the 160 million years spent as nocturnal animals has left a lasting legacy on basal anatomy and physiology, and most mammals are still nocturnal.[1]

Between the nocturnal and the crepuscular, most mammals are only awake at times when color isn’t particularly relevant. Most mammals, therefore, have evolved eyes that aren’t very good at perceiving color, in order to optimize for seeing in dim light.

We have more rods, which perceive light; diurnal animals have more cones, which perceive colors.

Animals use their colors for three main purposes: to signal to each other, to hide, and to signal to predators.

Since most mammals can’t see many colors, even if they had a peacock’s spots, they couldn’t use them for mate selection. Few mammals (if any) are poisonous, so we don’t have the poison dart frog’s use for bright color. And you might want to be green to blend in with the trees during the day, but at night, trees are dark.

In short, we are optimized for the dark.

So even though we humans like being awake during the day, we’re unlikely to trade in our drab pelts for the macaw’s rainbow hues anytime soon.


Why are Mammals Brown? (pt. 1)

We don’t naturally look like this

Compared to colorful fish, lizards, birds, and even ladybugs, we mammals are downright drab. I see no particular environmental reason for this–plenty of mammals live in areas with trees or grass where green fur or spots might help them blend in, or have such striking patterns–like a zebra–that I hardly think a blue stripe would result in more lion attacks.

I think there are two main reasons mammals are mostly brown, instead of showing the vibrant colors of other species:

1. Some colors are difficult to produce.

Blue, for example. Walk into the forest or a meadow on an average day, and you’ll see a lot of green. Anything not green is likely brown. Outside a garden, there are very few naturally blue or purple plants.

This guy, however, does

It’s no coincidence that early human art uses colors that could be easily produced from the natural environment, like brown, black (charcoal), and yellow. By the Roman era, we could produce purple dye, but it was so hard to obtain from such rare sources (shells) that it was prohibitively expensive for mere mortals–hence the name “royal purple.” The European tradition of painting the Virgin Mary’s cloak blue also hails from the days when blue pigments were expensive, and thus a sign of exalted status.

A purple dye cheap enough for average people to buy and wear wasn’t invented until 1856, by William Henry Perkin.

I’m not sure exactly why blue and purple are so hard to produce, but I think it’s because light toward the violet end of the spectrum is higher energy than light toward the red end. As Bulina et al state:

Pigments in nature play important roles ranging from camouflage coloration and sunscreen to visual reception and participation in biochemical pathways. Considering the spectral diversity of pigment-based coloration in animals one can conclude that blue pigments occur relatively rare (as a rule blue coloration results from light diffraction or scattering rather than the presence of a blue pigment). At least partially this fact is explained by an inevitably more complex structure of blue pigments compared to yellow-reds. To appear blue a compound must contain an extended and usually highly polarized system of the conjugated π-electrons.

Okay… So, because blue and purple are more energetic, they require molecules that have more double bonds and are less common in nature. (Why double bonds are less common is a matter I’ll leave for a chemistry discussion.)

You’re probably used to thinking of color as an inherent property of the objects around you–that a green leaf is green, or a red bucket is red, in the same way that the leaf and bucket have a particular mass and are made of their particular atoms.

Low energy to the left, high to the right

But turn off the lights, and suddenly color goes away. (Mass doesn’t.)

The colors we see are created by light “bouncing” (really, being absorbed and then re-emitted) off objects. Within the visible spectrum, red light requires the least energy to produce (because it has the longest wavelength), and violet takes the most energy.
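To put rough numbers on that, here is a back-of-the-envelope Python check using the photon energy formula E = hc/λ. (The wavelengths are typical values for red and violet, not exact band edges.)

```python
H = 6.626e-34  # Planck's constant, J*s
C = 2.998e8    # speed of light, m/s

def photon_energy_ev(wavelength_nm):
    """Photon energy in electron-volts for a given wavelength in nm."""
    joules = H * C / (wavelength_nm * 1e-9)
    return joules / 1.602e-19  # convert joules to eV

print(photon_energy_ev(700))  # red: ~1.8 eV
print(photon_energy_ev(400))  # violet: ~3.1 eV
```

A violet photon carries roughly 75% more energy than a red one.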

But nature, being creative, has come up with an alternative way to produce blues and purples that doesn’t depend on electron energy levels: structure.

Unless you are a color scientist, you are probably accustomed to dealing with chemical colors. For example, if you take a handful of blue pigment powder, mix it with water, paint it onto a chair, let it dry, then scrape it off the chair, and grind it back into powder, you expect it to remain blue at all stages in the process (except if you get a bit of chair mixed in with it).

Blue Morpho butterfly

By contrast, if you scraped the scales off a blue morpho butterfly’s wings, you’d just end up with a pile of grey dust and a sad butterfly. By themselves, blue morpho scales are not “blue,” even under regular light. Rather, their scales are arranged so that light bounces between them, like light bouncing from molecule to molecule in the air. Or as Ask Nature puts it:

Many types of butterflies use light-interacting structures on their wing scales to produce color. The cuticle on the scales of these butterflies’ wings is composed of nano- and microscale, transparent, chitin-and-air layered structures. Rather than absorb and reflect certain light wavelengths as pigments and dyes do, these multi scale structures cause light that hits the surface of the wing to diffract and interfere.
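The interference mechanism in the quote above can be illustrated with a very simplified thin-film calculation: for a stack of layers of refractive index n and thickness d, reflections reinforce wavelengths satisfying 2nd = mλ (normal incidence, ignoring phase shifts at the interfaces). The chitin index and layer thickness below are ballpark figures, not measurements from any particular butterfly:

```python
CHITIN_N = 1.56  # approximate refractive index of chitin

def reinforced_wavelengths_nm(layer_thickness_nm, max_order=3):
    """Wavelengths (nm) constructively reflected by thin layers,
    from the simplified condition 2 * n * d = m * wavelength."""
    return [2 * CHITIN_N * layer_thickness_nm / m
            for m in range(1, max_order + 1)]

# Layers ~150 nm thick reinforce light near 468 nm, i.e. blue,
# at first order; higher orders fall into the ultraviolet.
print(reinforced_wavelengths_nm(150))
```

Change the layer spacing and the reinforced color shifts, which is why the color vanishes when the scales are ground up: the structure, not any molecule, is doing the work.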

The same process is at work in the peacock’s plumage and bluebird’s blue:

Male eastern bluebird

Soft condensed matter physics has been particularly useful in understanding the production of the amorphous nanostructures that imbue the feathers of certain bird species with intensely vibrant hues. The blue color of the male Eastern bluebird (Sialia sialis), for example, is produced by the selective scattering of blue light from a complex nanostructure of b-keratin channels and air pockets in the hairlike branches called feather barbs that give the quill its lift. The size of the air pockets determines the wavelengths that are selectively amplified.

When the bluebird’s feathers are developing, feather barb cells known as medullary keratinocytes expand to their boxy final shape and deposit solid keratin around the periphery of the cell—essentially turning the walled-in cells into soups of ß-keratin suspended in cytoplasm. Next, b-keratin filaments free in the cytoplasm start to bind to each other to form larger bundles. As these filaments become less water-soluble, they begin to come out of solution—a process known as phase separation—ultimately forming solid bars that surround twisted channels of cytoplasm. These nanoscale channels of keratin remain in place after the cytoplasm dries out and the cell dies, resulting in the nanostructures observed in the feathers of mature adults.

“The bluebird doesn’t lay down a squiggly architecture and then put the array of the protein molecules on top of it,” Prum explains. “It lets phase separation, the same process that would occur in oil and vinegar unmixing, create this spatial structure itself.”

The point at which the phase separation halts determines the color each feather produces.

Decades old Pollia fruit retains its structural brilliance

This kind of structural color works great if your medium is scales, feathers, carapaces, berries, or even CDs, but just doesn’t work with hair, which we mammals have. Unlike the carefully hooked together structure of a feather or the details of a butterfly’s scales, hair moves. It shakes. It would have to be essentially solid to create structural color, and it’s not.

So for the most part, bright colors like green, blue, and purple are expensive, energy-wise, to produce chemically, and mammals don’t have the option birds, fish, lizards, and insects have of producing them structurally.

To be continued…

YES Two Out of Africa Events! (Also, Aborigines)

I’ve long suspected (given the archaeological evidence, like 80,000-year-old human remains in China) that there were two Out of Africa (OOA) events–an early one that headed east, toward Australia, and a later one that headed everywhere (including Australia)–and now it looks like this has been genetically confirmed:

Graphic created by the Estonian genetics team cited in the NY Times article. Their full article: Genomic analyses inform on migration events during the peopling of Eurasia

Isn’t this a great graphic? My hat’s off to the Estonians. Beautiful work.

Graphic created by the Estonian genetics team cited in the NY Times article. Their full article: Genomic analyses inform on migration events during the peopling of Eurasia

Here’s another one they made (sadly small) with less color and more detail on the Eurasian lines. (IIRC, Chinese have more Neanderthal ancestry than Europeans, so technically the schematic ought to be a wee bit more complicated than this, but it’s already complicated enough and this is a solid general overview.)

It might just be the sleep dep + lots of coffee talking, but I am so excited about this.

Some quotes from the NY Times article:

In Israel, for example, researchers found a few distinctively modern human skeletons that are between 120,000 and 90,000 years old. In Saudi Arabia and India, sophisticated tools date back as far as 100,000 years.

Last October, Chinese scientists reported finding teeth belonging to Homo sapiens that are at least 80,000 years old and perhaps as old as 120,000 years. …

Examining their data separately, all three groups came to the same conclusion: People everywhere descend from a single migration of early humans from Africa. The estimates from the studies point to an exodus somewhere between 80,000 and 50,000 years. …

In Papua New Guinea, Dr. Metspalu and his colleagues found, 98 percent of each person’s DNA can be traced to that single migration from Africa. But the other 2 percent seemed to be much older.

Dr. Metspalu concluded that all people in Papua New Guinea carry a trace of DNA from an earlier wave of Africans who left the continent as long as 140,000 years ago, and then vanished.

Obviously, in science, replication and caution are key. Don’t get too excited. These results might not hold up–sometimes samples get contaminated or data gets coded incorrectly, and we end up with conclusions that are completely wrong. And, okay, this isn’t really “huge” in the grand scheme of things–we’re only talking about 2% of Papuans’ ancestors, not, like, 40% of them. But it does explain all of those anomalously old findings.

Now someone needs to explain the Red Deer Cave People:

The Red Deer Cave People were the most recently known prehistoric Hominin population that did not look like modern humans. Fossils dated to between 14,500 and 11,500 years old were found in Red Deer Cave and Longlin Cave in China. Having a mix of archaic and modern features, they are (tentatively) thought to be a separate species of humans that persisted until recent times and became extinct without contributing to the gene pool of modern humans.[1]

On a related note, we have some awesome news about Aborigine DNA/language trees: A genomic history of Australia and Why Australia is home to one of the Largest Language Families in the World. (Well duh it’s because Aborigines spent thousands of years as tiny bands of hunter gatherers, in which each isolated band started developing its own language.) These articles have an oddly inverted structure (burying the lead, I guess), so let’s rearrange the abstract for coherency:

We estimate that Aboriginal Australians and Papuans diverged from Eurasians 51–72 kya, following a single out-of-Africa dispersal, and subsequently admixed with archaic populations. … Papuan and Aboriginal Australian ancestors diversified 25–40 thousand years ago (kya), suggesting pre-Holocene population structure in the ancient continent of Sahul (Australia, New Guinea and Tasmania). However, all of the studied Aboriginal Australians descend from a single founding population that differentiated ~10–32 kya. We infer a population expansion in northeast Australia during the Holocene epoch (past 10,000 years) associated with limited gene flow from this region to the rest of Australia, consistent with the spread of the Pama–Nyungan languages.

(kya = thousand years ago). So about 10-32 thousand years ago, one group of Australians conquered all of the other groups of Australians.

The science article notes:

To the researchers’ amazement, the genetic pattern mirrored the linguistic one. “It’s incredible that those two trees match. None of us expected that,” says paleoanthropologist Michael Westaway of Griffith University, Nathan, in Australia, a co-author on the Willerslev paper. “But it’s confusing: The [genetic splits] date to 30,000 years ago or more but the linguistic divisions are only maybe 6000 years old.”

Willerslev says he first thought the languages must be much older than thought. “But the linguists told me, ‘no way.'”

Both types of data also show that the population expanded from the northeast to the southwest. This migration occurred within the last 10,000 years and likely came in successive waves, Bowern says, in which existing languages were overlaid by new ones. This expansion also seems to correspond with a stone tool innovation called a backed edge blade. But the accompanying gene flow was just a trickle, suggesting that only a few people had an outsize cultural impact, Willerslev says. “It’s like you had two men entering a village, convincing everyone to speak a new language and adopt new tools, having a little sexual interaction, then disappearing,” he says. Then the new languages continued to develop, following the older patterns of population separation. “It’s really strange but it’s the best way we can interpret the data at this stage.”

Three things are going on here. 1. The group from the north conquered the group from the south, raped their women, and imposed their language. They were able to do this because they had better weapons (“backed edge blades”). But the group from the north was not very big, and so did not leave a very big genetic signature.

2. They conquered an existing population structure, at which point their language got absorbed into that structure, probably picking up some linguistic substrate from the groups’ previous languages along the way. Since most people learn language from their parents, it’s not too surprising to find cases where language and genetics line up. (Note that people do not always learn languages from their parents.)

3. Intellectuals are kind of naive.

The other really interesting thing here is that the linguistics team came to their conclusions by feeding a big database of Aboriginal words into a computer and having it run algorithms similar to the ones geneticists use for examining human ancestry (see the lovely graphics above). I’ve been wondering for a long time why they don’t just do this, and am excited that they finally are.
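The core procedure is simple enough to sketch: compute pairwise distances between word lists, then build a tree by repeatedly merging the closest groups. Here is a toy version with invented vocabularies and single-linkage clustering on Jaccard distances (the language names and words are made up for illustration; real studies use expert cognate judgments, not raw vocabulary overlap):

```python
from itertools import combinations

# Toy vocabularies -- invented for illustration, not real Aboriginal data.
langs = {
    "A": {"mara", "kuti", "ngapa", "wilu"},
    "B": {"mara", "kuti", "ngapa", "yalu"},
    "C": {"mara", "tina", "kapi", "yalu"},
    "D": {"purda", "tina", "kapi", "warra"},
}

def jaccard_distance(x, y):
    """1 - (shared words / total words): 0 = identical, 1 = no overlap."""
    return 1 - len(x & y) / len(x | y)

def cluster_dist(c1, c2):
    # Single linkage: distance between the two closest members.
    return min(jaccard_distance(langs[a], langs[b])
               for a in c1[1] for b in c2[1])

# Each language starts as its own cluster: (tree_so_far, set_of_members).
clusters = {name: (name, frozenset([name])) for name in langs}

# Repeatedly merge the closest pair until one tree remains.
while len(clusters) > 1:
    k1, k2 = min(combinations(clusters, 2),
                 key=lambda p: cluster_dist(clusters[p[0]], clusters[p[1]]))
    c1, c2 = clusters.pop(k1), clusters.pop(k2)
    clusters[k1 + "+" + k2] = ((c1[0], c2[0]), c1[1] | c2[1])

tree = next(iter(clusters.values()))[0]
print(tree)  # nested tuples showing which languages group together
```

The actual tools (neighbor-joining, Bayesian phylogenetics) are far more sophisticated, but the core idea is the same: pairwise distances in, tree out.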

Now please, someone, put all of the languages + reconstructed proto-languages into the computer and find the most likely trees.

(Sorry, Nick. The regularly scheduled Anthropology Friday is going to have to wait a week. There just aren’t enough days.)

When did Whites Evolve?

Defining exactly who is white is contentious and difficult–so I shan’t. If you want to debate among yourselves whether or not the Irish or Hindus count, that’s your own business.


Here’s Haak et al.’s full graph of human genomes from around the world (see here and here for various discussions). The genomes on the far left are ancient European skeletons; everything from the “pink” section onward is modern. The “African” genomes all have bright blue at their bottoms; Asian (and American Indian) genomes all have yellow. The European countries tend to have a triple-color profile, reflecting their recent (evolutionarily speaking) mix of European hunter-gatherers (dark blue), Middle Eastern farmers (orange), and a “teal” group that came in with the Indo-European speakers, but whose origins we have yet to uncover:


Unsurprisingly, the Basque have less of this “teal.” Middle Easterners, as you can see, are quite similar genetically, but tend to have “purple” instead of “dark blue.”

Physically, of course, whites’ most distinctive feature is pale skin. They are also unique among human clades in their variety of hair and eye colors, ranging from dark to light, and tend to have wavy hair that is “oval” in cross-section. (Africans tend to have curly hair that is flat in cross-section, and Asians tend to have straight hair that is cylindrical in cross-section. See map for more hair details.)

There are other traits–the Wikipedia page on “Caucasian race” (not exactly synonymous with “whites”) notes:

According to George W. Gill and other modern forensic anthropologists, physical traits of Caucasoid crania are generally distinct from those of the Mongoloid and Negroid races. They assert that they can identify a Caucasoid skull with an accuracy of up to 95% by the following features: [20][21][22][23][24]

  • An orthognathic profile, with minimal protrusion of the lower part of the face (little or no prognathism).
  • Retreating zygomatic bones (cheekbones), making the face look more “pointed”.
  • Narrow nasal aperture, with a tear-shaped nasal cavity (nasal fossa).

But I am not going to deal with any of these, unless I hear of something like the EDAR gene coding for a bunch of traits.

Old racial classifications made use of language groups as stand-ins for racial groups. This turns out to be not very reliable, as we’ve found that in many cases, a small group of conquerors has managed to impose its language without imposing its genetics, as you’ve discovered in real life if you’ve ever met an African or Indian who speaks English.

The first known modern humans in Europe (i.e., not Neanderthals nor Homo erectus), popularly known as Cro-Magnons and unpopularly known as European early modern humans (because anthropologists dislike sounding like commoners), lived around 43,000–45,000 years ago in Italy. By 41,000 years ago, Cro-Magnons had reached the southern coast of England.

Humanity’s path out of Africa

(Incidentally, Mungo Man, found in south-east Australia, is also estimated to be about 40,000 years old, suggesting that either:

A. People took a much longer route from Africa to Europe than to Australia
B. Europe was difficult to enter when folks left Africa, possibly because of glaciers or Neanderthals
C. There were multiple Out-of-Africa events, or
D. Our knowledge is incomplete.

D is obviously true, and I favor C regardless of Mungo’s true age.)

source: Wikipedia

These Cro-Magnons appear to have been brown skinned, brown eyed, and black haired–they likely looked more like their close relatives in the Middle East (whatever they looked like) than their distant descendants in modern Europe. (Despite all of the mixing and conquering of the centuries, I think modern Europeans are partially descended from Cro-Magnons, but I could be wrong.)

The Cro-Magnons carved the famous “Venus of Willendorf” (we don’t really know if the figurine was intended as a “goddess” or a fertility figure or just a portrait of a local lady or what, but it’s a nice name), among many other similar figurines, some of them quite stylized.

Venus of Monruz
Venus of Willendorf
Venus of Brassempouy

Some people think the figurines look African, with cornrows or peppercorn hair and steatopygia. Others suggest the figurines are wearing hats or braids, and point out that not all of them are fat or have large butts.



So when did Europeans acquire their modern appearances? Here’s what I’ve found so far:

Wikipedia states:

Variations in the KITL gene have been positively associated with about 20% of melanin concentration differences between African and non-African populations. One of the alleles of the gene has an 80% occurrence rate in Eurasian populations.[52][53] The ASIP gene has a 75–80% variation rate among Eurasian populations compared to 20–25% in African populations.[54] Variations in the SLC24A5 gene account for 20–25% of the variation between dark and light skinned populations of Africa,[55] and appear to have arisen as recently as within the last 10,000 years.[56] The Ala111Thr or rs1426654 polymorphism in the coding region of the SLC24A5 gene reaches fixation in Europe, but is found across the globe, particularly among populations in Northern Africa, the Horn of Africa, West Asia, Central Asia and South Asia.[57][58][59]

The Guardian reports:

According to a team of researchers from Copenhagen University, a single mutation which arose as recently as 6-10,000 years ago was responsible for all the blue-eyed people alive on Earth today.

The team, whose research is published in the journal Human Genetics, identified a single mutation in a gene called OCA2, which arose by chance somewhere around the northwest coasts of the Black Sea in one single individual, about 8,000 years ago.

Wikipedia again:

The hair color gene MC1R has at least seven variants in Europe giving the continent a wide range of hair and eye shades. Based on recent genetic research carried out at three Japanese universities, the date of the genetic mutation that resulted in blond hair in Europe has been isolated to about 11,000 years ago during the last ice age.[25]

Recent archaeological and genetic study published in 2014 found that, seven “Scandinavian hunter-gatherers” found in 7700-year-old Motala archaeological site in southern Sweden had both light skin gene variants, SLC24A5 and SLC45A2, they also had a third gene, HERC2/OCA2, which causes blue eyes and also contribute to lighter skin and blond hair.[29]

Genetic research published in 2014, 2015 and 2016 found that Yamnaya Proto-Indo-Europeans, who migrated to Europe in early bronze age were overwhelmingly dark-eyed (brown), dark-haired and had a skin colour that was moderately light, though somewhat darker than that of the average modern European.[49] While light pigmentation traits had already existed in pre-Indo-European Europeans (both farmers and hunter-gatherers) and long-standing philological attempts to correlate them with the arrival of Indo-Europeans from the steppes were misguided.[50]

According to genetic studies, Yamnaya Proto-Indo-European migration to Europe lead to Corded Ware culture, where Yamnaya Proto-Indo-Europeans mixed with “Scandinavian hunter-gatherer” women who carried genetic alleles HERC2/OCA2, which causes combination of blue eyes and blond hair.[51][52][53] Descendants of this “Corded Ware admixture”, split from Corded Ware culture in every direction forming new branches of Indo-European tree, notably Proto-Greeks, Proto-Italio-Celtic, Proto-Indo-Iranians and Proto-Anatolians.[54] Proto-Indo-Iranians who split from Corded ware culture, formed Andronovo culture and are believed to have spread genetic alleles HERC2/OCA2 that causes blonde hair to parts of West Asia, Central Asia and South Asia.[52]

Genetic analysis in 2014 also found that Afanasevo culture which flourished in Altai Mountains were genetically identical to Yamnaya Proto-Indo-Europeans and that they did not carry genetic alleles for blonde hair or light eyes.[55][51][52] Afanasevo culture was later replaced by second wave of Indo-European invaders from Andronovo culture, who were product of Corded Ware admixture that took place in Europe, and carried genetic alleles that causes blond hair and light eyes.[55][51][52]

Dienekes writes:

An interesting finding [in Ancient human genomes suggest three ancestral populations for present-day Europeans] is that the Luxembourg hunter-gatherer probably had blue eyes (like a Mesolithic La Brana Iberian, a paper on which seems to be in the works) but darker skin than the LBK farmer who had brown eyes but lighter skin. Raghavan et al. did not find light pigmentation in Mal’ta (but that was a very old sample), so with the exception of light eyes that seem established for Western European hunter-gatherers (and may have been “darker” in European steppe populations, but “lighter” in Bronze Age South Siberians?), the origin of depigmentation of many recent Europeans remains a mystery.

Beleza et al, in The Timing of Pigmentation Lightening in Europeans, write:

… we estimate that the onset of the sweep shared by Europeans and East Asians at KITLG occurred approximately 30,000 years ago, after the out-of-Africa migration, whereas the selective sweeps for the European-specific alleles at TYRP1, SLC24A5, and SLC45A2 started much later, within the last 11,000–19,000 years, well after the first migrations of modern humans into Europe.

And finally from Wikipedia:

In a 2015 study based on 230 ancient DNA samples, researchers traced the origins of several genetic adaptations found in Europe.[46] The original mesolithic hunter-gatherers were dark skinned and blue eyed.[46] The HERC2 and OCA2 variations for blue eyes are derived from the original mesolithic hunter-gatherers, and the genes were not found in the Yamna people.[46] The HERC2 variation for blue eyes first appears around 13,000 to 14,000 years ago in Italy and the Caucasus.[38]

The migration of the neolithic farmers into Europe brought along several new adaptations.[46] The variation for light skin color was introduced to Europe by the neolithic farmers.[46] After the arrival of the neolithic farmers, a SLC22A4 mutation was selected for, a mutation which probably arose to deal with ergothioneine deficiency but increases the risk of ulcerative colitis, celiac disease, and irritable bowel disease.

The genetic variations for lactose persistence and greater height came with the Yamna people.[46]

To sum:

Skin: 10,000 years, 11-19,000 years, possibly arriving after blue eyes

Blond hair: 11,000 years

Blue eyes: 6-10,000 years ago, 13,000 to 14,000 years ago

It looks like some of these traits emerged in different populations and later combined as they spread, but they all look like they arose at approximately the same time.

Obviously I have neglected red and brown hair, green and hazel eyes, but the genetics all seem to be related.

Why Geneticists get touchy about Epigenetics

Disclaimer: I am not a geneticist. For those of you who are new here, this is basically a genetics fan blog. I am trying to learn about genetics, and you know what?

Genetics is complicated.

I fully admit that there’s a lot of stuff I don’t yet know or fully understand.

Luckily for me, there are a few genetics basics that are easy enough to understand that even a middle school student can master them:

  1. “Evolution” is the theory that species change over time due to some individuals within them being better at getting food, reproducing, etc., than other individuals, and thereby passing on their superior traits to their children.
  2. “Genes,” (or “DNA,”) are the biological code for all life, and the physical mechanism by which traits are passed down from parent to child.
  3. “Mendel squares” work for modeling the inheritance of simple traits.
  4. More complicated traits are modeled with more complicated math.
  5. Lamarckism doesn’t work.
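Item 3 really is middle-school simple: a Punnett square is just the cross-product of the parents’ alleles. A minimal sketch (B = a hypothetical dominant allele, b = its recessive counterpart):

```python
from itertools import product
from collections import Counter

def punnett(parent1, parent2):
    """Count offspring genotypes from crossing two single-gene parents."""
    # Each parent contributes one allele; sorting makes "bB" and "Bb" match
    # (uppercase sorts before lowercase, so the dominant allele comes first).
    return Counter("".join(sorted(pair))
                   for pair in product(parent1, parent2))

# Classic heterozygote cross: Bb x Bb gives the familiar 1:2:1 ratio.
counts = punnett("Bb", "Bb")
print(counts)  # Counter({'Bb': 2, 'BB': 1, 'bb': 1})
```

If B is dominant, three of the four equally likely offspring show the dominant trait and one shows the recessive–the textbook 3:1 ratio.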

Lamarck was a naturalist who, in the days before genes were discovered, theorized that creatures could pass on “acquired” characteristics. For example, an animal with a relatively normal neck in an area with tall trees might stretch its neck in order to reach the tastiest leaves, and then pass on this longer neck to its children, who would also stretch their necks and pass the trait on to their children, until you get giraffes.

A fellow with similar ideas, Lysenko, was a Soviet scientist who thought he could make strains of cold-tolerant wheat simply by exposing wheat kernels to the cold.

We have the luxury of thinking that Lysenko’s ideas sound silly. The Soviet peasants had to actually try to grow his wheat, and scientists who pointed out that this was nonsense got sent to the gulag.

The problem with Lamarckism is that it doesn’t work. You can’t make wheat grow in Antarctica by sticking it in your freezer for a few months and animals don’t have taller babies just because you stretch their necks.

So what does this have to do with epigenetics?

Pop science articles talk about epigenetics as if it were Lamarckism. Through the magic of epigenetic markers, acquired traits can supposedly be passed down to one’s children and grandchildren, infinitely.

Actual epigenetics, as scientists actually study it, is a real and interesting field. But the effects of epigenetic changes are not so large and permanent as to substantially change most of the way we model genetic inheritance.


Epigenetics is, in essence, part of how you learn. Suppose you play a disturbing noise every time a mouse smells cherries. Pretty soon, the mouse will learn to associate “fear” and “cherry smell,” and according to Wikipedia, this gets encoded at the epigenetic level. Great, the mouse has learned to be afraid of cherries.

If these epigenetic traits get passed on to the mouse’s children–I am not convinced this is possible but let’s assume it is–then those children can inherit their mother’s fear of cherries.

This is pretty neat, but people take it too far when they assume that as a result, the mouse’s fear will persist over many generations, and that you have essentially just bred a new, cherry-fearing strain of mice.

You see, you learn new things all the time. So do mice. Your epigenetics therefore keep changing throughout your life. The older you are, the more your epigenetics have changed since you were born. This is why even identical twins differ in small ways from each other. Sooner or later, the young mice will figure out that there isn’t actually any reason to be afraid of cherries, and they’ll stop being afraid.
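A toy model makes the same point about inheritance (the retention probability here is my own arbitrary assumption, not a measured value): if each generation passes an acquired mark on with probability less than one, the fraction of descendants still carrying it shrinks geometrically.

```python
def fraction_marked(retention, generations):
    """Expected fraction still carrying an acquired mark after n generations,
    assuming (arbitrarily) that each generation retains it with probability
    `retention` and the mark is never re-acquired."""
    return retention ** generations

# With a 50% per-generation survival chance, the signal is nearly gone
# within a handful of generations.
for gen in (1, 2, 5, 10):
    print(gen, fraction_marked(0.5, gen))
```

Even under much more generous retention assumptions, the mark fades within a few generations–which is exactly why epigenetic effects can’t stand in for genetic inheritance.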

If people were actually the multi-generational heirs of their ancestors’ trauma, pretty much everyone in the world would be affected, because we all have at least one ancestor who endured some kind of horrors in their life. The entire continent of Europe should be a PTSD basket case due to WWI, WWII, and the Depression.

Thankfully, this is not what we see.

Epigenetics has some real and very interesting effects, but it’s not Lamarckism 2.0.