When Did Black People Evolve?

In previous posts, we discussed the evolution of Whites and Asians, so today we’re taking a look at people from Sub-Saharan Africa.

Modern humans only left Africa about 100,000 to 70,000 years ago, and split into Asians and Caucasians around 40,000 years ago. Their modern appearances came later–white skin, light hair, and light eyes, for example, only evolved within the past 20,000 years, and possibly within the past 10,000.

What about the Africans, or specifically, Sub-Saharans? (North Africans, like Tunisians and Moroccans, are in the Caucasian clade.) When did their phenotypes evolve?

The Sahara, an enormous desert about the size of the United States, is one of the world’s biggest, most ancient barriers to human travel. The genetic split between Sub-Saharan Africans (SSAs) and non-SSAs, therefore, is one of the oldest and most substantial among human populations. But there are even older splits within Africa–some of the ancestors of today’s Pygmies and Bushmen may have split off from other Africans 200,000-300,000 years ago. We’re not sure, because the study of archaic African DNA is still in its infancy.

Some anthropologists refer to Bushmen as “gracile,” which means they are a little shorter than average Europeans and not stockily built.

The Bushmen present an interesting case, because their skin is quite light (for Africans.) I prefer to call it golden. The nearby Damara of Namibia, by contrast, are one of the world’s darkest peoples. (The peoples of South Sudan, e.g. Malik Agar, may be darker, though.) The Pygmies are the world’s shortest peoples; the peoples of South Sudan, such as the Dinka and Shilluk, are among the world’s tallest.

Sub-Saharan Africa’s ethnic groups can be grouped, very broadly, into Bushmen, Pygmies, Bantus (aka Niger-Congo), Nilotics, and Afro-Asiatics. Bushmen and Pygmies are extremely small groups, while Bantus dominate the continent–about 85% of Sub Saharan Africans speak a language from the Niger-Congo family. The Afro-Asiatic groups, as their name implies, have had extensive contact with North Africa and the Middle East.

Most of America’s black population hails from West Africa–that is, the primarily Bantu region. The Bantus and similar-looking groups among the Nilotics and Afro-Asiatics (like the Hausa) therefore have both Africa’s most iconic and most common phenotypes.

For the sake of this post, we are not interested in the evolution of traits common to all humans, such as bipedalism. We are only interested in those traits generally shared by most Sub-Saharans and generally not shared by people outside of Africa.

detailed map of African and Middle Eastern ethnicities in Haak et al’s dataset

One striking trait is black hair: it is distinctively “curly” or “frizzy.” Chimps and gorillas do not have curly hair. Neither do whites and Asians. (Whites and Asians, therefore, more closely resemble chimps in this regard.) Only Africans and a smattering of other equatorial peoples like Melanesians have frizzy hair.

Black skin is similarly distinct. Chimps, who live in the shaded forest and have fur, do not have high levels of melanin all over their bodies. While chimps naturally vary in skin tone, an unfortunate, hairless chimp is practically “white.”

Humans therefore probably evolved both black skin and frizzy hair at about the same time–when we came out of the shady forests and began running around on the much sunnier savannahs. Frizzy hair seems well-adapted to cooling–by standing on end, it lets air flow between the follicles–and of course melanin protects against the sun’s rays. (And apparently, many of the lighter-skinned Bushmen suffer from skin cancer.)

Steatopygia also comes to mind, though I don’t know if anyone has studied its origins.

According to Wikipedia, additional traits common to Sub-Saharan Africans include:

In modern craniofacial anthropometry, Negroid describes features that typify skulls of black people. These include a broad and round nasal cavity; no dam or nasal sill; Quonset hut-shaped nasal bones; notable facial projection in the jaw and mouth area (prognathism); a rectangular-shaped palate; a square or rectangular eye orbit shape;[21] a large interorbital distance; a more undulating supraorbital ridge;[22] and large, megadontic teeth.[23] …

Modern cross-analysis of osteological variables and genome-wide SNPs has identified specific genes, which control this craniofacial development. Of these genes, DCHS2, RUNX2, GLI3, PAX1 and PAX3 were found to determine nasal morphology, whereas EDAR impacts chin protrusion.[27] …

Ashley Montagu lists “neotenous structural traits in which…Negroids [generally] differ from Caucasoids… flattish nose, flat root of the nose, narrower ears, narrower joints, frontal skull eminences, later closure of premaxillary sutures, less hairy, longer eyelashes, [and] cruciform pattern of second and third molars.”[28]

The Wikipedia page on Dark Skin states:

As hominids gradually lost their fur (between 4.5 and 2 million years ago) to allow for better cooling through sweating, their naked and lightly pigmented skin was exposed to sunlight. In the tropics, natural selection favoured dark-skinned human populations as high levels of skin pigmentation protected against the harmful effects of sunlight. Indigenous populations’ skin reflectance (the amount of sunlight the skin reflects) and the actual UV radiation in a particular geographic area is highly correlated, which supports this idea. Genetic evidence also supports this notion, demonstrating that around 1.2 million years ago there was a strong evolutionary pressure which acted on the development of dark skin pigmentation in early members of the genus Homo.[25]

About 7 million years ago human and chimpanzee lineages diverged, and between 4.5 and 2 million years ago early humans moved out of rainforests to the savannas of East Africa.[23][28] They not only had to cope with more intense sunlight but had to develop a better cooling system. …

Skin colour is a polygenic trait, which means that several different genes are involved in determining a specific phenotype. …

Data collected from studies on MC1R gene has shown that there is a lack of diversity in dark-skinned African samples in the allele of the gene compared to non-African populations. This is remarkable given that the number of polymorphisms for almost all genes in the human gene pool is greater in African samples than in any other geographic region. So, while the MC1R gene does not significantly contribute to variation in skin colour around the world, the allele found in high levels in African populations probably protects against UV radiation and was probably important in the evolution of dark skin.[57][58]

Skin colour seems to vary mostly due to variations in a number of genes of large effect as well as several other genes of small effect (TYR, TYRP1, OCA2, SLC45A2, SLC24A5, MC1R, KITLG and SLC24A4). This does not take into account the effects of epistasis, which would probably increase the number of related genes.[59] Variations in the SLC24A5 gene account for 20–25% of the variation between dark and light skinned populations of Africa,[60] and appear to have arisen as recently as within the last 10,000 years.[61] The Ala111Thr or rs1426654 polymorphism in the coding region of the SLC24A5 gene reaches fixation in Europe, and is also common among populations in North Africa, the Horn of Africa, West Asia, Central Asia and South Asia.[62][63][64]
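The additive, polygenic model the quote describes can be sketched with a toy simulation. This is a minimal sketch only: the locus names are the ones discussed in this post, but the allele frequencies and effect sizes are invented for illustration, and real pigmentation genetics also involves environment and epistasis.

```python
import random

# Toy additive model of a polygenic trait like skin pigmentation.
# Locus names are from the text; the allele frequencies and per-allele
# effect sizes below are INVENTED for illustration only.
loci = {
    "SLC24A5": (0.5, 1.5),   # (derived-allele frequency, effect per allele)
    "MFSD12":  (0.4, 1.2),
    "DDB1":    (0.3, 0.9),
    "TMEM138": (0.3, 0.7),
    "OCA2":    (0.25, 0.8),
    "HERC2":   (0.2, 0.7),
}

def genotype(freq):
    """Number of derived alleles (0, 1, or 2), assuming Hardy-Weinberg."""
    return sum(random.random() < freq for _ in range(2))

def score(person):
    """Additive polygenic score: allele count times effect, summed over loci."""
    return sum(count * effect for count, effect in person)

random.seed(0)
people = [[(genotype(f), e) for f, e in loci.values()] for _ in range(10_000)]
scores = [score(p) for p in people]
mean = sum(scores) / len(scores)
var_total = sum((s - mean) ** 2 for s in scores) / len(scores)

# Under an additive model, a locus with derived-allele frequency p and
# effect size e contributes 2*p*(1-p)*e^2 to the trait variance, so a
# few large-effect loci can dominate the variance "explained."
for name, (p, e) in loci.items():
    print(f"{name}: ~{2 * p * (1 - p) * e**2 / var_total:.0%} of modeled variance")
```

With these made-up numbers the largest-effect locus accounts for the biggest share of modeled variance, which is the sense in which a single gene like SLC24A5 can “account for” a big slice of a polygenic trait.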

That’s rather interesting about MC1R. It could imply that the difference in skin tone between SSAs and non-SSAs is due to active selection in Blacks for dark skin and relaxed selection in non-Blacks, rather than active selection for light skin in non-Blacks.

The page on MC1R states:

MC1R is one of the key proteins involved in regulating mammalian skin and hair color. …It works by controlling the type of melanin being produced, and its activation causes the melanocyte to switch from generating the yellow or red phaeomelanin by default to the brown or black eumelanin in replacement. …

This is consistent with active selection being necessary to produce dark skin, and relaxed selection producing lighter tones.

Studies show the MC1R Arg163Gln allele has a high frequency in East Asia and may be part of the evolution of light skin in East Asian populations.[40] No evidence is known for positive selection of MC1R alleles in Europe[41] and there is no evidence of an association between MC1R and the evolution of light skin in European populations.[42] The lightening of skin color in Europeans and East Asians is an example of convergent evolution.

However, we should also note:

Dark-skinned people living in low sunlight environments have been recorded to be very susceptible to vitamin D deficiency due to reduced vitamin D synthesis. A dark-skinned person requires about six times as much UVB as lightly pigmented persons.

PCA graph and map of sampling locations. Modern people are indicated with gray circles.

Unfortunately, most of the work on human skin tones has been done among Europeans (and, oddly, zebra fish,) limiting our knowledge about the evolution of African skin tones, which is why this post has been sitting in my draft file for months. Luckily, though, two recent studies–Loci Associated with Skin Pigmentation Identified in African Populations and Reconstructing Prehistoric African Population Structure–have shed new light on African evolution.

In Reconstructing Prehistoric African Population Structure, Skoglund et al assembled genetic data from 16 prehistoric Africans and compared them to DNA from nearby present-day Africans. They found:

  1. The ancestors of the Bushmen (aka the San/KhoiSan) once occupied a much wider area.
  2. They contributed about 2/3 of the ancestry of ancient Malawi hunter-gatherers (around 8,100-2,500 YA)
  3. They contributed about 1/3 of the ancestry of ancient Tanzanian hunter-gatherers (around 1,400 YA)
  4. Farmers (Bantus) spread from west Africa, completely replacing hunter-gatherers in some areas
  5. Modern Malawians are almost entirely Bantu.
  6. A Tanzanian pastoralist population from 3,100 YA spread out across east Africa and into southern Africa
  7. Bushmen ancestry was not found in modern Hadza, even though they are hunter-gatherers and speak a click language like the Bushmen.
  8. The Hadza more likely derive most of their ancestry from ancient Ethiopians
  9. Modern Bantu-speakers in Kenya derive from a mix between western Africans and Nilotics around 800-400 years ago.
  10. Middle Eastern (Levant) ancestry is found across eastern Africa from an admixture event that occurred around 3,000 YA, or around the same time as the Bronze Age Collapse.
  11. A small amount of Iranian DNA arrived more recently in the Horn of Africa
  12. Ancient Bushmen were more closely related to modern eastern Africans like the Dinka (Nilotics) and Hadza than to modern west Africans (Bantus).
  13. This suggests either complex relationships between the groups or that some Bantus may have had ancestors from an unknown group of humans more ancient than the Bushmen.
  14. Modern Bushmen have been evolving darker skins
  15. Pygmies have been evolving shorter stature
Automated clustering of ancient and modern populations (moderns in gray)

I missed #12-13 on my previous post about this paper, though I did note that the more data we get on ancient African groups, the more likely I think we are to find ancient admixture events. If humans can mix with Neanderthals and Denisovans, then surely our ancestors could have mixed with Ergaster, Erectus, or whomever else was wandering around.

Distribution of ancient Bushmen and Ethiopian DNA in south and east Africa

#14 is interesting, and consistent with the claim that Bushmen suffer from a lot of skin cancer–before the Bantu expansion, they lived in far more forgiving climates than the Kalahari desert. But since Bushmen are already lighter than their neighbors, this raises the question of how light their ancestors–who had no Levantine admixture–were. Could the Bantus’ and Nilotics’ darker skins have evolved after the Bushmen/everyone else split?

Meanwhile, in Loci Associated with Skin Pigmentation Identified in African Populations, Crawford et al used genetic samples from 1,570 people from across Africa to find six genetic areas–SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2–which account for almost 30% of the local variation in skin color.

Bantu (green) and Levantine/pastoralist DNA in modern peoples

SLC24A5 is a light pigment introduced to east Africa from the Levant, probably around 3,000 years ago. Today, it is common in Ethiopia and Tanzania.

Interestingly, according to the article, “At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations.”

These are the world’s other darkest peoples, such as the Jarawas of the Andaman Islands or the Melanesians of Bougainville, PNG. (And, I assume, some groups from India such as the Tamils.) This implies that these groups 1. had dark skin already when they left Africa, and 2. Never lost it on their way to their current homes. (If they had gotten lighter during their journey and then darkened again upon arrival, they likely would have different skin color variants than their African cousins.)

This implies that even if the Bushmen split off (around 200,000-300,000 YA) before dark skin evolved, it had evolved by the time people left Africa and headed toward Australia (around 100,000-70,000 YA.) This gives us a minimum threshold: it most likely evolved before 70,000 YA.

(But as always, we should be careful because perhaps there are even more skin color variants that we don’t know about yet in these populations.)

MFSD12 is common among Nilotics and is related to darker skin.

And according to the abstract, which Razib Khan posted:

Further, the alleles associated with skin pigmentation at all loci but SLC24A5 are ancient, predating the origin of modern humans. The ancestral alleles at the majority of predicted causal SNPs are associated with light skin, raising the possibility that the ancestors of modern humans could have had relatively light skin color, as is observed in the San population today.

The full article is not out yet, so I still don’t know when all of these light and dark alleles emerged, but the order is absolutely intriguing. For now, it looks like this mystery will still have to wait.



Two Exciting Papers on African Genetics

I loved that movie
Nǃxau ǂToma, (aka Gcao Tekene Coma,) Bushman star of “The Gods Must be Crazy,” roughly 1944-2003

An interesting article on Clues to Africa’s Mysterious Past appeared recently in the NY Times:

It was only two years ago that researchers found the first ancient human genome in Africa: a skeleton in a cave in Ethiopia yielded DNA that turned out to be 4,500 years old.

On Thursday, an international team of scientists reported that they had recovered far older genes from bone fragments in Malawi dating back 8,100 years. The researchers also retrieved DNA from 15 other ancient people in eastern and southern Africa, and compared the genes to those of living Africans.

Let’s skip to the article, Reconstructing Prehistoric African Population Structure by Skoglund et al:

We assembled genome-wide data from 16 prehistoric Africans. We show that the anciently divergent lineage that comprises the primary ancestry of the southern African San had a wider distribution in the past, contributing approximately two-thirds of the ancestry of Malawi hunter-gatherers ∼8,100–2,500 years ago and approximately one-third of the ancestry of Tanzanian hunter-gatherers ∼1,400 years ago.

Paths of the great Bantu Migration

The San are also known as the Bushmen, a famous group of recent hunter-gatherers from southern Africa.

We document how the spread of farmers from western Africa involved complete replacement of local hunter-gatherers in some regions…

This is most likely the Great Bantu Migration, which I wrote about in Into Africa: the Great Bantu Migration.

…and we track the spread of herders by showing that the population of a ∼3,100-year-old pastoralist from Tanzania contributed ancestry to people from northeastern to southern Africa, including a ∼1,200-year-old southern African pastoralist…

Whereas the two individuals buried in ∼2,000 BP hunter-gatherer contexts in South Africa share ancestry with southern African Khoe-San populations in the PCA, 11 of the 12 ancient individuals who lived in eastern and south-central Africa between ∼8,100 and ∼400 BP form a gradient of relatedness to the eastern African Hadza on the one hand and southern African Khoe-San on the other (Figure 1A).

The Hadza are a hunter-gatherer group from Tanzania who are not obviously related to any other people. Their language has traditionally been classed alongside the languages of the KhoiSan/Bushmen people because they all contain clicks, but the languages otherwise have very little in common and Hadza appears to be a language isolate, like Basque.

The genetic cline correlates to geography, running along a north-south axis with ancient individuals from Ethiopia (∼4,500 BP), Kenya (∼400 BP), Tanzania (both ∼1,400 BP), and Malawi (∼8,100–2,500 BP), showing increasing affinity to southern Africans (both ancient individuals and present-day Khoe-San). The seven individuals from Malawi show no clear heterogeneity, indicating a long-standing and distinctive population in ancient Malawi that persisted for at least ∼5,000 years (the minimum span of our radiocarbon dates) but which no longer exists today. …

We find that ancestry closely related to the ancient southern Africans was present much farther north and east in the past than is apparent today. This ancient southern African ancestry comprises up to 91% of the ancestry of Khoe-San groups today (Table S5), and also 31% ± 3% of the ancestry of Tanzania_Zanzibar_1400BP, 60% ± 6% of the ancestry of Malawi_Fingira_6100BP, and 65% ± 3% of the ancestry of Malawi_Fingira_2500BP (Figure 2A). …

Both unsupervised clustering (Figure 1B) and formal ancestry estimation (Figure 2B) suggest that individuals from the Hadza group in Tanzania can be modeled as deriving all their ancestry from a lineage related deeply to ancient eastern Africans such as the Ethiopia_4500BP individual …

So what’s up with the Tanzanian expansion mentioned in the summary?

Western-Eurasian-related ancestry is pervasive in eastern Africa today … and the timing of this admixture has been estimated to be ∼3,000 BP on average… We found that the ∼3,100 BP individual… associated with a Savanna Pastoral Neolithic archeological tradition, could be modeled as having 38% ± 1% of her ancestry related to the nearly 10,000-year-old pre-pottery farmers of the Levant. These results could be explained by migration into Africa from descendants of pre-pottery Levantine farmers or alternatively by a scenario in which both pre-pottery Levantine farmers and Tanzania_Luxmanda_3100BP descend from a common ancestral population that lived thousands of years earlier in Africa or the Near East. We fit the remaining approximately two-thirds of Tanzania_Luxmanda_3100BP as most closely related to the Ethiopia_4500BP…

…present-day Cushitic speakers such as the Somali cannot be fit simply as having Tanzania_Luxmanda_3100BP ancestry. The best fitting model for the Somali includes Tanzania_Luxmanda_3100BP ancestry, Dinka-related ancestry, and 16% ± 3% Iranian-Neolithic-related ancestry (p = 0.015). This suggests that ancestry related to the Iranian Neolithic appeared in eastern Africa after earlier gene flow related to Levant Neolithic populations, a scenario that is made more plausible by the genetic evidence of admixture of Iranian-Neolithic-related ancestry throughout the Levant by the time of the Bronze Age …and in ancient Egypt by the Iron Age …

There is then a discussion of possible models of ancient African population splits (were the Bushmen the first? How long have they been isolated?) I suspect the more ancient African DNA we uncover, the more complicated the tree will become, just as in Europe and Asia we’ve discovered Neanderthal and Denisovan admixture.

They also compared genomes to look for genetic adaptations and found evidence for selection for taste receptors and “response to radiation” in the Bushmen, which the authors note “could be due to exposure to sunlight associated with the life of the ‡Khomani and Ju|’hoan North people in the Kalahari Basin, which has become a refuge for hunter-gatherer populations in the last millennia due to encroachment by pastoralist and agriculturalist groups.”

(The Bushmen are lighter than Bantus, with a more golden or tan skin tone.)

They also found evidence of selection for short stature among the Pygmies (which isn’t really surprising to anyone, unless you thought they had acquired their heights by admixture with another very short group of people.)

Overall, this is a great paper and I encourage you to RTWT, especially the pictures/graphs.

Now, if that’s not enough African DNA for you, we also have Loci Associated with Skin Pigmentation Identified in African Populations, by Crawford et al:

Examining ethnically diverse African genomes, we identify variants in or near SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2 that are significantly associated with skin pigmentation. Genetic evidence indicates that the light pigmentation variant at SLC24A5 was introduced into East Africa by gene flow from non-Africans. At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations. Functional analyses indicate that MFSD12 encodes a lysosomal protein that affects melanogenesis in zebrafish and mice, and that mutations in melanocyte-specific regulatory regions near DDB1/TMEM138 correlate with expression of UV response genes under selection in Eurasians.

I’ve had an essay on the evolution of African skin tones sitting in my draft folder for ages because this research hadn’t been done. There’s plenty of research on European and Asian skin tones (skin appears to have significantly lightened around 10,000 years ago in Europeans,) but much less on Africans. Luckily for me, this paper fixes that.

Looks like SLC24A5 is related to that Levantine/Iranian back-migration into Africa documented in the first paper.

Parsis, Travellers, and Human Niches

Irish Travellers, 1954


Why are there many kinds of plants and animals? Why doesn’t the best out-compete, eat, and destroy the others, rising to be the sole dominant species on Earth?

In ecology, a niche is an organism’s specific place within the environment. Some animals eat plants; some eat dung. Some live in the sea; others in trees. Different plants flower and grow in different seasons; some are pollinated by bees and some by flies. Every species has its specific niche.

The Competitive Exclusion Principle (aka Gause’s Law) states that ‘no two species can occupy the same niche’ (or positively, ‘two species coexisting must have different niches.’) For example, if squirrels and chipmunks both want to nest in the treetops and eat nuts, (and there are limited treetops and nuts,) then over time, whichever species is better at finding nuts and controlling the treetops will dominate the niche and the other, less successful species will have to find a new niche.

If squirrels are dominating the treetops and nuts, this leaves plenty of room for rabbits to eat grass and owls to eat squirrels.
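Gause’s law is often formalized with the Lotka-Volterra competition equations. Here is a minimal sketch of the squirrel/chipmunk example, with all parameter values invented for illustration: when one species’ competitive pressure on the other exceeds what it receives in return, the weaker competitor is driven out of the niche entirely.

```python
# Lotka-Volterra two-species competition, a standard formalization of
# Gause's law. All parameter values here are INVENTED for illustration.
def simulate(n1, n2, r1=0.3, r2=0.3, k1=100.0, k2=100.0,
             alpha=1.2, beta=0.8, dt=0.1, steps=5000):
    """Euler-integrate dN1/dt = r1*N1*(1 - (N1 + alpha*N2)/K1) and the
    symmetric equation for N2. alpha and beta measure how strongly each
    species suppresses the other's growth relative to its own crowding."""
    for _ in range(steps):
        dn1 = r1 * n1 * (1 - (n1 + alpha * n2) / k1)
        dn2 = r2 * n2 * (1 - (n2 + beta * n1) / k2)
        n1, n2 = n1 + dn1 * dt, n2 + dn2 * dt
    return n1, n2

# Species 2 (the better nut-finder) competes harder than it is competed
# against (alpha > 1 > beta), so it takes over the niche and species 1
# is competitively excluded.
chipmunks, squirrels = simulate(50.0, 50.0)
print(f"chipmunks: {chipmunks:.1f}, squirrels: {squirrels:.1f}")
```

Starting from equal populations, the chipmunk population collapses toward zero while the squirrels approach their carrying capacity–exclusion, not coexistence. (With both alpha and beta below 1, the same model yields stable coexistence, i.e., two distinct niches.)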

II. So I was reading recently about the Parsis and the Travellers. The Parsis, as we discussed on Monday, are Zoroastrians, originally from Persia (modern-day Iran,) who settled in India about a thousand years ago. They’re often referred to as the “Jews of India” because they played a similar role in Indian society to that historically played by Jews in Europe.*

*Yes I know there are actual Jews in India.

The Travellers are an Irish group that’s functionally similar to Gypsies, but in fact genetically Irish:

In 2011 an analysis of DNA from 40 Travellers was undertaken at the Royal College of Surgeons in Dublin and the University of Edinburgh. The study provided evidence that Irish Travellers are a distinct Irish ethnic minority, who separated from the settled Irish community at least 1000 years ago; the claim was made that they are as distinct from the settled community as Icelanders are from Norwegians.[36]

It appears that Ireland did not have enough Gypsies of Indian extraction and so had to invent its own.

And though I originally meant that only in jest, why not? Gypsies occupy a particular niche, and if there are Gypsies around, I doubt anyone else is going to out-compete them for that niche. But if there aren’t any, then surely someone else could.

According to Wikipedia, the Travellers traditionally were tinkers, mending tinware (like pots) and acquiring dead/old horses for slaughter.

The Gypsies appear to have been originally itinerant musicians/entertainers, but have also worked as tinkers, smiths, peddlers, miners, and horse traders (today, car salesmen.)

These are not glorious jobs, but they are jobs, and peripatetic people have done them.

Jews (and Parsis, presumably) also filled a social niche, using their network of family/religious ties to other Jews throughout the diaspora as the basis of a high-trust business/trading network at a time when trade was difficult and routes were dangerous.

On the subject of “Magdeburg rights” or law in Eastern Europe, Wikipedia notes:

In medieval Poland, Jews were invited along with German merchants to settle in cities as part of the royal city development policy.

Jews and Germans were sometimes competitors in those cities. Jews lived under privileges that they carefully negotiated with the king or emperor. They were not subject to city jurisdiction. These privileges guaranteed that they could maintain communal autonomy, live according to their laws, and be subjected directly to the royal jurisdiction in matters concerning Jews and Christians. One of the provisions granted to Jews was that a Jew could not be made Gewährsmann, that is, he could not be compelled to tell from whom he acquired any object which had been sold or pledged to him and which was found in his possession. Other provisions frequently mentioned were a permission to sell meat to Christians, or employ Christian servants.

External merchants coming into the city were not allowed to trade on their own, but instead forced to sell the goods they had brought into the city to local traders, if any wished to buy them.

Note that this situation is immensely better if you already know the guy you’re selling to inside the city and he’s not inclined to cheat you because you both come from the same small, tight-knit group.


Under Bolesław III (1102–1139), the Jews, encouraged by the tolerant regime of this ruler, settled throughout Poland, including over the border in Lithuanian territory as far as Kiev.[32] Bolesław III recognized the utility of Jews in the development of the commercial interests of his country. … Mieszko III employed Jews in his mint as engravers and technical supervisors, and the coins minted during that period even bear Hebraic markings.[30] … Jews enjoyed undisturbed peace and prosperity in the many principalities into which the country was then divided; they formed the middle class in a country where the general population consisted of landlords (developing into szlachta, the unique Polish nobility) and peasants, and they were instrumental in promoting the commercial interests of the land.

If you need merchants and goldsmiths, someone will become merchants and goldsmiths. If it’s useful for those merchants and goldsmiths to all be part of one small, close-knit group, then a small, close-knit group is likely to step into that niche and out-compete anyone else trying to occupy it.

The similarity of the Parsis to the Jews probably has less to do with them both being monotheists (after all, Christians, Muslims, and Sikhs are also monotheists,) and more to do with them both being small but widely-flung diasporic communities united by a common religion that allows them to use their group as a source of trustworthy business partners.

Over hundreds or thousands of years, humans might not just move into social niches, but actually become adapted to them–Jews and Parsis are both reported to be very smart, for example.

III. I can think of several other cases of ethnic groups moving into a particular niche. In the US, the gambling and bootleg alcohol trade were long dominated by ethnic Sicilians, while the crack and heroin trades have been dominated by black and Hispanic gangs.

Note that, while these activities are (often) illegal, they are still things that people want to do, and the mafia/gangs are basically providing goods and services to their customers. As they see it, they’re just businessmen. They’re out to make money, not commit random violence.

That said, these guys do commit lots of violence, including murder, blackmail and extortion. Even violent crime can be its own niche, if it pays well enough.

(Ironically, police crackdown on ethnic Sicilian control in NYC coincided with a massive increase in crime–did the mafia, by controlling a particular territory, keep out competing bands of criminals?)

On a more pleasant note, society is now rich enough that many people can make a living as professional sports stars, marry other professional sports stars, and have children who go on to also be professional sports stars. It’s not quite at the level of “a caste of professional athletes genetically optimized for particular sports,” but if this kept up for a few hundred years, it could be.

Similarly, over in Nepal, “Sherpa” isn’t just a job, it’s an ethnic group. Sherpas, due to their high elevation adaptation, have an advantage over the rest of us when it comes to scaling Mt. Everest, and I hear the global mountain-climbing industry pays them well for their services. A Sherpa who can successfully scale Mt. Everest many times, make lots of money, and raise lots of children in an otherwise impoverished nation is thus a successful Sherpa–and contributing to the group’s further genetic and cultural specialization in the “climbing Mt. Everest” niche.

India, of course, is the ultimate case of ethnic groups specializing into specific jobs–it’d be interesting to see what adaptations different groups have acquired over the years.

I also wonder if the caste system is an effective way to minimize competition between groups in a multi-ethnic society, or if it leads to more resentment and instability in the long run.

Peak Dog vs. Degenerate Dog?

This is Balto, the famous Siberian Husky sled dog who led his team on the final leg of the 1925 serum run to Nome, Alaska. The windchill of the whiteout blizzard when Balto set out was −70 °F. The team traveled all night, with almost no visibility, over the 600-foot Topkok Mountain, and reached Nome at 5:30 AM.

Balto is not the only dog who deserves credit–Togo took a longer and even more dangerous stretch of the run.

And this is a modern Siberian Husky:

Now, don’t get me wrong. He’s a beautiful dog. But he’s a very different dog. I think he’s trying to turn into a German Shepherd-wolf hybrid. Balto practically looks like a corgi next to him.

Siberian huskies were bred by people who depended on them for their lives, and had to endure some of nature’s very harshest weather. We moderns, by contrast, like to keep our dogs inside our warm, comfortable houses to play with our kids or guard our stuff. Have modern huskies been bred for looks rather than sled-pulling?

On the other hand, winning times for the Iditarod have dropped from 20 days to just 8 since the race began in the 1970s, so clearly there are some very fast huskies out there.

The Negritos of Sundaland, Sahul, and the Philippines

Ati (Negrito) woman from the Philippines

The Negritos are a fascinating group of short-statured, dark-skinned, frizzy-haired peoples from southeast Asia–chiefly the Andaman Islands, Malaysia, Philippines, and Thailand. (Spelling note: “Negritoes” is also an acceptable plural, and some sources use the Spanish Negrillos.)

Because of their appearance, they have long been associated with African peoples, especially the Pygmies. “Pygmy” is formally defined as any group whose adult men average 4’11” or less, and it is almost always used specifically to refer to African Pygmies; the term “pygmoid” is sometimes used for groups whose men average 5’1″ or below, including the Negritos. (Some of the Bushmen tribes, Bolivians, Amazonians, the remote Taron, and a variety of others may also be pygmoid, by this definition.)

However, genetic testing has long indicated that they, along with other Melanesians and Australian Aborigines, are more closely related to other east Asian peoples than any African groups. In other words, they’re part of the greater Asian race, albeit a distant branch of it.

But how distant? And are the various Negrito groups closely related to each other, or do there just happen to be a variety of short groups of people in the area, perhaps due to convergent evolution triggered by insular dwarfism?

From Wikimedia

In Discerning the origins of the Negritos, First Sundaland Peoples: deep divergence and archaic admixture, Jinam et al gathered genetic data from Filipino, Malaysian, and Andamanese Negrito populations, and compared them both to each other and to other Asian, African, and European groups. (Be sure to download the supplementary materials to get all of the graphs and maps.)

They found that the Negrito groups they studied “are basal to other East and Southeast Asians,” (basal: forming the bottom layer or base; in this case, it means they split off first,) “and that they diverged from West Eurasians at least 38,000 years ago.” (West Eurasians: Caucasians, consisting of Europeans, Middle Easterners, North Africans, and people from India.) “We also found relatively high traces of Denisovan admixture in the Philippine Negritos, but not in the Malaysian and Andamanese groups.” (Denisovans are a group of extinct humans similar to Neanderthals, but we’ve yet to find many of their bones. Just as Neanderthal DNA shows up in non-Sub-Saharan-Africans, so Denisovan DNA shows up in Melanesians.)

Figure 1 (A) shows PC analysis of Andamanese, Malaysian, and Philippine Negritos, revealing three distinct clusters:

In the upper right-hand corner, the Aeta, Agta, Batak, and Mamanwa are Philippine Negritos. The Manobo are non-Negrito Filipinos.

In the lower right-hand corner are the Jehai, Kintak, and Batek, the Malaysian Negritos.

And in the upper left, we have the extremely isolated Andamanese Onge and Jarawa Negritos.

(Phil-NN and Mly-NN I believe are Filipino and Malaysian Non-Negritos.)

You can find the same chart, but flipped upside down, with Papuan and Melanesian DNA added, in the supplemental materials. Of the three groups, the Papuans and Melanesians cluster closest to the Philippine Negritos, along the same axis as the Malaysians.

By excluding the Andamanese (and Kintak) Negritos, Figure 1 (B) allows a closer look at the structure of the Philippine Negritos.

The Agta, Aeta, and Batak form a horizontal “comet-like pattern,” which likely indicates admixture with non-Negrito Philippine groups like the Manobo. The Mamanwa, who hail from a different part of the Philippines, also show this comet-like pattern, but along a different axis–likely because they intermixed with the different Filipinos who lived in their area. As you can see, there’s a fair amount of overlap–several of the Manobo individuals cluster with the Mamanwa Negritos, and the Batak cluster near several non-Negrito groups (see supplemental chart S4 B)–suggesting high amounts of mixing between these groups.

ADMIXTURE analysis reveals a similar picture. The non-Negrito Filipino groups show up primarily as orange. The Aeta, Agta, and Batak form a clear genetic cluster with each other and a cline toward the orange Filipinos, with the Aeta the least admixed and the Batak the most.

The white area on the chart isn’t a data error, but the unique signature of the geographically separated Mamanwa, who are highly mixed with the Manobo–and the Manobo, in turn, are mixed with them.

But this alone doesn’t tell us how ancient these populations are, nor if they’re descended from one ancestral population. For this, the authors constructed several phylogenetic trees, based on all of the data at hand and assuming from 0 to 5 admixture events. The one on the left assumes 5 events, but for clarity only shows three of them. The Denisovan DNA is fascinating and well-documented elsewhere in Melanesian populations; that Malaysian and Philippine Negritos mixed with their neighbors is also known, supporting the choice of this tree as the most likely to be accurate.

Regardless of which you pick, all of the trees show very similar results, with the biggest difference being whether the Melanesians/Papuans split before or after the Andamanese/Malaysian Negritos.

In case you are unfamiliar with these trees, I’ll run down a quick explanation: This is a human family tree, with each split showing where one group of humans split off from the others and became an isolated group with its own unique genetic patterns. The orange and red lines mark places where formerly isolated groups met and interbred, producing children that are a mix of both. The first split in the tree, going back hundreds of thousands of years, is between all Homo sapiens (our species) and the Denisovans, a sister species related to the Neanderthals.

All humans outside of sub-Saharan Africa have some Neanderthal DNA because their ancestors met and interbred with Neanderthals on their way Out of Africa. Melanesians, Papuans, and some Negritos also have some Denisovan DNA, because their ancestors met and made children with members of this obscure human species, but Denisovan DNA is quite rare outside these groups.

Here is a map of the Denisovan DNA levels the authors found, with 4% of Papuan DNA hailing from Denisovan ancestors, and the Aeta nearly as high. By contrast, the Andamanese Negritos appear to have zero Denisovan DNA. Either the Andamanese split off before the ancestors of the Philippine Negritos and Papuans met the Denisovans, or all Denisovan DNA has been purged from their bloodlines, perhaps because it just wasn’t helpful for surviving on their islands.

Back to the Tree: the second node is where the Biaka, a group of Pygmies from the Congo Rainforest in central Africa, split off. Pygmy lineages are among the most ancient on earth, potentially going back over 200,000 years, well before any Homo sapiens had left Africa.

The next group to split off from the rest of humanity is the Yoruba, a single ethnic group chosen to stand in for the entirety of the Bantus. Bantus are the group you most likely think of when you think of black Africans, because over the past three millennia they have expanded greatly and conquered most of sub-Saharan Africa.

Next we have the Out of Africa event and the split between Caucasians (here represented by the French) and the greater Asian clade, which includes Australian Aborigines, Melanesians, Polynesians, Chinese, Japanese, Siberians, Inuit, and Native Americans.

The first groups to split off from the greater Asian clade (aka race) were the Andamanese and Malaysian Negritos, followed by the Papuans/Melanesians. Australian Aborigines are closely related to Papuans, as Australia and Papua New Guinea were connected in a single continent (called Sahul) back during the last Ice Age. Most of Indonesia and parts of the Philippines were also connected into a single landmass, called Sunda. Sensibly, people reached Sunda before Sahul. (Perhaps at that time the Andaman Islands, to the northwest of Sumatra, were also connected or at least closer to the mainland.)

Irrespective of the exact order in which Melanesians and individual Negrito groups split off, they all split well before all of the other Asian groups in the area.

This is supported by legends told by the Filipinos themselves:

Legends, such as those involving the Ten Bornean Datus and the Binirayan Festival, tell tales about how, at the beginning of the 12th century when Indonesia and Philippines were under the rule of Indianized native kingdoms, the ancestors of the Bisaya escaped from Borneo from the persecution of Rajah Makatunaw. Led by Datu Puti and Datu Sumakwel and sailing with boats called balangays, they landed near a river called Suaragan, on the southwest coast of Panay, (the place then known as Aninipay), and bartered the land from an Ati [Negrito] headman named Polpolan and his son Marikudo for the price of a necklace and one golden salakot. The hills were left to the Atis while the plains and rivers to the Malays. This meeting is commemorated through the Ati-atihan festival.[4]

The study’s authors estimate that the Negritos split from Europeans (Caucasians) around 30,000–38,000 years ago, and that the Malaysian and Philippine Negritos split around 13,000–15,000 years ago. (This all seems a bit tentative, IMO, especially since we have physical evidence of people in the area going back much further than that, and the authors themselves admit in the discussion that their time estimate may be too short.)

The authors also note:

Both our NJ (fig. 3A) and UPGMA (supplementary fig. S10) trees show that after divergence from Europeans, the ancestral Asians subsequently split into Papuans, Negritos and East Asians, implying a one-wave colonization of Asia. … This is in contrast to the study based on whole genome sequences that suggested Australian Aboriginal/Papuan first split from European/East Asians 60 kya, and later Europeans and East Asians diverged 40 kya (Malaspinas et al. 2016). This implies a two-wave migration into Asia…

The matter is still up for debate/more study.

Negrito couple from the Andaman Islands

In conclusion: All of the Negrito groups are likely descended from a common ancestor, (rather than having evolved from separate groups that happened to develop similar body types due to exposure to similar environments,) and were among the very first inhabitants of their regions. Despite their short stature, they are more closely related to other Asian groups (like the Chinese) than to African Pygmies. Significant mixing with their neighbors, however, is quickly obscuring their ancient lineages.

I wonder if all ancient human groups were originally short, and if height is a recently evolved trait in some groups?

In closing, I’d like to thank Jinam et al for their hard work in writing this article and making it available to the public, their sponsors, and the unique Negrito peoples themselves for surviving so long.

Thermodynamics and Urban Sprawl

Termite Mound

Evolution is just a special case of thermodynamics. Molecules spontaneously arrange themselves to optimally dissipate energy.

Society itself is a thermodynamic system for entropy dissipation. Energy goes in–in the form of food and, recently, fuels like oil–and children and buildings come out.

Government is simply the entire power structure of a region–from the President to your dad, from bandits to your boss. But when people say, “government,” they typically mean the official one written down in laws that lives in white buildings in Washington, DC.


When the “government” makes laws that try to change the natural flow of energy or information through society, society responds by routing around the law, just as water flows around a boulder that falls in a stream.

The ban on trade with Britain and France in the early 1800s (the Embargo Act of 1807), for example, did not actually stop people from trading with Britain and France–trade just became re-routed through smuggling operations. It took a great deal of energy–in the form of navies–to suppress piracy and smuggling in the Gulf and Caribbean–chiefly by executing pirates and imprisoning smugglers.


When the government decided, in Griggs v. Duke Power, that companies couldn’t use IQ tests in hiring anymore (because IQ tests have a “disparate impact” on minorities, since black people tend to score worse, on average, than whites,) companies didn’t start hiring more black folks. They just started using college degrees as a proxy for intelligence, contributing to the soul-crushing debt and degree inflation young people know and love today.

Similarly, when the government tried to stop companies from asking about applicants’ criminal histories–again, because the results were disproportionately bad for minorities–companies didn’t start hiring more blacks. Since not hiring criminals is important to companies, HR departments turned to the next best metric: race. These laws ironically led to fewer blacks being hired, not more.

Where the government has tried to protect the poor by passing tenant’s rights laws, we actually see the opposite: poorer tenants are harmed. By making it harder to evict tenants, the government makes landlords reluctant to take on high-risk (ie, poor) tenants.

The passage of various anti-discrimination and subsidized housing laws (as well as the repeal of various discriminatory laws throughout the mid-20th century) led to the growth of urban ghettos, which in turn triggered the crime wave of the 70s, 80s, and 90s.

Crime and urban decay have made inner cities–some of the most valuable real estate in the country–nigh unlivable, resulting in the “flight” of millions of residents and the collective loss of millions of dollars due to plummeting home values.

Work-arounds are not cheap. They are less efficient–and thus more expensive–than the previous, banned system.

Urban sprawl driven by white flight

Smuggled goods cost more than legally traded goods due to the personal risks smugglers must take. If companies can’t tell who is and isn’t a criminal, the cost of avoiding criminals becomes turning down good employees just because they happen to be black. If companies can’t directly test intelligence, the cost becomes a massive increase in the amount of money being spent on accreditation and devaluation of the signaling power of a degree.

We have dug up literally billions of dollars’ worth of concentrated sunlight in the form of fossil fuels in order to rebuild our nation’s infrastructure to work around the criminal blights in the centers of our cities, condemning workers to hour-long commutes and paying inflated prices for homes in neighborhoods with “good schools.”

Note: this is not an argument against laws. Some laws increase efficiency. Some laws make life better.

This is a reminder that everything is subject to thermodynamics. Nothing is free.

Existential Caprine


You were



There were predators

The lions could be confusing

But you were free

goat painting, Herculaneum

Then came men

Faster, smarter than lions

They killed the wolves

Brought you food

(The bread of slavery, they say, is far sweeter than the bread of freedom.)

And shelter

Children were born, safe from wolves, hunger, or cold

and you grew used to man.

Centuries passed

And it seemed you outnumbered the stars

Perhaps your sons disappeared

But was it worse than wolves?

You could almost forget you were once wild

Could you return to the mountains, even if you wanted to?

And as they lead you away

You ask

Did I ever have a choice?


To explain: The process of domestication is fascinating. Some animals, like wolves, began associating with humans because they could pick up our scraps. Others, like cats, began living in our cities because they liked eating the vermin we attracted. (You might say the mice, too, are domesticated.) These relationships are obviously mutually beneficial (aside from the mice.)

The animals we eat, though, have a different–more existential–story.

Humans increased the number of wild goats and sheep available for them to eat by eliminating competing predators, like wolves and lions. We brought them food in the winter, built them shelters to keep them warm, and led them to the best pastures. As a result, their numbers increased.

But, of course, we eat them.

From the goat’s perspective, is it worth it?

There’s a wonderful metaphor in the Bible, enacted every Passover: matzoh.

If you’ve never had it, matzoh tastes like saltines, only worse. It’s the bread of freedom, hastily thrown on the fire and carried away.

The bread of slavery tastes delicious. The bread of freedom tastes awful.

1 And they took their journey from Elim, and all the congregation of the children of Israel came unto the wilderness of Sin, which is between Elim and Sinai, on the fifteenth day of the second month after their departing out of the land of Egypt. 2 And the whole congregation of the children of Israel murmured against Moses and Aaron in the wilderness: 3 And the children of Israel said unto them, Would to God we had died by the hand of the LORD in the land of Egypt, when we sat by the flesh pots, and when we did eat bread to the full… Exodus 16

Even if the goats didn’t want to be domesticated, hated it and fought against it, did they have any choice? If the domesticated goats have more surviving children than wild ones, then goats will become domesticated. It’s a simple matter of numbers:

Total Fertility Rate by Country: Purple = 7 children per woman; Blue = 1 child per woman

The future belongs to those who show up.

Which future do you choose?

Evolution is slow–until it’s fast: Genetic Load and the Future of Humanity

Source: Priceonomics

A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.

But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with English moths, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees they lived on. To survive, these moths need to avoid being eaten by birds, so any moth that stands out against the tree trunks tends to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten. Against dark-colored trees, light-colored moths stood out.

This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.

The time it takes for evolution to occur depends simply on (A) the frequency of a trait in the population and (B) how strongly you are selecting for (or against) it.

Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.

(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)

Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)

You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).

Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.

Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.

No matter what kind of selection you have, nor what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.
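That arithmetic is easy to sketch in code. Here is a toy simulation of differential reproduction; the 1% starting frequency and the 20% fitness edge are assumptions chosen for illustration, not estimates from the actual moth data:

```python
# Toy model of selection: dark moths start rare but leave more offspring.
# The starting frequency and fitness values are made-up illustrative numbers.

def next_freq(p, w_dark, w_light):
    """One generation of selection. p is the frequency of dark moths;
    w_dark and w_light are average offspring per dark/light moth."""
    mean_fitness = p * w_dark + (1 - p) * w_light
    return p * w_dark / mean_fitness

p = 0.01  # dark moths virtually unknown
for generation in range(85):  # roughly 1810 to 1895, one generation a year
    p = next_freq(p, w_dark=1.2, w_light=1.0)  # 20% reproductive edge

print(f"frequency of dark moths after 85 generations: {p:.3f}")
```

With these assumed numbers, the dark form rises from 1% to over 98% of the population within 85 generations–the same order of change the historical record shows. A smaller fitness edge simply stretches the timeline.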

Source SUPS.org

So what happens when you completely remove selective pressures from a population?

Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.

Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.
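A fixed doubling time is just exponential growth, which is worth seeing in numbers (the eight founders come from the four pairs mentioned above; the rest is plain arithmetic):

```python
# Exponential growth with a constant doubling time.
def population(n0, days, doubling_time):
    """Population after `days`, starting from n0 individuals and
    doubling every `doubling_time` days."""
    return n0 * 2 ** (days / doubling_time)

# Four founding pairs = 8 mice, doubling every 55 days:
for day in (55, 110, 211):
    print(day, round(population(8, day, 55)))
```

Had the 55-day doubling continued unchecked, the colony would have reached the tens of thousands within two years–so some slowdown was inevitable. The mystery is the collapse to zero, not the deceleration.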

The birth rate continued to decline. As births and deaths reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.


As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.

It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?

Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.

What is genetic load?

As I mentioned before, every time a cell replicates, a certain number of errors–mutations–occur. Occasionally these mutations are useful, but the vast majority of them are not. About 30-50% of pregnancies end in miscarriage (the percent of miscarriages people recognize is lower because embryos often miscarry before causing any overt signs of pregnancy,) and the majority of those miscarriages are caused by genetic errors.

Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.

But it’s only the worst genetic problems that never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.

While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:

Differences in human height is 60–80% heritable, according to several twin studies[19] and has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height.[20] The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.[21]

Obviously most of these genes each plays only a small role in determining overall height (and this is of course holding environmental factors constant.) There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.

The situation with IQ is similar:

Intelligence in the normal range is a polygenic trait, meaning it’s influenced by more than one gene.[3][4]

The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults.[5][6] In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores;[8] however, poor prenatal environment, malnutrition and disease can have deleterious effects.[9][10]

And from a recent article published in Nature Genetics, Genome-wide association meta-analysis of 78,308 individuals identifies new loci and genes influencing human intelligence:

Despite intelligence having substantial heritability2 (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered3, 4, 5. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10−8) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10−6), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10−6). Despite the well-known difference in twin-based heritability2 for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (rg = 0.89, LD score regression P = 5.4 × 10−29). These findings provide new insight into the genetic architecture of intelligence.

The more genes influence a trait, the harder they are to identify without extremely large studies, because any small group of people might not even have the same set of relevant genes.

High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.

Like height, most of the genetic variation in IQ is the combined result of many genes. We’ve definitely found some mutations that result in abnormally low IQ, but so far we have yet (AFAIK) to find any genes that produce an IQ equivalent of gigantism. In other words, low (genetic) IQ is caused by genetic load–Small Yet Important Genetic Differences Between Highly Intelligent People and General Population:

The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.

The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.

Maternal mortality rates over time, UK data

Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ also have terrible side effects, like autism, Tay-Sachs, and torsion dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventional high IQ has a cost: increased maternal mortality (larger heads).

Wikipedia defines genetic load as:

the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall.[5] Most mutations are deleterious, and occur at a high rate.

There’s math, if you want it.
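For those who do want the math, the core of it fits in a few lines. Here is a sketch in standard population-genetics notation–these are textbook formulas, not something taken from the post itself:

```latex
% Genetic load: the fitness gap between the average genotype (\bar{w})
% and the best or theoretically optimal genotype (w_{\max}).
L = \frac{w_{\max} - \bar{w}}{w_{\max}}

% Mutation-selection balance: a deleterious allele that arises by
% mutation at rate \mu and is removed by selection of strength s
% settles at an equilibrium frequency of roughly
\hat{q} \approx \frac{\mu}{s}
\qquad \text{(or } \hat{q} \approx \sqrt{\mu/s} \text{ if fully recessive)}
```

In words: mutations keep trickling in, selection keeps weeding them out, and the population settles wherever the two rates balance.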

Normally, genetic mutations are removed from the population at a rate determined by how bad they are. Really bad mutations kill you instantly, and so are never born. Slightly less bad mutations might survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, but result in having slightly fewer children than your neighbors. Over many generations, this mutation will eventually disappear.

(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)

Jakubany is a town in the Carpathian Mountains

Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:

We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.

The graph’s creator states:

When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118 year period, of the 13306 children who were born, 2958 died (~22 %) before reaching the age of one.

Historical infant mortality rates can be difficult to calculate in part because they were so high, people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:

Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.[12]

Priceonomics: Why life expectancy is misleading

In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)

Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).

Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates of infant mortality rates between 30 and 50%. I also recall a study on Anasazi nutrition, which I sadly can’t locate right now, that found 100% malnutrition rates among adults (based on enamel hypoplasias) and 50% infant mortality.

As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60 year old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60 year old could expect to live to about 80.

Back in his post on Mousetopia, Charlton writes:

Michael A Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …

The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …

Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.

However, in Mouse Utopia, there is no predation and all the other causes of mortality (e.g. starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.

Historically speaking, another selective factor operated on humans: while about 67% of women reproduced, only 33% of men did. By contrast, according to Psychology Today, a majority of today’s men have or will have children.

Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.


Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data.) Today, infant mortality is almost non-existent, but voluntary childlessness has now inverted this phenomenon:

Yes, the percent of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with postgraduate degrees have 4 or more children, 26% of those who never graduated from high school have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.

In other words, the strongest remover of genetic load–infant mortality–has all but disappeared; populations with higher load (lower IQ) are having more children than populations with lower load; and everyone is having children later, which also increases genetic load.

Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.

Infant mortality claims half their children, taking the least healthy. They are left with 4 bright children and 2 moderately intelligent children. The four bright children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the two moderately intelligent children do okay and end up with a couple of children each.

On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.

Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.

Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and 3 children by three different partners.

Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”

People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because we all make random copying errors when replicating our DNA.

I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.

Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.
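The argument can be caricatured in a few lines of code. This is a toy model – the litter sizes, mutation counts, and 50% survival cutoff are all invented for illustration, not taken from any of the sources above – but it shows the qualitative difference between a population filtered by heavy childhood mortality and one where everyone survives:

```python
import random

def generation(parents, n_offspring, survival_fraction):
    """Each offspring inherits its parent's load plus 0-4 new mutations;
    if selection operates, only the least-loaded fraction survives."""
    offspring = []
    for load in parents:
        for _ in range(n_offspring):
            offspring.append(load + random.randint(0, 4))
    offspring.sort()
    survivors = offspring[:max(1, int(len(offspring) * survival_fraction))]
    random.shuffle(survivors)
    return survivors[:len(parents)]  # hold population size constant

random.seed(42)
pop_selected = [0] * 100  # heavy infant mortality: 6 kids, half survive
pop_relaxed = [0] * 100   # modern conditions: 2 kids, everyone survives
for _ in range(50):
    pop_selected = generation(pop_selected, 6, 0.5)
    pop_relaxed = generation(pop_relaxed, 2, 1.0)

mean = lambda xs: sum(xs) / len(xs)
print(f"Mean load after 50 generations - selected: {mean(pop_selected):.1f}, "
      f"relaxed: {mean(pop_relaxed):.1f}")
```

Run it and the relaxed population ends up carrying substantially more load than the selected one; the exact numbers are meaningless, but the direction of the effect is the whole point of the Woodley/Charlton argument.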

(Remember, all mental traits are heritable–so genetic load influences everything, not just controversial ones like IQ.)

If all of the above is correct, then I see only 4 ways out:

  1. Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
  2. Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
  3. Abortion of high load fetuses
  4. Genetic engineering

#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #3, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.

Recent Discoveries in Human Evolution: H. Sapiens 300,000 years old?

Welcome back to our discussion of recent exciting advances in our knowledge of human evolution:

  • Ancient hominins in the US?
  • Homo naledi
  • Homo flores
  • Humans evolved in Europe?
  • In two days, first H Sap was pushed back to 260,000 years,
  • then to 300,000 years!
  • Bell beaker paper

As we’ve been discussing for the past couple of weeks, the exact dividing line between “human” and “non-human” isn’t always hard and fast. The very first Homo species, such as Homo habilis, undoubtedly had more in common with its immediate Australopithecine ancestors than with today’s modern humans, 3 million years later, but that doesn’t mean these dividing lines are meaningless. Homo sapiens and Homo neanderthalensis, while considered different species, interbred and produced fertile offspring (most non-Africans have 3-5% Neanderthal DNA as a result of these pairings); by contrast, humans and chimps cannot produce fertile offspring, in part because they have different numbers of chromosomes. The genetic distance between the two groups is just too great.

Oldowan tool

The grouping of ancient individuals into Homo or not-Homo, Erectus or Habilis, Sapiens or not, is partly based on physical morphology–what they looked like, how they moved–and partly based on culture, such as the ability to make tools or control fire. While australopithecines made some stone tools (and chimps can make tools out of twigs to retrieve tasty termites from nests), Homo habilis (“handy man”) was the first to master the art and produce large numbers of more sophisticated tools for different purposes, such as this Oldowan chopper.

But we also group species based on moral or political beliefs–scientists generally believe it would be immoral to say that different modern human groups belong to different species, and so the date when Homo ergaster transforms into Homo sapiens is dependent on the date when the most divergent human groups alive today split apart–no one wants to come up with a finding that will get trumpeted in media as “Scientists Prove Pygmies aren’t Human!” (Pygmies already have enough problems, what with their immediate neighbors actually thinking they aren’t human and using their organs for magic rituals.)

(Of course they would still be human even if they were part of an ancient lineage.)

But if an ecologically-minded space alien arrived on earth back in 1490 and was charged with documenting terrestrial species, it might easily decide–based on morphology, culture, and physical distribution–that there were several different Homo “species” which all deserve to be preserved.

But we are not space aliens, and we have the concerns of our own day.

So when a paper was published last year on archaic admixture in Pygmies and the Pygmy/Bushmen/everyone else split, West Hunter noted the authors used a fast–but discredited–estimate of mutation rate to avoid the claim that Pygmies split off 300,000 years ago, 100,000 years before the emergence of Homo sapiens:

There are a couple of recent papers on introgression from some quite divergent archaic population into Pygmies (this also looks to be the case with Bushmen). Among other things, one of those papers discussed the time of the split between African farmers (Bantu) and Pygmies, as determined from whole-genome analysis and the mutation rate. They preferred to use the once-fashionable rate of 2.5 × 10^-8 per-site per-generation (based on nothing), instead of the new pedigree-based estimate of about 1.2 × 10^-8 (based on sequencing parents and child: new stuff in the kid is mutation). The old fast rate indicates that the split between Neanderthals and modern humans is much more recent than the age of early Neanderthal-looking skeletons, while the new slow rate fits the fossil record – so what’s to like about the fast rate? Thing is, using the slow rate, the split time between Pygmies and Bantu is ~300k years ago – long before any archaeological sign of behavioral modernity (however you define it) and well before the first known fossils of AMH (although that shouldn’t bother anyone, considering the raggedness of the fossil record).

This was a good catch. (Here is the relevant Dienekes article, plus Model-based analyses of whole-genome data reveal a complex evolutionary history involving archaic introgression in Central African Pygmies, and Whole-genome sequence analyses of Western Central African Pygmy hunter-gatherers reveal a complex demographic history and identify candidate genes under positive natural selection.) If the slow mutation rate matches the fossil record better than the fast, why use the fast–except if the fast gives you inconvenient results?
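The scaling behind that catch is simple: for a fixed observed divergence, the inferred split time is inversely proportional to the assumed mutation rate (t ≈ d / 2μ generations). A quick illustration – the 144,000-year starting figure is invented to make the ratio concrete, not taken from the papers:

```python
# Split-time estimates scale as 1/mu for a fixed observed divergence,
# since t_generations = d / (2 * mu).
mu_fast = 2.5e-8  # once-fashionable rate, per site per generation
mu_slow = 1.2e-8  # pedigree-based rate
ratio = mu_fast / mu_slow
print(f"Slow-rate dates are {ratio:.2f}x older")  # ~2.08x

t_fast_years = 144_000  # hypothetical fast-rate date for some split
print(f"Same data under the slow rate: {t_fast_years * ratio:,.0f} years")
```

Which is exactly the pattern here: swap the fast rate for the slow one and a ~150k-year split becomes a ~300k-year split.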

But now we have another finding, based on the Bushmen, which also pushes the Bushmen/everyone else split back further than 200,000 years–from BioRxiv, “Ancient genomes from southern Africa pushes modern human divergence beyond 260,000 years ago”:

Southern Africa is consistently placed as one of the potential regions for the evolution of Homo sapiens. To examine the region’s human prehistory prior to the arrival of migrants from East and West Africa or Eurasia in the last 1,700 years, we generated and analyzed genome sequence data from seven ancient individuals from KwaZulu-Natal, South Africa. Three Stone Age hunter-gatherers date to ~2,000 years ago, and we show that they were related to current-day southern San groups such as the Karretjie People. Four Iron Age farmers (300-500 years old) have genetic signatures similar to present day Bantu-speakers. The genome sequence (13x coverage) of a juvenile boy from Ballito Bay, who lived ~2,000 years ago, demonstrates that southern African Stone Age hunter-gatherers were not impacted by recent admixture; however, we estimate that all modern-day Khoekhoe and San groups have been influenced by 9-22% genetic admixture from East African/Eurasian pastoralist groups arriving >1,000 years ago, including the Ju|’hoansi San, previously thought to have very low levels of admixture. Using traditional and new approaches, we estimate the population divergence time between the Ballito Bay boy and other groups to beyond 260,000 years ago.

260,000 years! Looks like West Hunter was correct, and we should be looking at the earlier Pygmy divergence date, too.

Two days later, a paper from the opposite end of Africa appeared in Nature which–potentially–pushes H. sapiens’s emergence to 300,000 years ago, “New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens”:

Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure … In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens1 or evolved gradually over the last 400 thousand years2. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating)3, this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established.

Comparison of the skulls of a Jebel Irhoud human (left) and a modern human (right) (NHM London)

Hublin–one of the study’s coauthors–notes that between 330,000 and 300,000 years ago, the Sahara was green and animals could range freely across it.

While the Moroccan fossils do look like modern H sapiens, they also still look a lot like pre-sapiens, and the matter is still up for debate. Paleoanthropologist Chris Stringer suggests that we should consider all of our ancestors after the Neanderthals split off to be Homo sapiens, which would make our species 500,000 years old. Others would undoubtedly prefer to use a more recent date, arguing that the physical and cultural differences between 500,000 year old humans and today’s people are too large to consider them one species.

According to the Atlantic:

[The Jebel Irhoud] people had very similar faces to today’s humans, albeit with slightly more prominent brows. But the backs of their heads were very different. Our skulls are rounded globes, but theirs were lower on the top and longer at the back. If you saw them face on, they could pass for a modern human. But if they turned around, you’d be looking at a skull that’s closer to extinct hominids like Homo erectus. “Today, you wouldn’t be able to find anyone with a braincase that shape,” says Gunz.

Their brains, though already as large as ours, must also have been shaped differently. It seems that the size of the human brain had already been finalized 300,000 years ago, but its structure—and perhaps its abilities—were fine-tuned over the subsequent millennia of evolution.

No matter how we split it, these are exciting days in the field!

No, Graecopithecus does not prove humans evolved in Europe

Hello! We’re in the midst of a series of posts on recent exciting news in the field of human evolution:

  • Ancient hominins in the US?
  • Homo naledi
  • Homo flores
  • Humans evolved in Europe?
  • In two days, first H Sap was pushed back to 260,000 years,
  • then to 300,000 years!
  • Bell beaker paper

Today we’re discussing the much-publicized claim that scientists have discovered that humans evolved in Europe. (If you haven’t read last week’s post on Homo naledi and flores, I encourage you to do so first.) The way reporters have framed their headlines about the recent Graecopithecus freybergi findings is itself a tale:

The Telegraph proclaimed, “Europe was the birthplace of mankind, not Africa, scientists find,” Newsweek similarly trumpeted, “First Human Ancestor Came from Europe Not Africa,” and CBS News stated, “Controversial study suggests earliest humans lived in Europe – not Africa.”

The Conversation more prudently inquired, “Did humans evolve in Europe rather than Africa? ” and NewScientist and the Washington Post, in a burst of knowing what a “human” is, stated, “Our common ancestor with chimps may be from Europe, not Africa” and “Ape that lived in Europe 7 million years ago could be human ancestor,” respectively.

This all occasioned some very annoying conversations along the lines of “White skin tone couldn’t possibly have evolved within the past 20,000 years because humans evolved in Europe! Don’t you know anything about science?”

Ohkay. Let’s step back a moment and take a look at what Graecopithecus is and what it isn’t.

This is Graecopithecus:

I think there is a second jawbone, but that’s basically it–and that’s not six teeth, that’s three teeth, shown from two different perspectives. There’s no skull, no shoulder blades, no pelvis, no legs.


By contrast, here are Lucy, the famous Australopithecus from Ethiopia, and a sample of the over 1,500 bones and pieces of Homo naledi recently recovered from a cave in South Africa.

Now, given what little scientists had to work with, the fact that they managed to figure out anything about Graecopithecus is quite impressive. The study, reasonably titled “Potential hominin affinities of Graecopithecus from the Late Miocene of Europe,” by Jochen Fuss, Nikolai Spassov, David R. Begun, and Madelaine Böhme, used μCT and 3D reconstructions of the jawbones and teeth to compare Graecopithecus’s teeth to those of other apes. They decided the teeth were different enough to distinguish Graecopithecus from the nearby but older Ouranopithecus, while looking more like hominin teeth:

G. freybergi uniquely shares p4 partial root fusion and a possible canine root reduction with this tribe and therefore, provides intriguing evidence of what could be the oldest known hominin.

My hat’s off to the authors, but not to all of the reporters who dressed up “teeth look kind of like hominin teeth” as “Humans evolved in Europe!”

First of all, you cannot make that kind of jump based off of two jawbones and a handful of teeth. Many of the hominin species we have recovered–such as Homo naledi and Homo floresiensis, as you know if you already read the previous post–possessed a mosaic of “ape-like” and “human-like” traits, i.e.:

The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.[2]

Nebraska Man teeth compared to chimps, Homo erectus, and modern humans

If we only had six Homo naledi bones instead of 1,500 of them, we might be looking only at the part that looks like an Australopithecus instead of the parts that look like H. erectus or totally novel. You simply cannot make that kind of claim off a couple of jawbones. You’re far too likely to be wrong, and then not only will you end up with egg on your face, but you’ll only be giving more fuel to folks who like to proclaim that “Nebraska Man turned out to be a pig!”:

In February 1922, Harold Cook wrote to Dr. Henry Osborn to inform him of the tooth that he had had in his possession for some time. The tooth had been found years prior in the Upper Snake Creek beds of Nebraska along with other fossils typical of North America. … Osborn, along with Dr. William D. Matthew soon came to the conclusion that the tooth had belonged to an anthropoid ape. They then passed the tooth along to William K. Gregory and Dr. Milo Hellman who agreed that the tooth belonged to an anthropoid ape more closely related to humans than to other apes. Only a few months later, an article was published in Science announcing the discovery of a manlike ape in North America.[1] An illustration of H. haroldcookii was done by artist Amédée Forestier, who modeled the drawing on the proportions of “Pithecanthropus” (now Homo erectus), the “Java ape-man,” for the Illustrated London News. …

Examinations of the specimen continued, and the original describers continued to draw comparisons between Hesperopithecus and apes. Further field work on the site in the summers of 1925 and 1926 uncovered other parts of the skeleton. These discoveries revealed that the tooth was incorrectly identified. According to these discovered pieces, the tooth belonged neither to a man nor an ape, but to a fossil of an extinct species of peccary called Prosthennops serus.

That basically sums up everything I learned about human evolution in high school.


Scientists define “humans” as members of the genus Homo, which emerged around 3 million years ago. These are the guys with funny names like Homo habilis, Homo neanderthalensis, and the embarrassingly named Homo erectus. The genus also includes ourselves, Homo sapiens, who emerged around 200-300,000 years ago.

Homo habilis descended from an Australopithecus, perhaps Lucy herself. Australopithecines are not in the Homo genus; they are not “human,” though they are more like us than modern chimps and bonobos are. They evolved around 4 million years ago.

The Australopithecines evolved, in turn, from even older apes, such as–maybe–Ardipithecus (4-6 million years ago) or Sahelanthropus tchadensis.

Regardless, humans didn’t evolve 7 million years ago. Sahelanthropus and even Lucy do not look like anyone you would call “human.” Humans have only been around for about 3 million years, and our own specific species is only about 300,000 years old. Even if Graecopithecus turns out to be the missing link–the true ancestor of both modern chimps and modern humans–that still does not change where humans evolved, because Graecopithecus narrowly missed being a human by 4 million years.

If you want to challenge the Out of Africa narrative, I think you’d do far better arguing for a multi-regional model of human evolution that includes back-migration of H. erectus into Africa and interbreeding with hominins there as spurring the emergence of H. sapiens than arguing about a 7 million year old jawbone. (I just made that up, by the way. It has no basis in anything I have read. But it at least has the right characters, in the right time frame, in a reasonable situation.)

Sorry this was a bit of a rant; I am just rather passionate about the subject. Next time we’ll examine very exciting news about Bushmen and Pygmy DNA!