North Africa in Genetics and History

detailed map of African and Middle Eastern ethnicities in Haak et al’s dataset

North Africa is an often misunderstood region in human genetics. Since it is in Africa, people often assume that it contains the same variety of people referenced in terms like “African Americans,” “black Africans,” or even just “Africans.” In reality, the African continent contains members of all three of the great human clades–Sub-Saharan Africans in the south, Austronesians (Asian clade) in Madagascar, and Caucasians in the north.

The North African Middle Stone Age and its place in recent human evolution provides an overview of the first 275,000 years of humanity’s history in the region (300,000–25,000 years ago, more or less), including the development of symbolic culture and early human dispersal. Unfortunately the paper is paywalled.

Throughout most of human history, the Sahara–not the Mediterranean or Red seas–has been the biggest local impediment to human migration–thus North Africans are much closer, genetically, to their neighbors in Europe and the Middle East than their neighbors across the desert (and before the domestication of the camel, about 3,000 years ago, the Sahara was even harder to cross.)

But from time to time, global weather patterns change and the Sahara becomes a garden: the Green Sahara. The last time we had a Green Sahara was about 9,000–7,000 years ago; during this time, people lived, hunted, fished, herded and perhaps farmed throughout areas that are today nearly uninhabited wastes.

The Peopling of the last Green Sahara revealed by high-coverage resequencing of trans-Saharan patrilineages sheds light on how the Green (and subsequently brown) Sahara affected the spread (and separation) of African groups into northern and sub-Saharan populations:

In order to investigate the role of the last Green Sahara in the peopling of Africa, we deep-sequence the whole non-repetitive portion of the Y chromosome in 104 males selected as representative of haplogroups which are currently found to the north and to the south of the Sahara. … We find that the coalescence age of the trans-Saharan haplogroups dates back to the last Green Sahara, while most northern African or sub-Saharan clades expanded locally in the subsequent arid phase. …

Our findings suggest that the Green Sahara promoted human movements and demographic expansions, possibly linked to the adoption of pastoralism. Comparing our results with previously reported genome-wide data, we also find evidence for a sex-biased sub-Saharan contribution to northern Africans, suggesting that historical events such as the trans-Saharan slave trade mainly contributed to the mtDNA and autosomal gene pool, whereas the northern African paternal gene pool was mainly shaped by more ancient events.

In other words, modern North Africans have some maternal (female) Sub-Saharan DNA that arrived recently via the Islamic slave trade, but most of their Sub-Saharan Y-DNA (male) is much older, hailing from the last time the Sahara was easy to cross.

Note that not much DNA is shared across the Sahara:

After the African humid period, the climatic conditions became rapidly hyper-arid and the Green Sahara was replaced by the desert, which acted as a strong geographic barrier against human movements between northern and sub-Saharan Africa.

A consequence of this is that there is a strong differentiation in the Y chromosome haplogroup composition between the northern and sub-Saharan regions of the African continent. In the northern area, the predominant Y lineages are J-M267 and E-M81, with the former being linked to the Neolithic expansion in the Near East and the latter reaching frequencies as high as 80% in some north-western populations as a consequence of a very recent local demographic expansion [8–10]. On the contrary, sub-Saharan Africa is characterised by a completely different genetic landscape, with lineages within E-M2 and haplogroup B comprising most of the Y chromosomes. In most regions of sub-Saharan Africa, the observed haplogroup distribution has been linked to the recent (~ 3 kya) demic diffusion of Bantu agriculturalists, which brought E-M2 sub-clades from central Africa to the East and to the South [11–17]. On the contrary, the sub-Saharan distribution of B-M150 seems to have more ancient origins, since its internal lineages are present in both Bantu farmers and non-Bantu hunter-gatherers and coalesce long before the Bantu expansion [18–20].

In spite of their genetic differentiation, however, northern and sub-Saharan Africa share at least four patrilineages at different frequencies, namely A3-M13, E-M2, E-M78 and R-V88.

A recent article in Nature, “Whole Y-chromosome sequences reveal an extremely recent origin of the most common North African paternal lineage E-M183 (M81),” tells some of North Africa’s fascinating story:

Here, by using whole Y chromosome sequences, we intend to shed some light on the historical and demographic processes that modelled the genetic landscape of North Africa. Previous studies suggested that the strategic location of North Africa, separated from Europe by the Mediterranean Sea, from the rest of the African continent by the Sahara Desert and limited to the East by the Arabian Peninsula, has shaped the genetic complexity of current North Africans15,16,17. Early modern humans arrived in North Africa 190–140 kya (thousand years ago)18, and several cultures settled in the area before the Holocene. In fact, a previous study by Henn et al.19 identified a gradient of likely autochthonous North African ancestry, probably derived from an ancient “back-to-Africa” gene flow prior to the Holocene (12 kya). In historic times, North Africa has been populated successively by different groups, including Phoenicians, Romans, Vandals and Byzantines. The most important human settlement in North Africa was conducted by the Arabs by the end of the 7th century. Recent studies have demonstrated the complexity of human migrations in the area, resulting from an amalgam of ancestral components in North African groups15,20.

According to the article, E-M81 is dominant in Northwest Africa and absent almost everywhere else in the world.

The authors tested various men across North Africa in order to draw up a phylogenetic tree of the branching of E-M183:

The distribution of each subhaplogroup within E-M183 can be observed in Table 1 and Fig. 2. Indeed, different populations present different subhaplogroup compositions. For example, whereas in Morocco almost all subhaplogroups are present, Western Sahara shows a very homogeneous pattern with only E-SM001 and E-Z5009 being represented. A similar picture to that of Western Sahara is shown by the Reguibates from Algeria, which contrast sharply with the Algerians from Oran, which showed a high diversity of haplogroups. It is also worth to notice that a slightly different pattern could be appreciated in coastal populations when compared with more inland territories (Western Sahara, Algerian Reguibates).

Overall, the authors found that the haplotypes were “strikingly similar” to each other and showed little geographic structure besides the coastal/inland differences:

As proposed by Larmuseau et al.25, the scenario that better explains Y-STR haplotype similarity within a particular haplogroup is a recent and rapid radiation of subhaplogroups. Although the dating of this lineage has been controversial, with dates proposed ranging from Paleolithic to Neolithic and to more recent times17,22,28, our results suggested that the origin of E-M183 is much more recent than was previously thought. … In addition to the recent radiation suggested by the high haplotype resemblance, the pattern showed by E-M183 imply that subhaplogroups originated within a relatively short time period, in a burst similar to those happening in many Y-chromosome haplogroups23.

In other words, someone went a-conquering.

Alternatively, given the high frequency of E-M183 in the Maghreb, a local origin of E-M183 in NW Africa could be envisaged, which would fit the clear pattern of longitudinal isolation by distance reported in genome-wide studies15,20. Moreover, the presence of autochthonous North African E-M81 lineages in the indigenous population of the Canary Islands, strongly points to North Africa as the most probable origin of the Guanche ancestors29. This, together with the fact that the oldest indigenous individuals have been dated 2210 ± 60 ya, supports a local origin of E-M183 in NW Africa. Within this scenario, it is also worth to mention that the paternal lineage of an early Neolithic Moroccan individual appeared to be distantly related to the typically North African E-M81 haplogroup30, suggesting again a NW African origin of E-M183. A local origin of E-M183 in NW Africa > 2200 ya is supported by our TMRCA estimates, which can be taken as 2,000–3,000, depending on the data, methods, and mutation rates used.

However, the authors also note that they can’t rule out a Middle Eastern origin for the haplogroup since their study simply doesn’t include genomes from Middle Eastern individuals. They rule out a spread during the Neolithic expansion (too early) but not the Islamic expansion (“an extensive, male-biased Near Eastern admixture event is registered ~1300 ya, coincidental with the Arab expansion20.”) Alternatively, they suggest E-M183 might have expanded near the end of the Third Punic War. Sure, Carthage (in Tunisia) was defeated by the Romans, but the era was otherwise one of great North African wealth and prosperity.
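As a rough back-of-the-envelope illustration of how TMRCA figures like “2,000–3,000 years” arise (this is not the authors’ actual pipeline): the age of a clade can be approximated by dividing the average number of derived mutations per sampled lineage by the expected number of mutations per year across the sequenced region. Every number below is a placeholder assumption, not data from the study.

```python
# Toy TMRCA estimate for a Y-chromosome clade: average number of derived
# mutations per sampled lineage divided by the expected mutations per year.
# All numbers below are hypothetical placeholders, not the study's data.

mutation_rate_per_site_per_year = 0.76e-9   # one commonly cited Y-SNP rate (assumption)
sequenced_sites = 8.5e6                     # size of the analysed Y region (assumption)
avg_derived_mutations = 17                  # mean mutations from root to sampled tips (made up)

mutations_per_year = mutation_rate_per_site_per_year * sequenced_sites
tmrca_years = avg_derived_mutations / mutations_per_year
print(round(tmrca_years))   # ~2,630 years with these placeholder inputs
```

The point of the sketch is only that the resulting date scales directly with the assumed mutation rate and region size, which is why the authors give a range rather than a single year.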

 

Interesting papers! My hat’s off to the authors. I hope you enjoyed them and get a chance to RTWT.

Anthropology Friday: Numbers and the Making of Us, by Caleb Everett pt. 4

Yes, but which 25% of us is grape?

Welcome to our final post on Numbers and the Making of Us: Counting and the Course of Human Cultures, by Caleb Everett. Today I just want to highlight a few interesting passages.

On DNA:

For example, there is about 25% overlap between the human genome and that of grapes. (And we have fewer genes than grapes!) So some caution should be exercised before reading too much into percentages of genomic correspondence across species. I doubt, after all, that you consider yourself one-quarter grape. … canine and bovine species generally exhibit about an 85% rate of genomic correspondence with humans. … small changes in genetic makeup can, among other influences, lead to large changes in brain size.

On the development of numbers:

Babylonian math homework

After all, for the vast majority of our species’ existence, we lived as hunters and gatherers in Africa … A reasonable interpretation of the contemporary distribution of cultural and number-system types, then, is that humans did not rely on complex number system for the bulk of their history. We can also reasonably conclude that transitions to larger, more sedentary, and more trade-based cultures helped pressure various groups to develop more involved numerical technologies. … Written numerals, and writing more generally, were developed first in the Fertile Crescent after the agricultural revolution began there. … These pressures ultimately resulted in numerals and other written symbols, such as the clay-token based numerals … The numerals then enabled new forms of agriculture and trade that required the exact discrimination and representation of quantities. The ancient Mesopotamian case is suggestive, then, of the motivation for the present-day correlation between subsistence and number types: larger agricultural and trade-based economies require numerical elaboration to function. …

Intriguingly, though, the same may be true of Chinese writing, the earliest samples of which date to the Shang Dynasty and are 3,000 years old. The most ancient of these samples are oracle bones. These bones were inscribed with numerals quantifying such items as enemy prisoners, birds and animals hunted, and sacrificed animals. … Ancient writing around the world is numerically focused.

Changes in the Jungle as population growth makes competition for resources more intense and forces people out of their traditional livelihoods:

Consider the case of one of my good friends, a member of an indigenous group known as the Karitiana. … Paulo spent the majority of his childhood, in the 1980s and 1990s in the largest village of his people’s reservation. … While some Karitiana sought to make a living in nearby Porto Velho, many strived to maintain their traditional way of life on their reservation. At the time this was feasible, and their traditional subsistence strategies of hunting, gathering, and horticulture could be realistically practiced. Recently, however, maintaining their conventional way of life has become a less tenable proposition. … many Karitiana feel they have little choice but to seek employment in the local Brazilian economy… This is certainly true of Paulo. He has been enrolled in Brazilian schools for some time, has received some higher education, and is currently employed by a governmental organization. To do these things, of course, Paulo had to learn Portuguese grammar and writing. And he had to learn numbers and math, also. In short, the socioeconomic pressures he has felt to acquire the numbers of another culture are intense.

Everett cites a statistic that >90% of the world’s approximately 7,000 languages are endangered.

They are endangered primarily because people like Paulo are being conscripted into larger nation-states, gaining fluency in more economically viable languages. … From New Guinea to Australia to Amazonia and elsewhere, the mathematizing of people is happening.

On the advantages of different number systems:

Recent research also suggests that the complexity of some non-linguistic number systems has been underappreciated. Many counting boards and abaci that have been used, and are still in use across the world’s cultures, present clear advantages to those using them … the abacus presents some cognitive advantages. That is because, research now suggests, children who are raised using the abacus develop a “mental abacus” with time. … According to recent cross-cultural findings, practitioners of abacus-based mathematical strategies outperform those unfamiliar with such strategies, at least in some mathematical tasks. The use of the Soroban abacus has, not coincidentally, now been adopted in many schools throughout Asia.

The zero is a dot in the middle of the photo–earliest known zero, Cambodia

I suspect these higher math scores are more due to the mental abilities of the people using the abacus than the abacus itself. I have also just ordered an abacus.

… in 2015 the world’s oldest known unambiguous inscription of a circular zero was rediscovered in Cambodia. The zero in question, really a large dot, serves as a placeholder in the ancient Khmer numeral for 605. It is inscribed on a stone tablet, dating to 683 CE, that was found only kilometers from the faces of Bayon and other ruins of Angkor Wat and Angkor Thom. … the Maya also developed a written form for zero, and the Inca encoded the concept in their Quipu.

In 1202, Fibonacci wrote the Book of Calculation, which promoted the use of the superior Arabic (yes Hindu) numerals (zero included) over the old Roman ones. Just as the introduction of writing jump-started the Cherokee publishing industry, so the introduction of superior numerals probably helped jump-start the Renaissance.

Cities and the rise of organized religion:

…although creation myths, animistic practices, and other forms of spiritualism are universal or nearly universal, large-scale hierarchical religions are restricted to relatively few cultural lineages. Furthermore, these religions… developed only after people began living in larger groups and settlements because of their agricultural lifestyles. … A phalanx of scholars has recently suggested that the development of major hierarchical religions, like the development of hierarchical governments, resulted from the agglomeration of people in such places. …

Organized religious beliefs, with moral-enforcing deities and priest castes, were a by-product of the need for large groups of people to cooperate via shared morals and altruism. As the populations of cultures grew after the advent of agricultural centers… individuals were forced to rely on shared trust with many more individuals, including non-kin, than was or is the case in smaller groups like bands or tribes. … Since natural selection is predicated on the protection of one’s genes, in-group altruism and sacrifice are easier to make sense of in bands and tribes. But why would humans in much larger populations–humans who have no discernible genetic relationship… cooperate with these other individuals in their own culture? … some social mechanism had to evolve so that larger cultures would not disintegrate due to competition among individuals and so that many people would not freeload off the work of others. One social mechanism that fosters prosocial and cooperative behavior is an organized religion based on shared morals and omniscient deities capable of keeping track of the violation of such morals. …

When Moses descended from Mt. Sinai with his stone tablets, they were inscribed with ten divine moral imperatives. … Why ten? … Here is an eleventh commandment that could likely be uncontroversially adopted by many people: “thou shalt not torture.” … But then the list would appear to lose some of its rhetorical heft. “Eleven commandments” almost hints of a satirical deity.

Technically there are 613 commandments, but that’s not nearly as catchy as the Ten Commandments–inadvertently proving Everett’s point.

Overall, I found this book frustrating and repetitive, but there were some good parts. I’ve left out most of the discussion of the Piraha and similar cultures, and the rather fascinating case of Nicaraguan homesigners (“homesigners” are deaf people who were never taught a formal sign language but made up their own.) If you’d like to learn more about them, you might want to look up the book at your local library.

Favorite Things Redux: Beringian DNA

Map of gene-flow in and out of Beringia, from 25,000 years ago to present

Scientists have long believed that the first humans made it to the Americas by crossing from now-Russia to now-Alaska. When and how they did it–by boat or by foot–remain matters of contentious debate. Did people move quickly through Alaska and into the rest of North America, or did they hover–as the “Bering standstill” hypothesis suggests–in Beringia (or the Aleutian Islands) for thousands of years?

Archaeologists working at the Upward Sun River site (approximately in the middle of Alaska) recently uncovered the burials of three children: a cremated three year old, and beneath it, a 6-12 week old infant and a 30 week, possibly premature or stillborn fetus. The three year old has been dubbed “Upward Sun River Mouth Child,” and the 6 week old “Sun-Rise Girl Child.” Since these aren’t really names, I’m going to dub them Sunny (3 yrs old), Rosy (6 weeks), and Hope (fetus).

They died around 11,500 years ago, making them the oldest burials so far from northern North America. Rosy and Hope were probably girls; cremation rendered Sunny’s gender a mystery. Rosy and Hope were covered in red ocher and buried together, accompanied by four decorated antler rods, two dart points and two stone axes. (Here’s an illustration of their burial.) The site where the children were buried was abandoned soon after Sunny’s death–perhaps their parents were too sad to stay, or perhaps the location was just too harsh.

Rosy and Hope were well enough preserved to yield DNA.

Surprisingly, they weren’t sisters. Rosy’s mother’s mtDNA hailed from haplogroup C1b, which is found only in the Americas (though its ancestral clade, haplogroup C, is found throughout Siberia.) Hope’s mtDNA is from haplogroup B2, which is also only found in the Americas. Oddly, B2’s parent clade, B, isn’t common in Siberia–it’s much more common in places like Vietnam, Laos, the Philippines, and Saipan. It’s not entirely absent from Siberia, but how it got to Alaska without leaving a larger trail remains a mystery.

Since they are found in the Americas but not Asia, we know these lineages most likely evolved over here; the main questions are when and where. If the Bering Standstill hypothesis is correct and the Indians spent 10-20,000 years stranded in Beringia, they would have had plenty of time to evolve new lineages while still in Alaska. By contrast, if they crossed relatively quickly and then dispersed, these new lineages would have had much less time to emerge, and we would expect them to show up as people moved south.

Source: Ancient Beringians: A Discovery Changing Early Native American History

Or there could have been multiple migration waves, with different haplogroups arriving in different waves. (There were multiple migration waves, but the others occurred well after Sunny and the others were buried.)

In fact, there are five mtDNA lineages found only in the Americas (A2, B2, C1, D1, and X2a.) With Hope and Rosy, we have now identified all five mtDNA lineages in North American burials over 8,000 years old, lending support to the Beringian Standstill hypothesis.

But were the Upward Sun River children’s families ancestral to today’s Native Americans? Not quite.

It looks like Sunny’s tribe split off from the rest of the Beringians (or perhaps the others split off from them) around 22-18,000 years ago. Most of the others headed south, while Sunny’s people stayed in Alaska and disappeared (perhaps because all of their children died.) So Sunny’s tribe was less “grandparent” to today’s Indians and more “great aunt and uncle,” but they still hailed from the same, even older ancestors who first set out from Siberia.

I have previously favored the Aleutian or at least a much more rapid Beringian route, but it looks like I was wrong. I find the idea of the Bering Standstill difficult to believe, but that may just be my own biases. Perhaps people really did get stuck there for thousands of years, waiting for the ice to clear. What amazing people they must have been to survive for so long in so harsh an environment.

2 Interesting studies: Early Humans in SE Asia and Genetics, Relationships, and Mental Illness

Ancient Teeth Push Back Early Arrival of Humans in Southeast Asia :

New tests on two ancient teeth found in a cave in Indonesia more than 120 years ago have established that early modern humans arrived in Southeast Asia at least 20,000 years earlier than scientists previously thought, according to a new study. …

The findings push back the date of the earliest known modern human presence in tropical Southeast Asia to between 63,000 and 73,000 years ago. The new study also suggests that early modern humans could have made the crossing to Australia much earlier than the commonly accepted time frame of 60,000 to 65,000 years ago.

I would like to emphasize that nothing based on a couple of teeth is conclusive, “settled,” or “proven” science. Samples can get contaminated, machines make errors, people play tricks–in the end, we’re looking for the weight of the evidence.

I am personally of the opinion that there were (at least) two ancient human migrations into south east Asia, but only time will tell if I am correct.

Genome-wide association study of social relationship satisfaction: significant loci and correlations with psychiatric conditions, by Varun Warrier, Thomas Bourgeron, Simon Baron-Cohen:

We investigated the genetic architecture of family relationship satisfaction and friendship satisfaction in the UK Biobank. …

In the DSM-5, difficulties in social functioning is one of the criteria for diagnosing conditions such as autism, anorexia nervosa, schizophrenia, and bipolar disorder. However, little is known about the genetic architecture of social relationship satisfaction, and if social relationship dissatisfaction genetically contributes to risk for psychiatric conditions. …

We present the results of a large-scale genome-wide association study of social relationship satisfaction in the UK Biobank measured using family relationship satisfaction and friendship satisfaction. Despite the modest phenotypic correlations, there was a significant and high genetic correlation between the two phenotypes, suggesting a similar genetic architecture between the two phenotypes.

Note: the two “phenotypes” here are “family relationship satisfaction” and “friendship satisfaction.”

We first investigated if the two phenotypes were genetically correlated with psychiatric conditions. As predicted, most if not all psychiatric conditions had a significant negative correlation for the two phenotypes. … We observed significant negative genetic correlation between the two phenotypes and a large cross-condition psychiatric GWAS38. This underscores the importance of social relationship dissatisfaction in psychiatric conditions. …

In other words, people with mental illnesses generally don’t have a lot of friends nor get along with their families.

One notable exception is the negative genetic correlation between measures of cognition and the two phenotypes. Whilst subjective wellbeing is positively genetically correlated with measures of cognition, we identify a small but statistically significant negative correlation between measures of cognition and the two phenotypes.

Are they saying that smart people have fewer friends? Or that dumber people are happier with their friends and families? I think they are clouding this finding in intentionally obtuse language.

A recent study highlighted that people with very high IQ scores tend to report lower satisfaction with life with more frequent socialization.

Oh, I think I read that one. It’s not the socialization per se that’s the problem, but spending time away from the smart person’s intellectual activities. For example, I enjoy discussing the latest genetics findings with friends, but I don’t enjoy going on family vacations because they are a lot of work that does not involve genetics. (This is actually something my relatives complain about.)

…alleles that increase the risk for schizophrenia are in the same haplotype as alleles that decrease friendship satisfaction. The functional consequences of this locus must be formally tested. …

Loss of function mutations in these genes lead to severe biochemical consequences, and are implicated in several neuropsychiatric conditions. For example, de novo loss of function mutations in pLI intolerant genes confers significant risk for autism. Our results suggest that pLI > 0.9 genes contribute to psychiatric risk through both common and rare genetic variation.

When Did Black People Evolve?

In previous posts, we discussed the evolution of Whites and Asians, so today we’re taking a look at people from Sub-Saharan Africa.

Modern humans only left Africa about 100,000 to 70,000 years ago, and split into Asians and Caucasians around 40,000 years ago. Their modern appearances came later–white skin, light hair and light eyes, for example, only evolved in the past 20,000 and possibly within the past 10,000 years.

What about the Africans, or specifically, Sub-Saharans? (North Africans, like Tunisians and Moroccans, are in the Caucasian clade.) When did their phenotypes evolve?

The Sahara, an enormous desert about the size of the United States, is one of the world’s biggest, most ancient barriers to human travel. The genetic split between SSAs and non-SSAs, therefore, is one of the oldest and most substantial among human populations. But there are even older splits within Africa–some of the ancestors of today’s Pygmies and Bushmen may have split off from other Africans 200,000-300,000 years ago. We’re not sure, because the study of archaic African DNA is still in its infancy.

Some anthropologists refer to Bushmen as “gracile,” which means they are a little shorter than average Europeans and not stockily built

The Bushmen present an interesting case, because their skin is quite light (for Africans.) I prefer to call it golden. The nearby Damara of Namibia, by contrast, are one of the world’s darkest peoples. (The peoples of South Sudan, eg Malik Agar, may be darker, though.) The Pygmies are the world’s shortest peoples; the peoples of South Sudan, such as the Dinka and Shiluk, are among the world’s tallest.

Sub-Saharan Africa’s ethnic groups can be grouped, very broadly, into Bushmen, Pygmies, Bantus (aka Niger-Congo), Nilotics, and Afro-Asiatics. Bushmen and Pygmies are extremely small groups, while Bantus dominate the continent–about 85% of Sub Saharan Africans speak a language from the Niger-Congo family. The Afro-Asiatic groups, as their name implies, have had extensive contact with North Africa and the Middle East.

Most of America’s black population hails from West Africa–that is, the primarily Bantu region. The Bantus and similar-looking groups among the Nilotics and Afro-Asiatics (like the Hausa) therefore have both Africa’s most iconic and its most common phenotypes.

For the sake of this post, we are not interested in the evolution of traits common to all humans, such as bipedalism. We are only interested in those traits generally shared by most Sub-Saharans and generally not shared by people outside of Africa.

detailed map of African and Middle Eastern ethnicities in Haak et al’s dataset

One striking trait is black hair: it is distinctively “curly” or “frizzy.” Chimps and gorillas do not have curly hair. Neither do whites and Asians. (Whites and Asians, therefore, more closely resemble chimps in this regard.) Only Africans and a smattering of other equatorial peoples like Melanesians have frizzy hair.

Black skin is similarly distinct. Chimps, who live in the shaded forest and have fur, do not have high levels of melanin all over their bodies. While chimps naturally vary in skin tone, an unfortunate, hairless chimp is practically “white.”

Humans therefore probably evolved both black skin and frizzy hair at about the same time–when we came out of the shady forests and began running around on the much sunnier savannahs. Frizzy hair seems well-adapted to cooling–by standing on end, it lets air flow between the follicles–and of course melanin is protective from the sun’s rays. (And apparently, many of the lighter-skinned Bushmen suffer from skin cancer.)

Steatopygia also comes to mind, though I don’t know if anyone has studied its origins.

According to Wikipedia, additional traits common to Sub-Saharan Africans include:

In modern craniofacial anthropometry, Negroid describes features that typify skulls of black people. These include a broad and round nasal cavity; no dam or nasal sill; Quonset hut-shaped nasal bones; notable facial projection in the jaw and mouth area (prognathism); a rectangular-shaped palate; a square or rectangular eye orbit shape;[21] a large interorbital distance; a more undulating supraorbital ridge;[22] and large, megadontic teeth.[23] …

Modern cross-analysis of osteological variables and genome-wide SNPs has identified specific genes, which control this craniofacial development. Of these genes, DCHS2, RUNX2, GLI3, PAX1 and PAX3 were found to determine nasal morphology, whereas EDAR impacts chin protrusion.[27] …

Ashley Montagu lists “neotenous structural traits in which…Negroids [generally] differ from Caucasoids… flattish nose, flat root of the nose, narrower ears, narrower joints, frontal skull eminences, later closure of premaxillary sutures, less hairy, longer eyelashes, [and] cruciform pattern of second and third molars.”[28]

The Wikipedia page on Dark Skin states:

As hominids gradually lost their fur (between 4.5 and 2 million years ago) to allow for better cooling through sweating, their naked and lightly pigmented skin was exposed to sunlight. In the tropics, natural selection favoured dark-skinned human populations as high levels of skin pigmentation protected against the harmful effects of sunlight. Indigenous populations’ skin reflectance (the amount of sunlight the skin reflects) and the actual UV radiation in a particular geographic area is highly correlated, which supports this idea. Genetic evidence also supports this notion, demonstrating that around 1.2 million years ago there was a strong evolutionary pressure which acted on the development of dark skin pigmentation in early members of the genus Homo.[25]

About 7 million years ago human and chimpanzee lineages diverged, and between 4.5 and 2 million years ago early humans moved out of rainforests to the savannas of East Africa.[23][28] They not only had to cope with more intense sunlight but had to develop a better cooling system. …

Skin colour is a polygenic trait, which means that several different genes are involved in determining a specific phenotype. …

Data collected from studies on MC1R gene has shown that there is a lack of diversity in dark-skinned African samples in the allele of the gene compared to non-African populations. This is remarkable given that the number of polymorphisms for almost all genes in the human gene pool is greater in African samples than in any other geographic region. So, while the MC1R gene does not significantly contribute to variation in skin colour around the world, the allele found in high levels in African populations probably protects against UV radiation and was probably important in the evolution of dark skin.[57][58]

Skin colour seems to vary mostly due to variations in a number of genes of large effect as well as several other genes of small effect (TYR, TYRP1, OCA2, SLC45A2, SLC24A5, MC1R, KITLG and SLC24A4). This does not take into account the effects of epistasis, which would probably increase the number of related genes.[59] Variations in the SLC24A5 gene account for 20–25% of the variation between dark and light skinned populations of Africa,[60] and appear to have arisen as recently as within the last 10,000 years.[61] The Ala111Thr or rs1426654 polymorphism in the coding region of the SLC24A5 gene reaches fixation in Europe, and is also common among populations in North Africa, the Horn of Africa, West Asia, Central Asia and South Asia.[62][63][64]

That’s rather interesting about MC1R. It could imply that the difference in skin tone between SSAs and non-SSAs is due to active selection in Blacks for dark skin and relaxed selection in non-Blacks, rather than active selection for light skin in non-Blacks.

The page on MC1R states:

MC1R is one of the key proteins involved in regulating mammalian skin and hair color. …It works by controlling the type of melanin being produced, and its activation causes the melanocyte to switch from generating the yellow or red phaeomelanin by default to the brown or black eumelanin in replacement. …

This is consistent with active selection being necessary to produce dark skin, and relaxed selection producing lighter tones.

Studies show the MC1R Arg163Gln allele has a high frequency in East Asia and may be part of the evolution of light skin in East Asian populations.[40] No evidence is known for positive selection of MC1R alleles in Europe[41] and there is no evidence of an association between MC1R and the evolution of light skin in European populations.[42] The lightening of skin color in Europeans and East Asians is an example of convergent evolution.

However, we should also note:

Dark-skinned people living in low sunlight environments have been recorded to be very susceptible to vitamin D deficiency due to reduced vitamin D synthesis. A dark-skinned person requires about six times as much UVB than lightly pigmented persons.

PCA graph and map of sampling locations. Modern people are indicated with gray circles.

Unfortunately, most of the work on human skin tones has been done among Europeans (and, oddly, zebra fish,) limiting our knowledge about the evolution of African skin tones, which is why this post has been sitting in my draft file for months. Luckily, though, two recent studies–Loci Associated with Skin Pigmentation Identified in African Populations and Reconstructing Prehistoric African Population Structure–have shed new light on African evolution.

In Reconstructing Prehistoric African Population Structure, Skoglund et al assembled genetic data from 16 prehistoric Africans and compared them to DNA from nearby present-day Africans. They found:

  1. The ancestors of the Bushmen (aka the San/KhoiSan) once occupied a much wider area.
  2. They contributed about 2/3s of the ancestry of ancient Malawi hunter-gatherers (around 8,100-2,500 YA)
  3. Contributed about 1/3 of the ancestry of ancient Tanzanian hunter-gatherers (around 1,400 YA)
  4. Farmers (Bantus) spread from west Africa, completely replacing hunter-gatherers in some areas
  5. Modern Malawians are almost entirely Bantu.
  6. A Tanzanian pastoralist population from 3,100 YA spread out across east Africa and into southern Africa
  7. Bushmen ancestry was not found in modern Hadza, even though they are hunter-gatherers and speak a click language like the Bushmen.
  8. The Hadza more likely derive most of their ancestry from ancient Ethiopians
  9. Modern Bantu-speakers in Kenya derive from a mix between western Africans and Nilotics around 800-400 years ago.
  10. Middle Eastern (Levant) ancestry is found across eastern Africa from an admixture event that occurred around 3,000 YA, or around the same time as the Bronze Age Collapse.
  11. A small amount of Iranian DNA arrived more recently in the Horn of Africa
  12. Ancient Bushmen were more closely related to modern eastern Africans like the Dinka (Nilotics) and Hadza than to modern west Africans (Bantus),
  13. This suggests either complex relationships between the groups or that some Bantus may have had ancestors from an unknown group of humans more ancient than the Bushmen.
  14. Modern Bushmen have been evolving darker skins
  15. Pygmies have been evolving shorter stature
Automated clustering of ancient and modern populations (moderns in gray)

I missed #12-13 on my previous post about this paper, though I did note that the more data we get on ancient African groups, the more likely I think we are to find ancient admixture events. If humans can mix with Neanderthals and Denisovans, then surely our ancestors could have mixed with Ergaster, Erectus, or whomever else was wandering around.

Distribution of ancient Bushmen and Ethiopian DNA in south and east Africa

#15 is interesting, and consistent with the claim that Bushmen suffer from a lot of skin cancer–before the Bantu expansion, they lived in far more forgiving climates than the Kalahari desert. But since Bushmen are already lighter than their neighbors, this raises the question of how light their ancestors–who had no Levantine admixture–were. Could the Bantus’ and Nilotics’ darker skins have evolved after the Bushmen/everyone else split?

Meanwhile, in Loci Associated with Skin Pigmentation Identified in African Populations, Crawford et al used genetic samples from 1,570 people from across Africa to find six genetic areas–SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2–which account for almost 30% of the local variation in skin color.

Bantu (green) and Levantine/pastoralist DNA in modern peoples

SLC24A5 is a light pigment introduced to east Africa from the Levant, probably around 3,000 years ago. Today, it is common in Ethiopia and Tanzania.

Interestingly, according to the article, “At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations.”

These are the world’s other darkest peoples, such as the Jarawas of the Andaman Islands or the Melanesians of Bougainville, PNG. (And, I assume, some groups from India such as the Tamils.) This implies that these groups 1. had dark skin already when they left Africa, and 2. Never lost it on their way to their current homes. (If they had gotten lighter during their journey and then darkened again upon arrival, they likely would have different skin color variants than their African cousins.)

This implies that even if the Bushmen split off (around 200,000-300,000 YA) before dark skin evolved, it had evolved by the time people left Africa and headed toward Australia (around 100,000-70,000 YA.) This gives us a minimum threshold: it most likely evolved before 70,000 YA.

(But as always, we should be careful because perhaps there are even more skin color variants that we don’t know about yet in these populations.)

MFSD12 is common among Nilotics and is related to darker skin.

And according to the abstract, which Razib Khan posted:

Further, the alleles associated with skin pigmentation at all loci but SLC24A5 are ancient, predating the origin of modern humans. The ancestral alleles at the majority of predicted causal SNPs are associated with light skin, raising the possibility that the ancestors of modern humans could have had relatively light skin color, as is observed in the San population today.

The full article is not out yet, so I still don’t know when all of these light and dark alleles emerged, but the order is absolutely intriguing. For now, it looks like this mystery will still have to wait.

 

Two Exciting Papers on African Genetics

I loved that movie
Nǃxau ǂToma, (aka Gcao Tekene Coma,) Bushman star of “The Gods Must be Crazy,” roughly 1944-2003

An interesting article on Clues to Africa’s Mysterious Past appeared recently in the NY Times:

It was only two years ago that researchers found the first ancient human genome in Africa: a skeleton in a cave in Ethiopia yielded DNA that turned out to be 4,500 years old.

On Thursday, an international team of scientists reported that they had recovered far older genes from bone fragments in Malawi dating back 8,100 years. The researchers also retrieved DNA from 15 other ancient people in eastern and southern Africa, and compared the genes to those of living Africans.

Let’s skip to the article, Reconstructing Prehistoric African Population Structure by Skoglund et al:

We assembled genome-wide data from 16 prehistoric Africans. We show that the anciently divergent lineage that comprises the primary ancestry of the southern African San had a wider distribution in the past, contributing approximately two-thirds of the ancestry of Malawi hunter-gatherers ∼8,100–2,500 years ago and approximately one-third of the ancestry of Tanzanian hunter-gatherers ∼1,400 years ago.

Paths of the great Bantu Migration

The San are also known as the Bushmen, a famous group of recent hunter-gatherers from southern Africa.

We document how the spread of farmers from western Africa involved complete replacement of local hunter-gatherers in some regions…

This is most likely the Great Bantu Migration, which I wrote about in Into Africa: the Great Bantu Migration.

…and we track the spread of herders by showing that the population of a ∼3,100-year-old pastoralist from Tanzania contributed ancestry to people from northeastern to southern Africa, including a ∼1,200-year-old southern African pastoralist…

Whereas the two individuals buried in ∼2,000 BP hunter-gatherer contexts in South Africa share ancestry with southern African Khoe-San populations in the PCA, 11 of the 12 ancient individuals who lived in eastern and south-central Africa between ∼8,100 and ∼400 BP form a gradient of relatedness to the eastern African Hadza on the one hand and southern African Khoe-San on the other (Figure 1A).

The Hadza are a hunter-gatherer group from Tanzania who are not obviously related to any other people. Their language has traditionally been classed alongside the languages of the KhoiSan/Bushmen people because they all contain clicks, but the languages otherwise have very little in common and Hadza appears to be a language isolate, like Basque.

The genetic cline correlates to geography, running along a north-south axis with ancient individuals from Ethiopia (∼4,500 BP), Kenya (∼400 BP), Tanzania (both ∼1,400 BP), and Malawi (∼8,100–2,500 BP), showing increasing affinity to southern Africans (both ancient individuals and present-day Khoe-San). The seven individuals from Malawi show no clear heterogeneity, indicating a long-standing and distinctive population in ancient Malawi that persisted for at least ∼5,000 years (the minimum span of our radiocarbon dates) but which no longer exists today. …

We find that ancestry closely related to the ancient southern Africans was present much farther north and east in the past than is apparent today. This ancient southern African ancestry comprises up to 91% of the ancestry of Khoe-San groups today (Table S5), and also 31% ± 3% of the ancestry of Tanzania_Zanzibar_1400BP, 60% ± 6% of the ancestry of Malawi_Fingira_6100BP, and 65% ± 3% of the ancestry of Malawi_Fingira_2500BP (Figure 2A). …

Both unsupervised clustering (Figure 1B) and formal ancestry estimation (Figure 2B) suggest that individuals from the Hadza group in Tanzania can be modeled as deriving all their ancestry from a lineage related deeply to ancient eastern Africans such as the Ethiopia_4500BP individual …

So what’s up with the Tanzanian expansion mentioned in the summary?

Western-Eurasian-related ancestry is pervasive in eastern Africa today … and the timing of this admixture has been estimated to be ∼3,000 BP on average… We found that the ∼3,100 BP individual… associated with a Savanna Pastoral Neolithic archeological tradition, could be modeled as having 38% ± 1% of her ancestry related to the nearly 10,000-year-old pre-pottery farmers of the Levant. These results could be explained by migration into Africa from descendants of pre-pottery Levantine farmers or alternatively by a scenario in which both pre-pottery Levantine farmers and Tanzania_Luxmanda_3100BP descend from a common ancestral population that lived thousands of years earlier in Africa or the Near East. We fit the remaining approximately two-thirds of Tanzania_Luxmanda_3100BP as most closely related to the Ethiopia_4500BP…

…present-day Cushitic speakers such as the Somali cannot be fit simply as having Tanzania_Luxmanda_3100BP ancestry. The best fitting model for the Somali includes Tanzania_Luxmanda_3100BP ancestry, Dinka-related ancestry, and 16% ± 3% Iranian-Neolithic-related ancestry (p = 0.015). This suggests that ancestry related to the Iranian Neolithic appeared in eastern Africa after earlier gene flow related to Levant Neolithic populations, a scenario that is made more plausible by the genetic evidence of admixture of Iranian-Neolithic-related ancestry throughout the Levant by the time of the Bronze Age …and in ancient Egypt by the Iron Age …

There is then a discussion of possible models of ancient African population splits (were the Bushmen the first? How long have they been isolated?) I suspect the more ancient African DNA we uncover, the more complicated the tree will become, just as in Europe and Asia we’ve discovered Neanderthal and Denisovan admixture.

They also compared genomes to look for genetic adaptations and found evidence for selection for taste receptors and “response to radiation” in the Bushmen, which the authors note “could be due to exposure to sunlight associated with the life of the ‡Khomani and Ju|’hoan North people in the Kalahari Basin, which has become a refuge for hunter-gatherer populations in the last millennia due to encroachment by pastoralist and agriculturalist groups.”

(The Bushmen are lighter than Bantus, with a more golden or tan skin tone.)

They also found evidence of selection for short stature among the Pygmies (which isn’t really surprising to anyone, unless you thought they had acquired their heights by admixture with another very short group of people.)

Overall, this is a great paper and I encourage you to RTWT, especially the pictures/graphs.

Now, if that’s not enough African DNA for you, we also have Loci Associated with Skin Pigmentation Identified in African Populations, by Crawford et al:

Examining ethnically diverse African genomes, we identify variants in or near SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2 that are significantly associated with skin pigmentation. Genetic evidence indicates that the light pigmentation variant at SLC24A5 was introduced into East Africa by gene flow from non-Africans. At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations. Functional analyses indicate that MFSD12 encodes a lysosomal protein that affects melanogenesis in zebrafish and mice, and that mutations in melanocyte-specific regulatory regions near DDB1/TMEM138 correlate with expression of UV response genes under selection in Eurasians.

I’ve had an essay on the evolution of African skin tones sitting in my draft folder for ages because this research hadn’t been done. There’s plenty of research on European and Asian skin tones (skin appears to have significantly lightened around 10,000 years ago in Europeans,) but much less on Africans. Luckily for me, this paper fixes that.

Looks like SLC24A5 is related to that Levantine/Iranian back-migration into Africa documented in the first paper.

Zoroastrian (Parsi) DNA

Farvahar. Persepolis, Iran.

Zoroastrianism is one of the world’s oldest surviving religions and possibly its first monotheistic one. It emerged in now-Iran about 3,000 years ago, but following the Arab (Islamic) conquest of Persia, many Zoroastrians migrated to India, where they became known as the Parsi (from the word for “Persian.”) To be clear, where this post refers to “Parsis” it means the specific Zoroastrian community in India, and where it refers to “Iranian Zoroastrians” it means the Zoroastrians currently living in Iran.

Although Zoroastrianism was once the official state religion of Persia, today only about 190,000 believers remain (according to Wikipedia,) and their numbers are declining.

If you’re thinking that a diasporic community of monotheists sounds familiar, you’re in good company. According to Wikipedia:

Portuguese physician Garcia de Orta observed in 1563 that “there are merchants … in the kingdom of Cambaia … known as Esparcis. We Portuguese call them Jews, but they are not so. They are Gentios.”

Another parallel: Ashkenazi Jews and Parsis are both reported to be very smart. Famous Parsis include Queen frontman Freddie Mercury, nuclear physicist Homi J. Bhabha, and our Harvard-employed friend, Homi K. Bhabha.

Lopez et al have recently carried out a very interesting study of Zoroastrian DNA, The Genetic Legacy of Zoroastrianism in Iran and India: Insights into Population Structure, Gene Flow, and Selection:

Historical records indicate that migrants from Persia brought Zoroastrianism to India, but there is debate over the timing of these migrations. Here we present genome-wide autosomal, Y chromosome, and mitochondrial DNA data from Iranian and Indian Zoroastrians and neighboring modern-day Indian and Iranian populations and conduct a comprehensive genome-wide genetic analysis in these groups. … we find that Zoroastrians in Iran and India have increased genetic homogeneity relative to other sampled groups in their respective countries, consistent with their current practices of endogamy. Despite this, we infer that Indian Zoroastrians (Parsis) intermixed with local groups sometime after their arrival in India, dating this mixture to 690–1390 CE and providing strong evidence that Iranian Zoroastrian ancestry was maintained primarily through the male line.

Note that all diasporic–that is, migrant–groups appear to be heavily male. Women tend to stay put while men move and take new wives in their new homelands.

By making use of the rich information in DNA from ancient human remains, we also highlight admixture in the ancestors of Iranian Zoroastrians dated to 570 BCE–746 CE, older than admixture seen in any other sampled Iranian group, consistent with a long-standing isolation of Zoroastrians from outside groups. …

Admixture with whom? (Let’s just read the paper and see if it answers the question):

Furthermore, a recent study using genome-wide autosomal DNA found that haplotype patterns in Iranian Zoroastrians matched more than other modern Iranian groups to a high-coverage early Neolithic farmer genome from Iran.

A study of four restriction fragment length polymorphisms (RFLPs) suggested a closer genetic affinity of Parsis to Southern Europeans than to non-Parsis from Bombay. Furthermore, NRY haplotype analysis and patterns of variation at the HLA locus in the Parsis of Pakistan support a predominately Iranian origin. …

In (1) and (2), we detected admixture in the Parsis dated to 27 (range: 17–38) and 32 (19–44) generations ago, respectively, in each case between one predominantly Indian-like source and one predominantly Iranian-like source. This large contribution from an Iranian-like source (∼64%–76%) is not seen in any of our other 7 Indian clusters, though we detect admixture in each of these 7 groups from wide-ranging sources related to modern day individuals from Bangladesh, Cambodia, Europe, Pakistan, or of Jewish heritage (Figures 2 and S7, Tables S5–S7). For Iranian Zoroastrians, we detect admixture only under analysis (2), occurring 66 (42–89) generations ago between a source best genetically explained as a mixture of modern-day Croatian and Cypriot samples, and a second source matching to the Neolithic Iranian farmer WC1. … The two Iranian Zoroastrians that had been excluded as outliers exhibited admixture patterns more similar to the Lebanese, Turkish Jews, or Iranian Bandari individuals than to Zoroastrians (Table S8).

Parsi Wedding, 1905

If I assume a generation is about 25 years long, 27 generations was about 675 years ago; 32 was about 800 years ago. (Though given the wide range on these dates, perhaps we should estimate between 425 and 1,100 years ago.) This sounds consistent with Parsis taking local wives after they arrived in India between the 8th and 10th century CE (after the Arab conquest of Persia.) Also consistently, this admixture isn’t found in Iranian Zoroastrians.
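For anyone who wants to check the arithmetic, here is a minimal sketch of the conversion, assuming the same flat 25-year generation time (the generation length is my assumption, not a figure from the paper):

```python
# Back-of-the-envelope conversion from generations ago to years ago,
# assuming a 25-year generation time (an assumption, not the paper's value).
GEN_YEARS = 25

def generations_to_years(point, low, high):
    """Return (point, low, high) estimates in years before present."""
    return point * GEN_YEARS, low * GEN_YEARS, high * GEN_YEARS

# Parsi admixture dates from the two analyses quoted above
print(generations_to_years(27, 17, 38))   # (675, 425, 950)
print(generations_to_years(32, 19, 44))   # (800, 475, 1100)

# Iranian Zoroastrian admixture date, discussed below
print(generations_to_years(66, 42, 89))   # (1650, 1050, 2225)
```

The same conversion gives the Iranian Zoroastrian range used in the next paragraph.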

The Iranians’ admixture occurred between about 1,050 and 2,225 years ago, which is an awfully broad time range. Could Croatian or Cypriot migrants have arrived due to the Greek/Roman/Byzantine Empires? Were they incorporated into the Persian Empire as a result of its territorial conquests or the Arab conquest? Or were they just long-distance merchants who happened to wander into the area?

The Fire Temple of Baku

The authors found that Parsi priests had “the lowest gene diversity values of all population samples studied for both Y and mtDNA,” though they didn’t have enough Iranian Zoroastrian priest samples to compare them to Parsi priests. (I bet this is similar to what you’d find if you sampled Orthodox rabbis.)

Finally, in the genetic selection and diseases section, the authors write:

In the case of the Iranian Zoroastrians, … some of the most significant SNPs… are located upstream of gene SLC39A10 … with an important role in humoral immunity61 or in CALB2 … which plays a major role in the cerebellar physiology.62

With regard to the positive selection tests on Parsis versus India Hindu/Gujarati groups, the most significant SNPs were embedded in WWOX … associated with neurological disorders like early epilepsy … and in a region in chromosome 20 … (see Table S11 for a complete list). …

Genetic isolation and endogamous practices can be associated with higher frequencies of disease prevalence. For example, there are reports claiming a high recurrence of diseases such as diabetes among the Iranian Zoroastrians, and Parkinson, colon cancer, or the deficiency of G6PD, an enzyme that triggers the sudden reduction of red blood cells, among the Parsis.

However, the authors warn that these results are weak (these are rare conditions in an already small population) and cannot be depended upon.

The Negritos of Sundaland, Sahul, and the Philippines

Ati (Negrito) woman from the Philippines

The Negritos are a fascinating group of short-statured, dark-skinned, frizzy-haired peoples from southeast Asia–chiefly the Andaman Islands, Malaysia, Philippines, and Thailand. (Spelling note: “Negritoes” is also an acceptable plural, and some sources use the Spanish Negrillos.)

Because of their appearance, they have long been associated with African peoples, especially the Pygmies. Pygmies are formally defined as any group in which adult men are, on average, 4’11” or less; the term is almost always used specifically to refer to African Pygmies. The term pygmoid is sometimes used for groups whose men average 5’1″ or below, including the Negritos. (Some of the Bushmen tribes, Bolivians, Amazonians, the remote Taron, and a variety of others may also be pygmoid, by this definition.)

However, genetic testing has long indicated that they, along with other Melanesians and Australian Aborigines, are more closely related to other east Asian peoples than any African groups. In other words, they’re part of the greater Asian race, albeit a distant branch of it.

But how distant? And are the various Negrito groups closely related to each other, or do there just happen to be a variety of short groups of people in the area, perhaps due to convergent evolution triggered by insular dwarfism?

From Wikimedia

In Discerning the origins of the Negritos, First Sundaland Peoples: deep divergence and archaic admixture, Jinam et al gathered genetic data from Filipino, Malaysian, and Andamanese Negrito populations, and compared them both to each other and other Asian, African, and European groups. (Be sure to download the supplementary materials to get all of the graphs and maps.)

They found that the Negrito groups they studied “are basal to other East and Southeast Asians,” (basal: forming the bottom layer or base; in this case, it means they split off first) “and that they diverged from West Eurasians at least 38,000 years ago.” (West Eurasians: Caucasians, consisting of Europeans, Middle Easterners, North Africans, and people from India.) “We also found relatively high traces of Denisovan admixture in the Philippine Negritos, but not in the Malaysian and Andamanese groups.” (Denisovans are a group of extinct humans similar to Neanderthals, but we’ve yet to find many of their bones. Just as Neanderthal DNA shows up in non-Sub-Saharan-Africans, so Denisovan DNA shows up in Melanesians.)

Figure 1 (A) shows PC analysis of Andamanese, Malaysian, and Philippine Negritos, revealing three distinct clusters:

In the upper right-hand corner, the Aeta, Agta, Batak, and Mamanwa are Philippine Negritos. The Manobo are non-Negrito Filipinos.

In the lower right-hand corner are the Jehai, Kintak, and Batek, who are Malaysian Negritos.

And in the upper left, we have the extremely isolated Andamanese Onge and Jarawa Negritos.

(Phil-NN and Mly-NN I believe are Filipino and Malaysian Non-Negritos.)

You can find the same chart, but flipped upside down, with Papuan and Melanesian DNA added, in the supplemental materials. Of the three Negrito clusters, the Papuans and Melanesians cluster closest to the Philippine Negritos, along the same line as the Malaysians.

By excluding the Andamanese (and Kintak) Negritos, Figure 1 (B) allows a closer look at the structure of the Philippine Negritos.

The Agta, Aeta, and Batak form a horizontal “comet-like pattern,” which likely indicates admixture with non-Negrito Philippine groups like the Manobo. The Mamanwa, who hail from a different part of the Philippines, also show this comet-like pattern, but along a different axis–likely because they intermixed with the different Filipinos who lived in their area. As you can see, there’s a fair amount of overlap–several of the Manobo individuals cluster with the Mamanwa Negritos, and the Batak cluster near several non-Negrito groups (see supplemental chart S4 B)–suggesting high amounts of mixing between these groups.
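(For the curious, here’s roughly how a chart like Figure 1 gets made from raw genotypes. This is a minimal sketch with random placeholder data and off-the-shelf libraries; it is not the study’s actual pipeline.)

```python
# Minimal sketch of a genotype PCA, the kind of computation behind Figure 1.
# The genotype matrix here is random placeholder data, not the study's.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_individuals, n_snps = 200, 5000
genotypes = rng.integers(0, 3, size=(n_individuals, n_snps)).astype(float)  # 0/1/2 allele counts

# Standardize each SNP, then project individuals onto the top two PCs.
genotypes -= genotypes.mean(axis=0)
genotypes /= genotypes.std(axis=0) + 1e-9
pcs = PCA(n_components=2).fit_transform(genotypes)

# Admixed individuals tend to fall along a line ("comet") between their two
# source populations in PC space, which is the pattern described above.
print(pcs.shape)  # (200, 2): one (PC1, PC2) coordinate per individual
```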

ADMIXTURE analysis reveals a similar picture. The non-Negrito Filipino groups show up primarily as orange. The Aeta, Agta, and Batak form a clear genetic cluster with each other and grade into the orange Filipinos, with the Aeta the least admixed and the Batak the most.

The white area on the chart isn’t a data error, but the unique signature of the geographically separated Mamanwa, who are highly mixed with the Manobo–and the Manobo, in turn, are mixed with them.

But this alone doesn’t tell us how ancient these populations are, nor whether they’re descended from one ancestral population. For this, the authors constructed several phylogenetic trees, based on all of the data at hand and assuming 0–5 admixture events. The one on the left assumes 5 events, but for clarity only shows three of them. The Denisovan DNA is fascinating and well-documented elsewhere in Melanesian populations; that Malaysian and Philippine Negritos mixed with their neighbors is also known, supporting the choice of this tree as the most likely to be accurate.

Regardless of which you pick, all of the trees show very similar results, with the biggest difference being whether the Melanesians/Papuans split before or after the Andamanese/Malaysian Negritos.

In case you are unfamiliar with these trees, I’ll run down a quick explanation: this is a human family tree, with each split showing where one group of humans split off from the others and became an isolated group with its own unique genetic patterns. The orange and red lines mark places where formerly isolated groups met and interbred, producing children that are a mix of both. The first split in the tree, going back hundreds of thousands of years, is between all Homo sapiens (our species) and the Denisovans, a sister species related to the Neanderthals.
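(If it helps to see it written out, here is the same structure as data: the splits, in the order described in the surrounding paragraphs, as a Newick-style string, plus the admixture edges as a separate list. This is just a reading aid, not the paper’s exact tree; “Other_Asians” lumps together the groups whose exact branching order I’m not committing to.)

```python
# The splits described in the text, written Newick-style (deepest split
# outermost), plus the admixture ("orange/red") edges as a separate list.
# A reading aid only, not the paper's exact tree; "Other_Asians" lumps
# together the groups whose internal branching isn't spelled out here.
newick = ("(Denisovan,(Biaka,(Yoruba,(French,"
          "((Andamanese_Negrito,Malaysian_Negrito),"
          "(Papuan,Other_Asians))))));")

admixture_edges = [
    ("Denisovan", "Papuan"),              # archaic admixture in Papuans/Melanesians
    ("Denisovan", "Philippine_Negrito"),  # and in Philippine Negritos (see the map below)
    ("Neighboring_Asians", "Negritos"),   # the recent mixing discussed above
]
```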

All humans outside of sub-Saharan Africans have some Neanderthal DNA because their ancestors met and interbred with Neanderthals on their way Out of Africa. Melanesians, Papuans, and some Negritos also have some Denisovan DNA, because their ancestors met and made children with members of this obscure human species, but Denisovan DNA is quite rare outside these groups.

Here is a map of the Denisovan DNA levels the authors found, with about 4% of Papuan DNA hailing from Denisovan ancestors, and the Aeta nearly as high. By contrast, the Andamanese Negritos appear to have zero Denisovan DNA. Either the Andamanese split off before the ancestors of the Philippine Negritos and Papuans met the Denisovans, or all Denisovan DNA has been purged from their bloodlines, perhaps because it just wasn’t helpful for surviving on their islands.

Back to the tree: the second node is where the Biaka, a group of Pygmies from the Congo rainforest in central Africa, split off. Pygmy lineages are among the most ancient on earth, potentially going back over 200,000 years, well before any Homo sapiens had left Africa.

The next group that splits off from the rest of humanity are the Yoruba, a single ethnic group chosen to stand in for the entirety of the Bantus. Bantus are the group that you most likely think of when you think of black Africans, because over the past three millennia they have expanded greatly and conquered most of sub-Saharan Africa.

Next we have the Out of Africa event and the split between Caucasians (here represented by the French) and the greater Asian clade, which includes Australian Aborigines, Melanesians, Polynesians, Chinese, Japanese, Siberians, Inuit, and Native Americans.

The first groups to split off from the greater Asian clade (aka race) were the Andamanese and Malaysian Negritos, followed by the Papuans/Melanesians. Australian Aborigines are closely related to Papuans, as Australia and Papua New Guinea were connected in a single continent (called Sahul) during the last Ice Age. Most of Indonesia and parts of the Philippines were also connected into a single landmass, called Sunda. Sensibly, people reached Sunda before Sahul. (Perhaps at that time the Andaman Islands, to the northwest of Sumatra, were also connected to the mainland, or at least closer to it.)

Irrespective of the exact order in which Melanesians and individual Negrito groups split off, they all split well before all of the other Asian groups in the area.

This is supported by legends told by the Filipinos themselves:

Legends, such as those involving the Ten Bornean Datus and the Binirayan Festival, tell tales about how, at the beginning of the 12th century when Indonesia and Philippines were under the rule of Indianized native kingdoms, the ancestors of the Bisaya escaped from Borneo from the persecution of Rajah Makatunaw. Led by Datu Puti and Datu Sumakwel and sailing with boats called balangays, they landed near a river called Suaragan, on the southwest coast of Panay, (the place then known as Aninipay), and bartered the land from an Ati [Negrito] headman named Polpolan and his son Marikudo for the price of a necklace and one golden salakot. The hills were left to the Atis while the plains and rivers to the Malays. This meeting is commemorated through the Ati-atihan festival.[4]

The study’s authors estimate that the Negritos split from Europeans (Caucasians) around 30,000–38,000 years ago, and that the Malaysian and Philippine Negritos split around 13,000–15,000 years ago. (This all seems a bit tentative, IMO, especially since we have physical evidence of people in the area going back much further than that, and the authors themselves admit in the discussion that their time estimate may be too short.)

The authors also note:

Both our NJ (fig. 3A) and UPGMA (supplementary fig. S10) trees show that after divergence from Europeans, the ancestral Asians subsequently split into Papuans, Negritos and East Asians, implying a one-wave colonization of Asia. … This is in contrast to the study based on whole genome sequences that suggested Australian Aboriginal/Papuan first split from European/East Asians 60 kya, and later Europeans and East Asians diverged 40 kya (Malaspinas et al. 2016). This implies a two-wave migration into Asia…

The matter is still up for debate/more study.

Negrito couple from the Andaman Islands

In conclusion: All of the Negrito groups are likely descended from a common ancestor (rather than having evolved from separate groups that happened to develop similar body types due to exposure to similar environments), and were among the very first inhabitants of their regions. Despite their short stature, they are more closely related to other Asian groups (like the Chinese) than to African Pygmies. Significant mixing with their neighbors, however, is quickly obscuring their ancient lineages.

I wonder whether all ancient human groups were originally short, and height is a recently evolved trait in some groups.

In closing, I’d like to thank Jinam et al for their hard work in writing this article and making it available to the public, their sponsors, and the unique Negrito peoples themselves for surviving so long.

Evolution is slow–until it’s fast: Genetic Load and the Future of Humanity

Source: Priceonomics

A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.

But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with English peppered moths, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees they lived on. To survive, these moths need to avoid being eaten by birds, so any moth that stands out against the tree trunks tends to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten; against dark-colored trees, light-colored moths stood out instead.

This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.

The time it takes for evolution to occur depends simply on A. the frequency of the trait in the population, and B. how strongly you are selecting for (or against) it.

Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.

(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)

Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)
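(A quick Hardy-Weinberg back-of-envelope shows why even a “rare” recessive trait gives selection something to work with: the underlying allele, and especially its carriers, are far more common than the trait itself. This assumes albinism behaves as a single recessive locus, which is a simplification.)

```python
# Back-of-envelope: how common is the allele behind a "rare" recessive trait?
# Assumes a single recessive locus at Hardy-Weinberg equilibrium (a simplification).
from math import sqrt

for pop, incidence in [("Hopi", 1 / 200), ("Europeans", 1 / 20_000)]:
    q = sqrt(incidence)          # recessive allele frequency
    carriers = 2 * q * (1 - q)   # heterozygote (carrier) frequency
    print(f"{pop}: allele ~{q:.1%}, carriers ~{carriers:.1%}")

# Hopi: allele ~7.1%, carriers ~13.1%
# Europeans: allele ~0.7%, carriers ~1.4%
```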

You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).

Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.

Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.

No matter what kind of selection you have, nor what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.
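(To make that concrete, here’s a toy haploid selection model of the moth example, with made-up fitness numbers rather than historical estimates. The only input that matters is average offspring per morph.)

```python
# Toy haploid selection model for the moth example: the only input is the
# average number of offspring per morph. All numbers are made up.
def generations_to_majority(p_black=0.001, w_black=1.5, w_white=1.0):
    """Count generations until black moths pass 50% of the population."""
    gen = 0
    while p_black < 0.5:
        mean_w = p_black * w_black + (1 - p_black) * w_white
        p_black = p_black * w_black / mean_w  # standard selection recursion
        gen += 1
    return gen

print(generations_to_majority(w_black=1.5))   # strong advantage: ~17 generations
print(generations_to_majority(w_black=1.05))  # weak advantage: ~140+ generations
```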

Source SUPS.org

So what happens when you completely remove selective pressures from a population?

Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.

Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.
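(For scale, those doubling times work out to the following per-day growth rates, using the usual continuous-growth approximation.)

```python
# Implied per-day growth rates from the reported doubling times
# (continuous-growth approximation: rate = ln(2) / doubling time).
from math import log

for phase, doubling_days in [("explosive phase", 55), ("slowing phase", 145)]:
    print(f"{phase}: ~{log(2) / doubling_days:.2%} growth per day")

# explosive phase: ~1.26% growth per day
# slowing phase: ~0.48% growth per day
```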

The birth rate continued to decline. As births and death reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.

source

As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.

It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?

Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.

What is genetic load?

As I mentioned before, every time a cell replicates, a certain number of errors–mutations–occur. Occasionally these mutations are useful, but the vast majority of them are not. About 30-50% of pregnancies end in miscarriage (the percentage of miscarriages people recognize is lower because embryos often miscarry before causing any overt signs of pregnancy), and the majority of those miscarriages are caused by genetic errors.

Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.

But only the worst genetic problems never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.

While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:

Differences in human height is 60–80% heritable, according to several twin studies[19] and has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height.[20] The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.[21]

Obviously each of these genes plays only a small role in determining overall height (and this is, of course, holding environmental factors constant). There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.
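(Here’s a minimal sketch of the additive model implied by “which particular mix of those variants you happen to have,” i.e. a polygenic score. The effect sizes and genotypes below are random placeholders, not the published GWAS estimates.)

```python
# Minimal sketch of how a polygenic trait like height is usually modeled:
# sum many small per-variant effects. Effect sizes and genotypes here are
# random placeholders, not the actual GWAS estimates.
import numpy as np

rng = np.random.default_rng(1)
n_variants = 697                              # roughly the number cited above
effect_cm = rng.normal(0, 0.1, n_variants)    # small per-allele effects (cm)
genotype = rng.integers(0, 3, n_variants)     # 0/1/2 copies of each effect allele

polygenic_score_cm = float(genotype @ effect_cm)
print(f"deviation from population mean: {polygenic_score_cm:+.1f} cm (before environment)")
```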

The situation with IQ is similar:

Intelligence in the normal range is a polygenic trait, meaning it’s influenced by more than one gene.[3][4]

The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults.[5][6] In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores;[8] however, poor prenatal environment, malnutrition and disease can have deleterious effects.[9][10]

And from a recent article published in Nature Genetics, Genome-wide association meta-analysis of 78,308 individuals identifies new loci and genes influencing human intelligence:

Despite intelligence having substantial heritability2 (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered3, 4, 5. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10−8) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10−6), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10−6). Despite the well-known difference in twin-based heritability2 for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (rg = 0.89, LD score regression P = 5.4 × 10−29). These findings provide new insight into the genetic architecture of intelligence.

The greater the number of genes influencing a trait, the harder they are to identify without extremely large studies, because any small group of people might not even share the same set of relevant variants.
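(A standard back-of-envelope power calculation, not taken from the paper, shows why sample sizes keep climbing: at genome-wide significance, a variant explaining a tiny slice of the variance takes an enormous sample to detect.)

```python
# Rough sample size needed to detect one SNP at genome-wide significance
# (p < 5e-8) with 80% power, as a function of the trait variance it explains.
# Standard normal-approximation; purely illustrative.
from scipy.stats import norm

alpha, power = 5e-8, 0.80
z = norm.isf(alpha / 2) + norm.isf(1 - power)

for var_explained in (0.001, 0.0005, 0.0002):
    n = z**2 / var_explained
    print(f"SNP explaining {var_explained:.2%} of variance: ~{n:,.0f} people")

# roughly 40,000, 80,000, and 200,000 people respectively
```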

High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.

Like height, most of the genetic variation in IQ is the combined result of many genes. We’ve definitely found some mutations that result in abnormally low IQ, but so far we have yet (AFAIK) to find any genes that produce an IQ equivalent of gigantism. In other words, low (genetic) IQ is caused by genetic load–Small Yet Important Genetic Differences Between Highly Intelligent People and General Population:

The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.

The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.

Maternal mortality rates over time, UK data

Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ also have terrible side effects, like autism, Tay-Sachs, and torsion dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventionally high IQ has a cost: increased maternal mortality (larger heads).

Wikipedia defines genetic load as:

the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall.[5] Most mutations are deleterious, and occur at a high rate.

There’s math, if you want it.
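The core of the definition quoted above can be written as a single formula, where w_max is the fitness of the reference (best or optimal) genotype and w̄ is the population’s average fitness:

$$L = \frac{w_{\max} - \bar{w}}{w_{\max}}$$

In other words, genetic load is the fraction by which average fitness falls short of the best available genotype.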

Normally, genetic mutations are removed from the population at a rate determined by how bad they are. Really bad mutations kill the embryo outright, so their carriers are never born. Slightly less bad mutations might survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, but result in having slightly fewer children than your neighbors. Over many generations, such a mutation will eventually disappear.
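That “rate determined by how bad they are” also has a classical quantitative form. At mutation-selection balance (a standard population-genetics approximation, not a figure from any of the papers above), the equilibrium frequency of a deleterious allele is roughly:

$$\hat{q} \approx \frac{\mu}{hs} \ \text{(partially dominant)}, \qquad \hat{q} \approx \sqrt{\frac{\mu}{s}} \ \text{(fully recessive)}$$

where μ is the per-generation mutation rate at the locus, s is the fitness cost to the mutant homozygote, and h is how much of that cost carriers pay. The worse the mutation, the rarer it stays.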

(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)

Jakubany is a town in the Carpathian Mountains

Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:

We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.

The graph’s creator states:

When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118 year period, of the 13306 children who were born, 2958 died (~22 %) before reaching the age of one.

Historical infant mortality rates can be difficult to calculate in part because, with rates so high, people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:

Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.[12]

Priceonomics: Why life expectancy is misleading

In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)
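(A quick back-solve on the quoted figures shows what drives the low headline number; this is just arithmetic on the numbers above, not new data.)

```python
# Back-solving the quoted Paleolithic figures: if 60% of newborns reach 15
# and those survivors average 54 years total, what average age at death for
# the other 40% makes overall life expectancy come out to 33?
p_reach_15, adult_mean_age, e0 = 0.60, 54, 33
child_mean_age = (e0 - p_reach_15 * adult_mean_age) / (1 - p_reach_15)
print(f"implied mean age at death for the 40% who die young: ~{child_mean_age:.1f} years")

# ~1.5 years: the low "life expectancy" is mostly a story about infant deaths.
```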

Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).

Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates of between 30 and 50% infant mortality. I recall a study on Anasazi nutrition, which I sadly can’t locate right now, that found 100% malnutrition rates among adults (based on enamel hypoplasias) and 50% infant mortality.

As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60 year old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60 year old could expect to live to about 80.

Back in his post on Mousetopia, Charlton writes:

Michael A Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …

The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …

Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.

However, in Mouse Utopia, there is no predation and all the other causes of mortality (eg. Starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.
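(Here’s a toy version of the Woodley/Charlton argument, with all numbers invented for illustration; it is not Calhoun’s data or Woodley’s actual model. Each pup inherits its parent’s mutation count plus a few new mutations; under “wild” conditions only the least-loaded pups survive to breed, while in “utopia” survival is unrelated to load.)

```python
# Toy model of mutation accumulation with and without selective mortality.
# All parameters are invented for illustration.
import random

def mean_load(generations=20, pop=500, max_new_mutations=6, harsh_selection=True):
    loads = [0] * pop
    for _ in range(generations):
        # each generation: 3x as many offspring as breeding slots, each pup
        # inheriting a random parent's load plus some new mutations
        offspring = [random.choice(loads) + random.randint(0, max_new_mutations)
                     for _ in range(pop * 3)]
        if harsh_selection:
            offspring.sort()           # predation etc. removes the most loaded
        else:
            random.shuffle(offspring)  # "utopia": survival unrelated to load
        loads = offspring[:pop]
    return sum(loads) / pop

print("wild-type mortality:", round(mean_load(harsh_selection=True), 1))
print("mouse utopia:       ", round(mean_load(harsh_selection=False), 1))
```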

Historically speaking, another selective factor operated on humans: while about 67% of women reproduced, only 33% of men did. By contrast, according to Psychology Today, a majority of today’s men have or will have children.

Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.

source

Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data.) Today, infant mortality is almost non-existent, but voluntary childlessness has inverted this phenomenon:

Yes, the percent of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with post-graduate degrees have 4 or more children, 26% of those who never graduated from high school have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.

In other words, the strongest remover of genetic load–infant mortality–has all but disappeared; populations with higher load (lower IQ) are having more children than populations with lower load; and everyone is having children later, which also increases genetic load.

Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.

Infant mortality claims half their children, taking the least healthy. They are left with 4 bright children and 2 moderately intelligent children. The bright children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the moderately intelligent do okay and end up with a couple of children each.

On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.

Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.

Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and 3 children by three different partners.

Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”

People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because we all accumulate random copying errors when our DNA replicates.

I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.

Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.

(Remember, all mental traits are heritable–so genetic load influences everything, not just controversial ones like IQ.)

If all of the above is correct, then I see only 4 ways out:

  1. Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
  2. Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
  3. Abortion of high load fetuses
  4. Genetic engineering

#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #2, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.