North Africa in Genetics and History

detailed map of African and Middle Eastern ethnicities in Haak et al.’s dataset

North Africa is an often misunderstood region in human genetics. Since it is in Africa, people often assume that it contains the same variety of people referenced in terms like “African Americans,” “black Africans,” or even just “Africans.” In reality, the African continent contains members of all three of the great human clades–Sub-Saharan Africans in the south, Austronesians (Asian clade) in Madagascar, and Caucasians in the north.

“The North African Middle Stone Age and its place in recent human evolution” provides an overview of the first 275,000 years of humanity’s history in the region (300,000-25,000 years ago, more or less), including the development of symbolic culture and early human dispersal. Unfortunately the paper is paywalled.

Throughout most of human history, the Sahara–not the Mediterranean or Red seas–has been the biggest local impediment to human migration–thus North Africans are much closer, genetically, to their neighbors in Europe and the Middle East than to their neighbors across the desert (and before the domestication of the camel, about 3,000 years ago, the Sahara was even harder to cross.)

But from time to time, global weather patterns change and the Sahara becomes a garden: the Green Sahara. The last time we had a Green Sahara was about 9,000-7,000 years ago; during this time, people lived, hunted, fished, herded, and perhaps farmed throughout areas that are today nearly uninhabited wastes.

“The Peopling of the last Green Sahara revealed by high-coverage resequencing of trans-Saharan patrilineages” sheds light on how the Green (and subsequently brown) Sahara affected the spread (and separation) of African groups into northern and sub-Saharan populations:

In order to investigate the role of the last Green Sahara in the peopling of Africa, we deep-sequence the whole non-repetitive portion of the Y chromosome in 104 males selected as representative of haplogroups which are currently found to the north and to the south of the Sahara. … We find that the coalescence age of the trans-Saharan haplogroups dates back to the last Green Sahara, while most northern African or sub-Saharan clades expanded locally in the subsequent arid phase. …

Our findings suggest that the Green Sahara promoted human movements and demographic expansions, possibly linked to the adoption of pastoralism. Comparing our results with previously reported genome-wide data, we also find evidence for a sex-biased sub-Saharan contribution to northern Africans, suggesting that historical events such as the trans-Saharan slave trade mainly contributed to the mtDNA and autosomal gene pool, whereas the northern African paternal gene pool was mainly shaped by more ancient events.

In other words, modern North Africans have some maternal (mtDNA) Sub-Saharan ancestry that arrived recently via the Islamic slave trade, but most of their Sub-Saharan Y-DNA (paternal) is much older, hailing from the last time the Sahara was easy to cross.

Note that not much DNA is shared across the Sahara:

After the African humid period, the climatic conditions became rapidly hyper-arid and the Green Sahara was replaced by the desert, which acted as a strong geographic barrier against human movements between northern and sub-Saharan Africa.

A consequence of this is that there is a strong differentiation in the Y chromosome haplogroup composition between the northern and sub-Saharan regions of the African continent. In the northern area, the predominant Y lineages are J-M267 and E-M81, with the former being linked to the Neolithic expansion in the Near East and the latter reaching frequencies as high as 80% in some north-western populations as a consequence of a very recent local demographic expansion [8–10]. On the contrary, sub-Saharan Africa is characterised by a completely different genetic landscape, with lineages within E-M2 and haplogroup B comprising most of the Y chromosomes. In most regions of sub-Saharan Africa, the observed haplogroup distribution has been linked to the recent (~ 3 kya) demic diffusion of Bantu agriculturalists, which brought E-M2 sub-clades from central Africa to the East and to the South [11–17]. On the contrary, the sub-Saharan distribution of B-M150 seems to have more ancient origins, since its internal lineages are present in both Bantu farmers and non-Bantu hunter-gatherers and coalesce long before the Bantu expansion [18–20].

In spite of their genetic differentiation, however, northern and sub-Saharan Africa share at least four patrilineages at different frequencies, namely A3-M13, E-M2, E-M78 and R-V88.

A recent article in Nature, “Whole Y-chromosome sequences reveal an extremely recent origin of the most common North African paternal lineage E-M183 (M81),” tells some of North Africa’s fascinating story:

Here, by using whole Y chromosome sequences, we intend to shed some light on the historical and demographic processes that modelled the genetic landscape of North Africa. Previous studies suggested that the strategic location of North Africa, separated from Europe by the Mediterranean Sea, from the rest of the African continent by the Sahara Desert and limited to the East by the Arabian Peninsula, has shaped the genetic complexity of current North Africans15,16,17. Early modern humans arrived in North Africa 190–140 kya (thousand years ago)18, and several cultures settled in the area before the Holocene. In fact, a previous study by Henn et al.19 identified a gradient of likely autochthonous North African ancestry, probably derived from an ancient “back-to-Africa” gene flow prior to the Holocene (12 kya). In historic times, North Africa has been populated successively by different groups, including Phoenicians, Romans, Vandals and Byzantines. The most important human settlement in North Africa was conducted by the Arabs by the end of the 7th century. Recent studies have demonstrated the complexity of human migrations in the area, resulting from an amalgam of ancestral components in North African groups15,20.

According to the article, E-M81 is dominant in Northwest Africa and absent almost everywhere else in the world.

The authors tested various men across north Africa in order to draw up a phylogenetic tree of the branching of E-M183:

The distribution of each subhaplogroup within E-M183 can be observed in Table 1 and Fig. 2. Indeed, different populations present different subhaplogroup compositions. For example, whereas in Morocco almost all subhaplogroups are present, Western Sahara shows a very homogeneous pattern with only E-SM001 and E-Z5009 being represented. A similar picture to that of Western Sahara is shown by the Reguibates from Algeria, which contrast sharply with the Algerians from Oran, which showed a high diversity of haplogroups. It is also worth to notice that a slightly different pattern could be appreciated in coastal populations when compared with more inland territories (Western Sahara, Algerian Reguibates).

Overall, the authors found that the haplotypes were “strikingly similar” to each other and showed little geographic structure besides the coastal/inland differences:

As proposed by Larmuseau et al.25, the scenario that better explains Y-STR haplotype similarity within a particular haplogroup is a recent and rapid radiation of subhaplogroups. Although the dating of this lineage has been controversial, with dates proposed ranging from Paleolithic to Neolithic and to more recent times17,22,28, our results suggested that the origin of E-M183 is much more recent than was previously thought. … In addition to the recent radiation suggested by the high haplotype resemblance, the pattern showed by E-M183 imply that subhaplogroups originated within a relatively short time period, in a burst similar to those happening in many Y-chromosome haplogroups23.

In other words, someone went a-conquering.

Alternatively, given the high frequency of E-M183 in the Maghreb, a local origin of E-M183 in NW Africa could be envisaged, which would fit the clear pattern of longitudinal isolation by distance reported in genome-wide studies15,20. Moreover, the presence of autochthonous North African E-M81 lineages in the indigenous population of the Canary Islands, strongly points to North Africa as the most probable origin of the Guanche ancestors29. This, together with the fact that the oldest indigenous individuals have been dated 2210 ± 60 ya, supports a local origin of E-M183 in NW Africa. Within this scenario, it is also worth to mention that the paternal lineage of an early Neolithic Moroccan individual appeared to be distantly related to the typically North African E-M81 haplogroup30, suggesting again a NW African origin of E-M183. A local origin of E-M183 in NW Africa > 2200 ya is supported by our TMRCA estimates, which can be taken as 2,000–3,000, depending on the data, methods, and mutation rates used.

However, the authors also note that they can’t rule out a Middle Eastern origin for the haplogroup since their study simply doesn’t include genomes from Middle Eastern individuals. They rule out a spread during the Neolithic expansion (too early) but not the Islamic expansion (“an extensive, male-biased Near Eastern admixture event is registered ~1300 ya, coincidental with the Arab expansion20.”) Alternatively, they suggest E-M183 might have expanded near the end of the third Punic War. Sure, Carthage (in Tunisia) was defeated by the Romans, but the era was otherwise one of great North African wealth and prosperity.


Interesting papers! My hat’s off to the authors. I hope you enjoyed them and get a chance to RTWT.


How to Minimize “Emotional Labor” and “Mental Load”: A Guide for Frazzled Women

A comic strip in the Guardian recently alerted me to the fact that many women are exhausted from the “Mental Load” of thinking about things and need their husbands to pitch in and help. Go ahead and read it.

Whew. There’s a lot to unpack here:

  1. Yes, you have to talk to men. DO NOT EXPECT OTHER PEOPLE TO KNOW WHAT YOU ARE THINKING. Look, if I can get my husband to help me when I need it, you certainly can too. That or you married the wrong man.
  2. Get a dayplanner and write things like “grocery lists” and doctors’ appointments in it. There’s probably one built into your phone.

There, I solved your problems.

That said, female anxiety (at least in our modern world) appears to be a real thing:

(though American Indians are the real untold story in this graph.)

According to the America’s State of Mind Report (PDF):

Medco data shows that antidepressants are the most commonly used mental health medications and that women have the highest utilization rates.  In 2010, 21 percent of women ages 20 and older were using an antidepressant.  … Men’s use of antidepressants is almost half that of women, but has also been on the rise with a 28 percent increase over the past decade. …

Anxiety disorders are the most common psychiatric illnesses affecting children and adults. … Although anxiety disorders are highly treatable, only about one‐third of sufferers receive treatment. …

Medco data shows that women have the highest utilization rate of anti-anxiety medications; in fact, 11 percent of middle-aged women (ages 45-64) were on an anti-anxiety drug treatment in 2010, nearly twice the rate of their male counterparts (5.7 percent).

And based on the age group data, women in their prime working years (but waning childbearing years) have even higher rates of mental illness. (Adult women even take ADHD medicine at slightly higher rates than adult men.)

What causes this? Surely 20% of us–one in 5–can’t actually be mentally ill, can we? Is it biology or culture? Or perhaps a mismatch between biology and culture?

Or perhaps we should just scale back a little, and when we have friends over for dinner, just order a pizza instead of trying to cook two separate meals?

But if you think that berating your husband for merely taking a bottle out of the dishwasher when you asked him to get a bottle out of the dishwasher (instead of realizing this was code for “empty the entire dishwasher”) will make you happier, think again. “Couples who share the workload are more likely to divorce, study finds“:

Divorce rates are far higher among “modern” couples who share the housework than in those where the woman does the lion’s share of the chores, a Norwegian study has found. …

Norway has a long tradition of gender equality and childrearing is shared equally between mothers and fathers in 70 per cent of cases. But when it comes to housework, women in Norway still account for most of it in seven out of 10 couples. The study emphasised women who did most of the chores did so of their own volition and were found to be as “happy” as those in “modern” couples. …

The researchers expected to find that where men shouldered more of the burden, women’s happiness levels were higher. In fact they found that it was the men who were happier while their wives and girlfriends appeared to be largely unmoved.

Those men who did more housework generally reported less work-life conflict and were scored slightly higher for wellbeing overall.

Theory: well-adjusted people who love each other are happy to do what it takes to keep the household running and don’t waste time passive-aggressively trying to convince their spouse that he’s a bad person for not reading her mind.

Now let’s talk about biology. The author claims,

Of course, there’s nothing genetic or innate about this behavior. We’re not born with an all-consuming passion for clearing tables, just like boys aren’t born with an utter disinterest for things lying around.

Of course, the author doesn’t cite any papers from the fields of genetics or behavioral psychology to back up her claims. Just as she feels entitled to insist that other people should read her mind, and absurdly thinks that a good project manager at work doesn’t bother to tell their team what needs to be done, she feels no compulsion to cite any proof of her claims. Science says so. We know because some cartoonist on the internet claimed it did.

Over in reality-land, when we make scientific claims about things like genetics, we cite our sources. And women absolutely have an instinct for cleaning things: the Nesting Instinct. No, it isn’t present when we’re born. It kicks in when we’re pregnant–often shortly before going into labor. Here’s an actual scientific paper on the Nesting Instinct, published in the journal Evolution and Human Behavior:

In altricial mammals, “nesting” refers to a suite of primarily maternal behaviours including nest-site selection, nest building and nest defense, and the many ways that nonhuman animals prepare themselves for parturition are well studied. In contrast, little research has considered pre-parturient preparation behaviours in women from a functional perspective.

According to the university’s press release about the study:

The overwhelming urge that drives many pregnant women to clean, organize and get life in order—otherwise known  as nesting—is not irrational, but an adaptive behaviour stemming from humans’ evolutionary past.

Researchers from McMaster University suggest that these behaviours—characterized by unusual bursts of energy and a compulsion to organize the household—are a result of a mechanism to protect and prepare for the unborn baby.

Women also become more selective about the company they keep, preferring to spend time only with people they trust, say researchers.

In short, having control over the environment is a key feature of preparing for childbirth, including decisions about where the birth will take place and who will be welcome.

“Nesting is not a frivolous activity,” says Marla Anderson, lead author of the study and a graduate student in the Department of Psychology, Neuroscience & Behaviour.  “We have found that it peaks in the third trimester as the birth of the baby draws near and is an important task that probably serves the same purpose in women as it does in other animals.”

Even Wikipedia cites a number of sources on the subject:

Nesting behaviour refers to an instinct or urge in pregnant animals caused by the increase of estradiol (E2) [1] to prepare a home for the upcoming newborn(s). It is found in a variety of animals such as birds, fish, squirrels, mice and pigs as well as humans.[2][3]

Nesting is pretty much impossible to miss if you’ve ever been pregnant or around pregnant women.

Of course, this doesn’t prove the instinct persists outside of pregnancy (though in my personal case it definitely did.)

By the way, estradiol is the primary form of estrogen, which is found at much higher levels in women than in men. (Just to be rigorous, here’s data on estrogen levels in normal men and women.)

So if high estradiol levels make a variety of mammals–including humans–want to clean things, and women between puberty and menopause consistently have higher levels of estrogen than men, then it seems fairly likely that women actually do have, on average, a higher innate, biological, instinctual, even genetic urge to clean and organize their homes than men do.

But returning to the comic, the author claims:

But we’re born into a society in which very early on, we’re given dolls and miniature vacuum cleaners, and in which it seems shameful for boys to like those same toys.

What bollocks. I used to work at a toystore. Yes, we stocked toy vacuum cleaners and the like in a “Little Helpers” set. We never sold a single one, and I worked there over Christmas. (Great times.)

I am always on the lookout for toys my kids would enjoy and receive constant feedback on whether they like my choices. (“A book? Why did Santa bring me a book? Books are boring!”)

I don’t spend money getting more of stuff my kids aren’t interested in. A child who doesn’t like dolls isn’t going to get a bunch of dolls and be ordered to sit and play with them and nothing else. A child who doesn’t like trucks isn’t going to get a bunch of trucks.

Assuming that other parents are neither stupid (unable to tell which toys their children like) nor evil (forcing their children to play with specific toys even though they know they don’t like them,) I conclude that children’s toys reflect the children’s actual preferences, not the parents’ (for goodness’ sake, if it were up to me, I’d socialize my children to be super-geniuses who spend all of their time reading textbooks and whose toys are all science and math manipulatives, not toy dump trucks!)

Even young rhesus monkeys–who cannot talk and obviously have not been socialized into human gender norms–have the same gendered toy preferences as humans:

We compared the interactions of 34 rhesus monkeys, living within a 135 monkey troop, with human wheeled toys and plush toys. Male monkeys, like boys, showed consistent and strong preferences for wheeled toys, while female monkeys, like girls, showed greater variability in preferences. Thus, the magnitude of preference for wheeled over plush toys differed significantly between males and females. The similarities to human findings demonstrate that such preferences can develop without explicit gendered socialization.

Young female chimps also make their own dolls:

Now new research suggests that such gender-driven desires are also seen in young female chimpanzees in the wild—a behavior that possibly evolved to make the animals better mothers, experts say.

Young females of the Kanyawara chimpanzee community in Kibale National Park, Uganda, use sticks as rudimentary dolls and care for them like the group’s mother chimps tend to their real offspring. The behavior, which was very rarely observed in males, has been witnessed more than a hundred times over 14 years of study.

In Jane Goodall’s revolutionary research on the Gombe Chimps, she noted the behavior of young females who often played with or held their infant siblings, in contrast to young males who generally preferred not to.

And just as estradiol levels have an effect on how much cleaning women want to do, so androgen levels have an effect on which toys children prefer to play with:

Gonadal hormones, particularly androgens, direct certain aspects of brain development and exert permanent influences on sex-typical behavior in nonhuman mammals. Androgens also influence human behavioral development, with the most convincing evidence coming from studies of sex-typical play. Girls exposed to unusually high levels of androgens prenatally, because they have the genetic disorder, congenital adrenal hyperplasia (CAH), show increased preferences for toys and activities usually preferred by boys, and for male playmates, and decreased preferences for toys and activities usually preferred by girls. Normal variability in androgen prenatally also has been related to subsequent sex-typed play behavior in girls, and nonhuman primates have been observed to show sex-typed preferences for human toys. These findings suggest that androgen during early development influences childhood play behavior in humans at least in part by altering brain development.

But the author of the comic strip would like us to believe that gender roles are a result of watching the wrong stuff on TV:

And in which culture and media essentially portray women as mothers and wives, while men are heroes who go on fascinating adventures away from home.

I don’t know about you, but I grew up in the Bad Old Days of the 80s when She-Ra, Princess of Power, was kicking butt on TV; little girls were being magically transported to Ponyland to fight evil monsters; and Rainbow Brite defeated the evil King of Shadows and saved the Color Kids.


If you’re older than me, perhaps you grew up watching Wonder Woman (first invented in 1941) and Leia Skywalker; and if you’re younger, Dora the Explorer and Katniss Everdeen.

If you can’t find adventurous female characters in movies or TV, YOU AREN’T LOOKING.

I mentioned this recently: it’s like the Left has no idea what the past–anytime before last Tuesday–actually contained. Somehow the 60s, 70s, 80s, 90s, and 2000s have entirely disappeared, and they live in a timewarp where we are connected directly to the media and gender norms of over half a century ago.

Enough. The Guardian comic is a load of entitled whining from someone who actually thinks that other people are morally obligated to try to read her mind. She has the maturity of a bratty teenager (“You should have known I hate this band!”) and needs to learn how to actually communicate with others instead of complaining that it’s everyone else who has a problem.


Review: Numbers and the Making of Us, by Caleb Everett

I’m about halfway through Caleb Everett’s Numbers and the Making of Us: Counting and the Course of Human Cultures. Everett begins the book with a lengthy clarification that he thinks everyone in the world has equal math abilities; some of us just happen to have been exposed to more number ideas than others. Once that’s out of the way, the book gets interesting.

When did humans invent numbers? It’s hard to say. We have notched sticks from the Paleolithic, but no way to tell if the notches were meant to signify numbers or were just decoration.

The slightly more recent Ishango, Lebombo, and Wolf bones seem more likely to indicate that someone was at least counting–if not keeping track–of something.

The Ishango bone (estimated 20,000 years old, found in the Democratic Republic of the Congo near the headwaters of the Nile,) has three sets of notches–two sets total to 60, the third to 48. Interestingly, the notches are grouped: one set of sixty is composed entirely of primes, 19 + 17 + 13 + 11, while the other sums 9 + 19 + 21 + 11. The set of 48 contains groups of 3, 6, 4, 8, 10, 5, 5, and 7. Aside from the stray seven, the sequence tantalizingly suggests that someone was doubling numbers.

Ishango Bone

The Ishango bone also has a quartz point set into the end, which perhaps allowed it to be used for scraping, drawing, or etching–or perhaps it just looked nice atop someone’s decorated bone.

The Lebombo bone, (estimated 43,000-44,200 years old, found near the border between South Africa and Swaziland,) is quite similar to the Ishango bone, but only contains 29 notches (as far as we can tell–it’s broken.)

I’ve seen a lot of people proclaiming “Scientists think it was used to keep track of menstrual cycles. Menstruating African women were the first mathematicians!” so I’m just going to let you in on a little secret: scientists have no idea what it was for. Maybe someone was just having fun putting notches on a bone. Maybe someone was trying to count all of their relatives. Maybe someone was counting days between new and full moons, or counting down to an important date.

Without a far richer archaeological assemblage than one bone, we have no idea what this particular person might have wanted to count or keep track of. (Also, why would anyone want to keep track of menstrual cycles? You’ll know when they happen.)

The Wolf bone (30,000 years old, Czech Republic,) has received far less attention from folks interested in proclaiming that menstruating African women were the first mathematicians, but is a nice-looking artifact with 60 notches–notches 30 and 31 are significantly longer than the others, as though marking a significant place in the counting (or perhaps just the middle of the pattern.)

Everett cites another, more satisfying tally stick: a 10,000 year old piece of antler found in the anoxic waters of Little Salt Spring, Florida. The antler contains two sets of marks: 28 (or possibly 29–the top is broken in a way that suggests another notch might have been a weak point contributing to the break) large, regular, evenly spaced notches running up the antler, and a much smaller set of notches set beside and just slightly beneath the first. It definitely looks like someone was ticking off quantities of something they wanted to keep track of.

Here’s an article with more information on Little Salt Spring and a good photograph of the antler.

I consider the bones “maybes” and the Little Salt Spring antler a definite for counting/keeping track of quantities.

Inca Quipu

Everett also mentions a much more recent and highly inventive tally system: the Incan quipu.

A quipu is made of knotted strings attached to one central string. A series of knots along the length of each string denotes numbers–one knot for 1, two for 2, etc. The knots are grouped in clusters, allowing place value–first cluster for the ones, second for the tens, third for hundreds, etc. (And a blank space for a zero.)

Thus a sequence of 2 knots, 4 knots, a space, and 5 knots (reading up from the ones cluster) = 5,042
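
If it helps to see that scheme spelled out, here is a minimal sketch in Python (my own simplification–real quipus also used different knot types, which this ignores) of reading a cord’s knot clusters as place values:

```python
def decode_quipu_cord(clusters):
    """Read a cord's knot clusters as place values.

    `clusters` lists the number of knots in each cluster, starting from
    the ones cluster and moving up the cord; 0 stands for an empty
    space, the quipu's "zero."
    """
    total = 0
    for place, knots in enumerate(clusters):
        total += knots * 10 ** place
    return total

# The example from the text: 2 knots, 4 knots, a space, then 5 knots.
print(decode_quipu_cord([2, 4, 0, 5]))  # 5042
```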

The Incas, you see, had an empire to administer, no paper, but plenty of lovely alpaca wool. So being inventive people, they made do.

Everett then discusses the construction of names for numbers/base systems in different languages. Many languages use a combination of different bases, eg, “two twos” for four, (base 2,) “two hands” to signify 10 (base 5,) and from there, words for multiples of 10 or 20, (base 10 or 20,) can all appear in the same language. He argues convincingly that most languages derived their counting words from our original tally sticks: fingers and toes, found in quantities of 5, 10, and 20. So the number for 5 in a language might be “one hand”, the number for 10, “Two hands,” and the number for 20 “one person” (two hands + two feet.) We could express the number 200 in such a language by saying “two hands of one person”= 10 x 20.

(If you’re wondering how anyone could come up with a base 60 system, such as we inherited from the Babylonians for telling time, try using the knuckles of the four fingers on one hand [12] times the fingers of the other hand [5] to get 60.)
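As a rough illustration of how such body-part tallies generate number words, here is a small Python sketch (the glosses and the pure base-20 reading are my own simplification, not Everett’s notation):

```python
def body_count(n):
    """Gloss n in a hands-and-persons counting scheme:
    a "hand" is 5 fingers, a "person" is 20 (two hands plus two feet)."""
    persons, rest = divmod(n, 20)
    hands, fingers = divmod(rest, 5)
    parts = []
    if persons:
        parts.append(f"{persons} person(s)")
    if hands:
        parts.append(f"{hands} hand(s)")
    if fingers:
        parts.append(f"{fingers} finger(s)")
    return ", ".join(parts) if parts else "nothing"

print(body_count(17))   # 3 hand(s), 2 finger(s)
print(body_count(200))  # 10 person(s) -- "two hands of one person," 10 x 20
print(body_count(60))   # 3 person(s) -- or 12 knuckles x 5 fingers, the Babylonian 60
```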

Which begs the question of what counts as a “number” word (numeral). Some languages, it is claimed, don’t have words for numbers higher than 3–but put out an array of 6 objects, and their speakers can construct numbers like “three twos.” Is this a number? What about an English number like four-teen, really just a longstanding mispronunciation of four and ten?

Perhaps a better question than “Do they have a word for it,” is “Do they have a common, easy to use word for it?” English contains the word nonillion, but you probably don’t use it very often (and according to the dictionary, a nonillion is much bigger in Britain than in the US, which makes it especially useless.) By contrast, you probably use quantities like a hundred or a thousand all the time, especially when thinking about household budgets.

Roman Numerals are really just an advanced tally system with two bases: 5 and 10. IIII are clearly regular tally marks. V (5) is similar to our practice of crossing through four tally marks. X (10) is two Vs set together. L (50) is a rotated V. C (100) is an abbreviation for the Roman word Centum, hundred. (I, V, X, and L are not abbreviations.) I’m not sure why 500 is D; maybe just because D follows C and it looks like a C with an extra line. M is short for Mille, or thousand. Roman numerals are also fairly unusual in their use of subtraction in writing numbers, which few systems do because it makes addition horrible. Eg, IV and VI are not the same number, nor do they equal 15 and 51. No, they equal 4 (V-1) and 6 (V+1), respectively. Adding or multiplying large Roman numerals quickly becomes cumbersome; if you don’t believe me, try XLVII times XVIII with only a pencil and paper.
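
To see how much of the pain is in the notation itself, here is a short Python sketch (my own, not from the book) that converts Roman numerals to integers, handling the subtractive rule; once converted, the multiplication becomes trivial:

```python
ROMAN = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(numeral):
    """Convert a Roman numeral to an integer; a smaller symbol written
    before a larger one (the I in IV, the X in XL) is subtracted."""
    total = 0
    for i, ch in enumerate(numeral):
        value = ROMAN[ch]
        if i + 1 < len(numeral) and ROMAN[numeral[i + 1]] > value:
            total -= value
        else:
            total += value
    return total

print(roman_to_int("IV"), roman_to_int("VI"))          # 4 6
print(roman_to_int("XLVII") * roman_to_int("XVIII"))   # 47 * 18 = 846
```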

Now imagine you’re trying to run an empire this way.

You’re probably thinking, “At least those quipus had a zero and were reliably base ten,” about now.

Interestingly, the Mayans (and possibly the Olmecs) already had a proper symbol that they used for zero in their combination base-5/base-20 system with pretty functional place value at a time when the Greeks and Romans did not (the ancient Greeks were philosophically unsure about this concept of a “number that isn’t there.”)
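For comparison, here is a quick Python sketch of the Mayan bar-and-dot idea (simplified: the glyphs were actually stacked vertically, and in calendrical use the third place was 18 × 20 rather than 20 × 20):

```python
def maya_digits(n):
    """Break n into base-20 digits, most significant first."""
    digits = []
    while True:
        n, d = divmod(n, 20)
        digits.append(d)
        if n == 0:
            break
    return digits[::-1]

def render_digit(d):
    """Render one digit Maya-style: bars are worth 5, dots worth 1,
    and a stylized shell stands for zero."""
    return "(shell)" if d == 0 else "-" * (d // 5) + "." * (d % 5)

# 401 = 1*400 + 0*20 + 1: a dot, a shell (zero), and another dot.
print([render_digit(d) for d in maya_digits(401)])  # ['.', '(shell)', '.']
```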

(Note: given the level of sophistication of Native American civilizations like the Inca, Aztec, and Maya, and the fact that these developed in near total isolation, they must have been pretty smart. Their current populations appear to be under-performing relative to their ancestors.)

But let’s let Everett have a chance to speak:

Our increasingly refined means of survival and adaptation are the result of a cultural ratchet. This term, popularized by Duke University psychologist and primatologist Michael Tomasello, refers to the fact that humans cooperatively lock in knowledge from one generation to the next, like the clicking of a ratchet. In other words, our species’ success is due in large measure to individual members’ ability to learn from and emulate the advantageous behavior of their predecessors and contemporaries in their community. What makes humans special is not simply that we are so smart, it is that we do not have to continually come up with new solutions to the same old problems. …

Now this is eminently reasonable; I did not invent the calculus, nor could I have done so had it not already existed. Luckily for me, Newton and Leibniz already invented it and I live in a society that goes to great lengths to encode math in textbooks and teach it to students.

I call this “cultural knowledge” or “cultural memory,” and without it we’d still be monkeys with rocks.

The importance of gradually acquired knowledge stored in the community, culturally reified but not housed in the mind of any one individual, crystallizes when we consider cases in which entire cultures have nearly gone extinct because some of their stored knowledge dissipated due to the death of individuals who served as crucial nodes in their community’s knowledge network. In the case of the Polar Inuit of Northwest Greenland, population declined in the mid-nineteenth century after an epidemic killed several elders of the community. These elders were buried along with their tools and weapons, in accordance with local tradition, and the Inuits’ ability to manufacture the tools and weapons in question was severely compromised. … As a result, their population did not recover until about 40 years later, when contact with another Inuit group allowed for the restoration of the communal knowledge base.

The first big advance, the one that separates us from the rest of the animal kingdom, was language itself. Yes, other animals can communicate–whales and birds sing; bees do their waggle dance–but only humans have full-fledged, generative language which allows us to both encode and decode new ideas with relative ease. Language lets different people in a tribe learn different things and then pool their ideas far more efficiently than mere imitation.

The next big leap was the development of visual symbols we could record–and read–on wood, clay, wax, bones, cloth, cave walls, etc. Everett suggests that the first of these symbols were likely tally marks such as those found on the Lebombo bone, though of course the ability to encode a buffalo on the wall of the Lascaux cave, France, was also significant. From these first symbols we developed both numbers and letters, which eventually evolved into books.

Books are incredible. Books are like external hard drives for your brain, letting you store, access, and transfer information to other people well beyond your own limits of memorization and well beyond a human lifetime. Books reach across the ages, allowing us to read what philosophers, poets, priests and sages were thinking about a thousand years ago.

Recently we invented an even more incredible information storage/transfer device: computers/the internet. To be fair, they aren’t as sturdy as clay tablets, (fired clay is practically immortal,) but they can handle immense quantities of data–and make it searchable, an incredibly important task.

But Everett tries to claim that cultural ratchet is all there is to human mathematical ability. If you live in a society with calculus textbooks, then you can learn calculus, and if you don’t, you can’t. Everett does not want to imply that Amazonian tribesmen with no words for numbers bigger than three are in any way less able to do math than the Mayans with their place value system and fancy zero.

But this seems unlikely for two reasons. First, we know very well that even in societies with calculus textbooks, not everyone can make use of them. Even among my own children, who have been raised with about as similar an environment as a human can make and have very similar genetics, there’s a striking difference in intellectual strengths and weaknesses. Humans are not identical in their abilities.

Moreover, we know that different mental tasks are performed in different, specialized parts of the brain. For example, we decode letters in the “visual word form area” of the brain; people whose VWAs have been damaged can still read, but they have to use different parts of their brains to work out the letters and they end up reading more slowly than they did before.

Memorably, before he died, the late Henry Harpending (of West Hunter) had a stroke while in Germany. He initially didn’t notice the stroke because it was located in the part of the brain that decodes letters into words, but since he was in Germany, he didn’t expect to read the words, anyway. It was only when he looked at something written in English later that day that he realized he couldn’t read it, and soon after I believe he passed out and was taken to the hospital.

Why should our brains have a VWA at all? It’s not like our primate ancestors did a whole lot of reading. It turns out that the VWA is repurposed from the part of our brain that recognizes faces :)

Likewise, there are specific regions of the brain that handle mathematical tasks. People who are better at math not only have more gray matter in these regions, but they also have stronger connections between them, letting them work together in harmony to solve different problems. We don’t do math by just throwing all of our mental power at a problem, but by routing it through specific regions of our brain.

Interestingly, humans and chimps differ in their ability to recognize faces and perceive emotions. (For anatomical reasons, chimps are more inclined to identify each other’s bottoms than each other’s faces.) We evolved the ability to recognize faces–the region of our brain we use to decode letters–when we began walking upright and interacting with each other face to face, though we do have some vestigial interest in butts and butt-like regions (“My eyes are up here.”) Our brains have evolved over the millennia to get better at specific tasks–in this case, face reading, a precursor to decoding symbolic language.

And there is a tremendous quantity of evidence that intelligence is at least partly genetic–estimates for the heritability of intelligence range between 60 and 80%. The rest of the variation–the environmental part–looks to be essentially random chance, such as accidents, nutrition, or perhaps your third grade teacher.

So, yes, we absolutely can breed people for mathematical or linguistic ability, if that’s what the environment is selecting for. By contrast, if there have been no particular mathematical or linguistic selection pressures in an environment (a culture with no written language, mathematical notation, and very few words for numbers clearly is not experiencing much pressure to use them), then you won’t select for such abilities. The question is not whether we can all be Newtons, (or Leibnizes,) but how many Newtons a society produces and how many people in that society have the potential to understand calculus, given the chance.

I do wonder why he made the graph so much bigger than the relevant part
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article

Just looking at the state of different societies around the world (including many indigenous groups that live within and have access to modern industrial or post-industrial technologies), there is clear variation in the average abilities of different groups to build and maintain complex societies. Japanese cities are technologically advanced, clean, and violence-free. Brazil, (which hasn’t even been nuked,) is full of incredibly violent, unsanitary, poorly-constructed favelas. Some of this variation is cultural, (Venezuela is doing particularly badly because communism doesn’t work,) or random chance, (Saudi Arabia has oil,) but some of it, by necessity, is genetic.

But if you find that a depressing thought, take heart: selective pressures can be changed. Start selecting for mathematical and verbal ability (and let everyone have a shot at developing those abilities) and you’ll get more mathematical and verbal abilities.

But this is getting long, so let’s continue our discussion next week.

2 Interesting studies: Early Humans in SE Asia and Genetics, Relationships, and Mental Illness

Ancient Teeth Push Back Early Arrival of Humans in Southeast Asia:

New tests on two ancient teeth found in a cave in Indonesia more than 120 years ago have established that early modern humans arrived in Southeast Asia at least 20,000 years earlier than scientists previously thought, according to a new study. …

The findings push back the date of the earliest known modern human presence in tropical Southeast Asia to between 63,000 and 73,000 years ago. The new study also suggests that early modern humans could have made the crossing to Australia much earlier than the commonly accepted time frame of 60,000 to 65,000 years ago.

I would like to emphasize that nothing based on a couple of teeth is conclusive, “settled,” or “proven” science. Samples can get contaminated, machines make errors, people play tricks–in the end, we’re looking for the weight of the evidence.

I am personally of the opinion that there were (at least) two ancient human migrations into south east Asia, but only time will tell if I am correct.

Genome-wide association study of social relationship satisfaction: significant loci and correlations with psychiatric conditions, by Varun Warrier, Thomas Bourgeron, Simon Baron-Cohen:

We investigated the genetic architecture of family relationship satisfaction and friendship satisfaction in the UK Biobank. …

In the DSM-5, difficulties in social functioning is one of the criteria for diagnosing conditions such as autism, anorexia nervosa, schizophrenia, and bipolar disorder. However, little is known about the genetic architecture of social relationship satisfaction, and if social relationship dissatisfaction genetically contributes to risk for psychiatric conditions. …

We present the results of a large-scale genome-wide association study of social relationship satisfaction in the UK Biobank measured using family relationship satisfaction and friendship satisfaction. Despite the modest phenotypic correlations, there was a significant and high genetic correlation between the two phenotypes, suggesting a similar genetic architecture between the two phenotypes.

Note: the two “phenotypes” here are “family relationship satisfaction” and “friendship satisfaction.”

We first investigated if the two phenotypes were genetically correlated with psychiatric conditions. As predicted, most if not all psychiatric conditions had a significant negative correlation for the two phenotypes. … We observed significant negative genetic correlation between the two phenotypes and a large cross-condition psychiatric GWAS38. This underscores the importance of social relationship dissatisfaction in psychiatric conditions. …

In other words, people with mental illnesses generally don’t have a lot of friends nor get along with their families.

One notable exception is the negative genetic correlation between measures of cognition and the two phenotypes. Whilst subjective wellbeing is positively genetically correlated with measures of cognition, we identify a small but statistically significant negative correlation between measures of cognition and the two phenotypes.

Are they saying that smart people have fewer friends? Or that dumber people are happier with their friends and families? I think they are clouding this finding in intentionally obtuse language.

A recent study highlighted that people with very high IQ scores tend to report lower satisfaction with life with more frequent socialization.

Oh, I think I read that one. It’s not the socialization per se that’s the problem, but spending time away from the smart person’s intellectual activities. For example, I enjoy discussing the latest genetics findings with friends, but I don’t enjoy going on family vacations because they are a lot of work that does not involve genetics. (This is actually something my relatives complain about.)

…alleles that increase the risk for schizophrenia are in the same haplotype as alleles that decrease friendship satisfaction. The functional consequences of this locus must be formally tested. …

Loss of function mutations in these genes lead to severe biochemical consequences, and are implicated in several neuropsychiatric conditions. For example, de novo loss of function mutations in pLI intolerant genes confers significant risk for autism. Our results suggest that pLI > 0.9 genes contribute to psychiatric risk through both common and rare genetic variation.

When Did Black People Evolve?

In previous posts, we discussed the evolution of Whites and Asians, so today we’re taking a look at people from Sub-Saharan Africa.

Modern humans only left Africa about 100,000 to 70,000 years ago, and split into Asians and Caucasians around 40,000 years ago. Their modern appearances came later–white skin, light hair, and light eyes, for example, only evolved in the past 20,000 and possibly within the past 10,000 years.

What about the Africans, or specifically, Sub-Saharans? (North Africans, like Tunisians and Moroccans, are in the Caucasian clade.) When did their phenotypes evolve?

The Sahara, an enormous desert about the size of the United States, is one of the world’s biggest, most ancient barriers to human travel. The genetic split between SSAs and non-SSAs, therefore, is one of the oldest and most substantial among human populations. But there are even older splits within Africa–some of the ancestors of today’s Pygmies and Bushmen may have split off from other Africans 200,000-300,000 years ago. We’re not sure, because the study of archaic African DNA is still in its infancy.

Some anthropologists refer to Bushmen as “gracile,” which means they are a little shorter than average Europeans and not stockily built

The Bushmen present an interesting case, because their skin is quite light (for Africans.) I prefer to call it golden. The nearby Damara of Namibia, by contrast, are one of the world’s darkest peoples. (The peoples of South Sudan, eg Malik Agar, may be darker, though.) The Pygmies are the world’s shortest peoples; the peoples of South Sudan, such as the Dinka and Shiluk, are among the world’s tallest.

Sub-Saharan Africa’s ethnic groups can be grouped, very broadly, into Bushmen, Pygmies, Bantus (aka Niger-Congo), Nilotics, and Afro-Asiatics. Bushmen and Pygmies are extremely small groups, while Bantus dominate the continent–about 85% of Sub Saharan Africans speak a language from the Niger-Congo family. The Afro-Asiatic groups, as their name implies, have had extensive contact with North Africa and the Middle East.

Most of America’s black population hails from West Africa–that is, the primarily Bantu region. The Bantus and similar-looking groups among the Nilotics and Afro-Asiatics (like the Hausa) therefore have both Africa’s most iconic and most common phenotypes.

For the sake of this post, we are not interested in the evolution of traits common to all humans, such as bipedalism. We are only interested in those traits generally shared by most Sub-Saharans and generally not shared by people outside of Africa.

detailed map of African and Middle Eastern ethnicities in Haak et al.’s dataset

One striking trait is black hair: it is distinctively “curly” or “frizzy.” Chimps and gorillas do not have curly hair. Neither do whites and Asians. (Whites and Asians, therefore, more closely resemble chimps in this regard.) Only Africans and a smattering of other equatorial peoples like Melanesians have frizzy hair.

Black skin is similarly distinct. Chimps, who live in the shaded forest and have fur, do not have high levels of melanin all over their bodies. While chimps naturally vary in skin tone, an unfortunate, hairless chimp is practically “white.”

Humans therefore probably evolved both black skin and frizzy hair at about the same time–when we came out of the shady forests and began running around on the much sunnier savannahs. Frizzy hair seems well-adapted to cooling–by standing on end, it lets air flow between the follicles–and of course melanin is protective from the sun’s rays. (And apparently, many of the lighter-skinned Bushmen suffer from skin cancer.)

Steatopygia also comes to mind, though I don’t know if anyone has studied its origins.

According to Wikipedia, additional traits common to Sub-Saharan Africans include:

In modern craniofacial anthropometry, Negroid describes features that typify skulls of black people. These include a broad and round nasal cavity; no dam or nasal sill; Quonset hut-shaped nasal bones; notable facial projection in the jaw and mouth area (prognathism); a rectangular-shaped palate; a square or rectangular eye orbit shape;[21] a large interorbital distance; a more undulating supraorbital ridge;[22] and large, megadontic teeth.[23] …

Modern cross-analysis of osteological variables and genome-wide SNPs has identified specific genes, which control this craniofacial development. Of these genes, DCHS2, RUNX2, GLI3, PAX1 and PAX3 were found to determine nasal morphology, whereas EDAR impacts chin protrusion.[27] …

Ashley Montagu lists “neotenous structural traits in which…Negroids [generally] differ from Caucasoids… flattish nose, flat root of the nose, narrower ears, narrower joints, frontal skull eminences, later closure of premaxillary sutures, less hairy, longer eyelashes, [and] cruciform pattern of second and third molars.”[28]

The Wikipedia page on Dark Skin states:

As hominids gradually lost their fur (between 4.5 and 2 million years ago) to allow for better cooling through sweating, their naked and lightly pigmented skin was exposed to sunlight. In the tropics, natural selection favoured dark-skinned human populations as high levels of skin pigmentation protected against the harmful effects of sunlight. Indigenous populations’ skin reflectance (the amount of sunlight the skin reflects) and the actual UV radiation in a particular geographic area is highly correlated, which supports this idea. Genetic evidence also supports this notion, demonstrating that around 1.2 million years ago there was a strong evolutionary pressure which acted on the development of dark skin pigmentation in early members of the genus Homo.[25]

About 7 million years ago human and chimpanzee lineages diverged, and between 4.5 and 2 million years ago early humans moved out of rainforests to the savannas of East Africa.[23][28] They not only had to cope with more intense sunlight but had to develop a better cooling system. …

Skin colour is a polygenic trait, which means that several different genes are involved in determining a specific phenotype. …

Data collected from studies on the MC1R gene has shown that there is a lack of diversity in dark-skinned African samples in the allele of the gene compared to non-African populations. This is remarkable given that the number of polymorphisms for almost all genes in the human gene pool is greater in African samples than in any other geographic region. So, while the MC1R gene does not significantly contribute to variation in skin colour around the world, the allele found in high levels in African populations probably protects against UV radiation and was probably important in the evolution of dark skin.[57][58]

Skin colour seems to vary mostly due to variations in a number of genes of large effect as well as several other genes of small effect (TYR, TYRP1, OCA2, SLC45A2, SLC24A5, MC1R, KITLG and SLC24A4). This does not take into account the effects of epistasis, which would probably increase the number of related genes.[59] Variations in the SLC24A5 gene account for 20–25% of the variation between dark and light skinned populations of Africa,[60] and appear to have arisen as recently as within the last 10,000 years.[61] The Ala111Thr or rs1426654 polymorphism in the coding region of the SLC24A5 gene reaches fixation in Europe, and is also common among populations in North Africa, the Horn of Africa, West Asia, Central Asia and South Asia.[62][63][64]

That’s rather interesting about MC1R. It could imply that the difference in skin tone between SSAs and non-SSAs is due to active selection in Blacks for dark skin and relaxed selection in non-Blacks, rather than active selection for light skin in non-Blacks.

The page on MC1R states:

MC1R is one of the key proteins involved in regulating mammalian skin and hair color. …It works by controlling the type of melanin being produced, and its activation causes the melanocyte to switch from generating the yellow or red phaeomelanin by default to the brown or black eumelanin in replacement. …

This is consistent with active selection being necessary to produce dark skin, and relaxed selection producing lighter tones.

Studies show the MC1R Arg163Gln allele has a high frequency in East Asia and may be part of the evolution of light skin in East Asian populations.[40] No evidence is known for positive selection of MC1R alleles in Europe[41] and there is no evidence of an association between MC1R and the evolution of light skin in European populations.[42] The lightening of skin color in Europeans and East Asians is an example of convergent evolution.

However, we should also note:

Dark-skinned people living in low sunlight environments have been recorded to be very susceptible to vitamin D deficiency due to reduced vitamin D synthesis. A dark-skinned person requires about six times as much UVB as lightly pigmented persons.

PCA graph and map of sampling locations. Modern people are indicated with gray circles.

Unfortunately, most of the work on human skin tones has been done among Europeans (and, oddly, zebra fish,) limiting our knowledge about the evolution of African skin tones, which is why this post has been sitting in my draft file for months. Luckily, though, two recent studies–Loci Associated with Skin Pigmentation Identified in African Populations and Reconstructing Prehistoric African Population Structure–have shed new light on African evolution.

In Reconstructing Prehistoric African Population Structure, Skoglund et al assembled genetic data from 16 prehistoric Africans and compared them to DNA from nearby present-day Africans. They found:

  1. The ancestors of the Bushmen (aka the San/KhoiSan) once occupied a much wider area.
  2. They contributed about two-thirds of the ancestry of ancient Malawi hunter-gatherers (around 8,100-2,500 YA)
  3. They contributed about one-third of the ancestry of ancient Tanzanian hunter-gatherers (around 1,400 YA)
  4. Farmers (Bantus) spread from west Africa, completely replacing hunter-gatherers in some areas
  5. Modern Malawians are almost entirely Bantu.
  6. A Tanzanian pastoralist population from 3,100 YA spread out across east Africa and into southern Africa
  7. Bushmen ancestry was not found in modern Hadza, even though they are hunter-gatherers and speak a click language like the Bushmen.
  8. The Hadza more likely derive most of their ancestry from ancient Ethiopians
  9. Modern Bantu-speakers in Kenya derive from a mix between western Africans and Nilotics around 800-400 years ago.
  10. Middle Eastern (Levant) ancestry is found across eastern Africa from an admixture event that occurred around 3,000 YA, or around the same time as the Bronze Age Collapse.
  11. A small amount of Iranian DNA arrived more recently in the Horn of Africa
  12. Ancient Bushmen were more closely related to modern eastern Africans like the Dinka (Nilotics) and Hadza than to modern west Africans (Bantus),
  13. This suggests either complex relationships between the groups or that some Bantus may have had ancestors from an unknown group of humans more ancient than the Bushmen.
  14. Modern Bushmen have been evolving darker skins
  15. Pygmies have been evolving shorter stature
Automated clustering of ancient and modern populations (moderns in gray)

I missed #12-13 in my previous post about this paper, though I did note that the more data we get on ancient African groups, the more likely I think we are to find ancient admixture events. If humans can mix with Neanderthals and Denisovans, then surely our ancestors could have mixed with Ergaster, Erectus, or whoever else was wandering around.

Distribution of ancient Bushmen and Ethiopian DNA in south and east Africa

#15 is interesting, and consistent with the claim that Bushmen suffer from a lot of skin cancer–before the Bantu expansion, they lived in far more forgiving climates than the Kalahari desert. But since Bushmen are already lighter than their neighbors, this begs the question of how light their ancestors–who had no Levantine admixture–were. Could the Bantus’ and Nilotics’ darker skins have evolved after the Bushmen/everyone else split?

Meanwhile, in Loci Associated with Skin Pigmentation Identified in African Populations, Crawford et al used genetic samples from 1,570 people from across Africa to find six genetic areas–SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2–which account for almost 30% of the local variation in skin color.

Bantu (green) and Levantine/pastoralist DNA in modern peoples

SLC24A5 is a light pigment introduced to east Africa from the Levant, probably around 3,000 years ago. Today, it is common in Ethiopia and Tanzania.

Interestingly, according to the article, “At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations.”

These are the world’s other darkest peoples, such as the Jarawas of the Andaman Islands or the Melanesians of Bougainville, PNG. (And, I assume, some groups from India such as the Tamils.) This implies that these groups (1) had dark skin already when they left Africa, and (2) never lost it on their way to their current homes. (If they had gotten lighter during their journey and then darkened again upon arrival, they likely would have different skin color variants than their African cousins.)

This implies that even if the Bushmen split off (around 200,000-300,000 YA) before dark skin evolved, it had evolved by the time people left Africa and headed toward Australia (around 100,000-70,000 YA.) This gives us a minimum threshold: it most likely evolved before 70,000 YA.

(But as always, we should be careful because perhaps there are even more skin color variants that we don’t know about yet in these populations.)

MFSD12 is common among Nilotics and is related to darker skin.

And according to the abstract, which Razib Khan posted:

Further, the alleles associated with skin pigmentation at all loci but SLC24A5 are ancient, predating the origin of modern humans. The ancestral alleles at the majority of predicted causal SNPs are associated with light skin, raising the possibility that the ancestors of modern humans could have had relatively light skin color, as is observed in the San population today.

The full article is not out yet, so I still don’t know when all of these light and dark alleles emerged, but the order is absolutely intriguing. For now, it looks like this mystery will still have to wait.


Two Exciting Papers on African Genetics

I loved that movie
Nǃxau ǂToma (aka Gcao Tekene Coma), Bushman star of “The Gods Must Be Crazy,” roughly 1944-2003

An interesting article on Clues to Africa’s Mysterious Past appeared recently in the NY Times:

It was only two years ago that researchers found the first ancient human genome in Africa: a skeleton in a cave in Ethiopia yielded DNA that turned out to be 4,500 years old.

On Thursday, an international team of scientists reported that they had recovered far older genes from bone fragments in Malawi dating back 8,100 years. The researchers also retrieved DNA from 15 other ancient people in eastern and southern Africa, and compared the genes to those of living Africans.

Let’s skip to the article, Reconstructing Prehistoric African Population Structure by Skoglund et al:

We assembled genome-wide data from 16 prehistoric Africans. We show that the anciently divergent lineage that comprises the primary ancestry of the southern African San had a wider distribution in the past, contributing approximately two-thirds of the ancestry of Malawi hunter-gatherers ∼8,100–2,500 years ago and approximately one-third of the ancestry of Tanzanian hunter-gatherers ∼1,400 years ago.

Paths of the great Bantu Migration

The San are also known as the Bushmen, a famous group of recent hunter-gatherers from southern Africa.

We document how the spread of farmers from western Africa involved complete replacement of local hunter-gatherers in some regions…

This is most likely the Great Bantu Migration, which I wrote about in Into Africa: the Great Bantu Migration.

…and we track the spread of herders by showing that the population of a ∼3,100-year-old pastoralist from Tanzania contributed ancestry to people from northeastern to southern Africa, including a ∼1,200-year-old southern African pastoralist…

Whereas the two individuals buried in ∼2,000 BP hunter-gatherer contexts in South Africa share ancestry with southern African Khoe-San populations in the PCA, 11 of the 12 ancient individuals who lived in eastern and south-central Africa between ∼8,100 and ∼400 BP form a gradient of relatedness to the eastern African Hadza on the one hand and southern African Khoe-San on the other (Figure 1A).

The Hadza are a hunter-gatherer group from Tanzania who are not obviously related to any other people. Their language has traditionally been classed alongside the languages of the KhoiSan/Bushmen people because they all contain clicks, but the languages otherwise have very little in common and Hadza appears to be a language isolate, like Basque.

The genetic cline correlates to geography, running along a north-south axis with ancient individuals from Ethiopia (∼4,500 BP), Kenya (∼400 BP), Tanzania (both ∼1,400 BP), and Malawi (∼8,100–2,500 BP), showing increasing affinity to southern Africans (both ancient individuals and present-day Khoe-San). The seven individuals from Malawi show no clear heterogeneity, indicating a long-standing and distinctive population in ancient Malawi that persisted for at least ∼5,000 years (the minimum span of our radiocarbon dates) but which no longer exists today. …

We find that ancestry closely related to the ancient southern Africans was present much farther north and east in the past than is apparent today. This ancient southern African ancestry comprises up to 91% of the ancestry of Khoe-San groups today (Table S5), and also 31% ± 3% of the ancestry of Tanzania_Zanzibar_1400BP, 60% ± 6% of the ancestry of Malawi_Fingira_6100BP, and 65% ± 3% of the ancestry of Malawi_Fingira_2500BP (Figure 2A). …

Both unsupervised clustering (Figure 1B) and formal ancestry estimation (Figure 2B) suggest that individuals from the Hadza group in Tanzania can be modeled as deriving all their ancestry from a lineage related deeply to ancient eastern Africans such as the Ethiopia_4500BP individual …

So what’s up with the Tanzanian expansion mentioned in the summary?

Western-Eurasian-related ancestry is pervasive in eastern Africa today … and the timing of this admixture has been estimated to be ∼3,000 BP on average… We found that the ∼3,100 BP individual… associated with a Savanna Pastoral Neolithic archeological tradition, could be modeled as having 38% ± 1% of her ancestry related to the nearly 10,000-year-old pre-pottery farmers of the Levant. These results could be explained by migration into Africa from descendants of pre-pottery Levantine farmers or alternatively by a scenario in which both pre-pottery Levantine farmers and Tanzania_Luxmanda_3100BP descend from a common ancestral population that lived thousands of years earlier in Africa or the Near East. We fit the remaining approximately two-thirds of Tanzania_Luxmanda_3100BP as most closely related to the Ethiopia_4500BP…

…present-day Cushitic speakers such as the Somali cannot be fit simply as having Tanzania_Luxmanda_3100BP ancestry. The best fitting model for the Somali includes Tanzania_Luxmanda_3100BP ancestry, Dinka-related ancestry, and 16% ± 3% Iranian-Neolithic-related ancestry (p = 0.015). This suggests that ancestry related to the Iranian Neolithic appeared in eastern Africa after earlier gene flow related to Levant Neolithic populations, a scenario that is made more plausible by the genetic evidence of admixture of Iranian-Neolithic-related ancestry throughout the Levant by the time of the Bronze Age …and in ancient Egypt by the Iron Age …

There is then a discussion of possible models of ancient African population splits (were the Bushmen the first? How long have they been isolated?) I suspect the more ancient African DNA we uncover, the more complicated the tree will become, just as in Europe and Asia we’ve discovered Neanderthal and Denisovan admixture.

They also compared genomes to look for genetic adaptations and found evidence for selection for taste receptors and “response to radiation” in the Bushmen, which the authors note “could be due to exposure to sunlight associated with the life of the ‡Khomani and Ju|’hoan North people in the Kalahari Basin, which has become a refuge for hunter-gatherer populations in the last millennia due to encroachment by pastoralist and agriculturalist groups.”

(The Bushmen are lighter than Bantus, with a more golden or tan skin tone.)

They also found evidence of selection for short stature among the Pygmies (which isn’t really surprising to anyone, unless you thought they had acquired their height by admixture with another very short group of people.)

Overall, this is a great paper and I encourage you to RTWT, especially the pictures/graphs.

Now, if that’s not enough African DNA for you, we also have Loci Associated with Skin Pigmentation Identified in African Populations, by Crawford et al:

Examining ethnically diverse African genomes, we identify variants in or near SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2 that are significantly associated with skin pigmentation. Genetic evidence indicates that the light pigmentation variant at SLC24A5 was introduced into East Africa by gene flow from non-Africans. At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations. Functional analyses indicate that MFSD12 encodes a lysosomal protein that affects melanogenesis in zebrafish and mice, and that mutations in melanocyte-specific regulatory regions near DDB1/TMEM138 correlate with expression of UV response genes under selection in Eurasians.

I’ve had an essay on the evolution of African skin tones sitting in my draft folder for ages because this research hadn’t been done. There’s plenty of research on European and Asian skin tones (skin appears to have significantly lightened around 10,000 years ago in Europeans,) but much less on Africans. Luckily for me, this paper fixes that.

Looks like SLC24A5 is related to that Levantine/Iranian back-migration into Africa documented in the first paper.

Are “Nerds” Just a Hollywood Stereotype?

Yes, MIT has a football team.

The other day on Twitter, Nick B. Steves challenged me to find data supporting or refuting his assertion that Nerds vs. Jocks is a false stereotype, invented around 1975. Of course, we HBDers have a saying–“all stereotypes are true,” even the ones about us–but let’s investigate Nick’s claim and see where it leads us.

(NOTE: If you have relevant data, I’d love to see it.)

Unfortunately, terms like “nerd,” “jock,” and “chad” are not all that well defined. Certainly if we define “jock” as “athletic but not smart” and nerd as “smart but not athletic,” then these are clearly separate categories. But what if there’s a much bigger group of people who are smart and athletic?

Or what if we are defining “nerd” and “jock” too narrowly? Wikipedia defines nerd as, “a person seen as overly intellectual, obsessive, or lacking social skills.” I recall a study–which I cannot find right now–which found that nerds had, overall, lower-than-average IQs, but that study included people who were obsessive about things like comic books, not just people who majored in STEM. Similarly, should we define “jock” only as people who are good at sports, or do passionate sports fans count?

For the sake of this post, I will define “nerd” as “people with high math/science abilities” and “jock” as “people with high athletic abilities,” leaving the matter of social skills undefined. (People who merely like video games or watch sports, therefore, do not count.)

Nick is correct on one count: according to Wikipedia, although the word “nerd” has been around since 1951, it was popularized during the 70s by the sitcom Happy Days. However, Wikipedia also notes that:

An alternate spelling,[10] as nurd or gnurd, also began to appear in the mid-1960s or early 1970s.[11] Author Philip K. Dick claimed to have coined the nurd spelling in 1973, but its first recorded use appeared in a 1965 student publication at Rensselaer Polytechnic Institute.[12][13] Oral tradition there holds that the word is derived from knurd (drunk spelled backward), which was used to describe people who studied rather than partied. The term gnurd (spelled with the “g”) was in use at the Massachusetts Institute of Technology by 1965.[14] The term nurd was also in use at the Massachusetts Institute of Technology as early as 1971 but was used in the context for the proper name of a fictional character in a satirical “news” article.[15]

suggesting that the word was already common among nerds themselves before it was picked up by TV.

But we can trace the nerd-jock dichotomy back before the terms were coined: back in 1921, Lewis Terman, a researcher at Stanford University, began a long-term study of exceptionally high-IQ children, the Genetic Studies of Genius aka the Terman Study of the Gifted:

Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded.

This belief was especially popular in a little nation known as Germany, where it inspired both long hikes in the woods to keep schoolchildren fit and the mass extermination of Jews, who were believed to be muddying the German genepool with their weak, sickly, high-IQ genes (and nefariously trying to marry strong, healthy Germans in order to replenish their own defective stock.) It didn’t help that German Jews were both high-IQ and beset by a number of illnesses (probably related to high rates of consanguinity,) though then again, the Gypsies are beset by even more debilitating illnesses, and no one blames this on all of the fresh air and exercise afforded by their highly mobile lifestyles.

(Just to be thorough, though, the Nazis also exterminated the Gypsies and Hans Asperger’s subjects, despite Asperger’s insistence that they were very clever children who could probably be of great use to the German war effort via code breaking and the like.)

The results of Terman’s study are strongly in Nick’s favor. According to Psychology Today’s  account:

His final group of “Termites” averaged a whopping IQ of 151. Following-up his group 35-years later, his gifted group at mid-life definitely seemed to conform to his expectations. They were taller, healthier, physically better developed, and socially adept (dispelling the myth at the time of high-IQ awkward nerds).

According to Wikipedia:

…the first volume of the study reported data on the children’s family,[17] educational progress,[18] special abilities,[19] interests,[20] play,[21] and personality.[22] He also examined the children’s racial and ethnic heritage.[23] Terman was a proponent of eugenics, although not as radical as many of his contemporary social Darwinists, and believed that intelligence testing could be used as a positive tool to shape society.[3]

Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average.[24] A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group.

Of course, we can go back even further than Terman–in the early 1800s, allergies like hay fever were associated with the nobility, who of course did not do much vigorous work in the fields.

My impression, based on studies I’ve seen previously, is that athleticism and IQ are positively correlated. That is, smarter people tend to be more athletic, and more athletic people tend to be smarter. There’s a very obvious reason for this: our brains are part of our bodies, people with healthier bodies therefore also have healthier brains, and healthier brains tend to work better.

At the very bottom of the IQ distribution, mentally retarded people tend to also be clumsy, flaccid, or lacking good muscle tone. The same genes (or environmental conditions) that make children have terrible health/developmental problems often also affect their brain growth, and conditions that affect their brains also affect their bodies. As we progress from low to average to above-average IQ, we encounter increasingly healthy people.

In most smart people, high IQ doesn’t seem to be a random fluke, a genetic error, or fitness-reducing: in a genetic study of children with exceptionally high IQs, researchers failed to find many genes that specifically endowed the children with genius, but found instead a fortuitous absence of deleterious genes that knock a few points off the rest of us. The same genes that have a negative effect on the nerves and proteins in your brain probably also have a deleterious effect on the nerves and proteins throughout the rest of your body.

And indeed, there are many studies which show a correlation between intelligence and strength (eg, Longitudinal and Cross-Sectional Assessments of Age Changes in Physical Strength as Related to Sex, Social Class, and Mental Ability) or intelligence and overall health/not dying (eg, Intelligence in young adulthood and cause-specific mortality in the Danish Conscription Database (pdf) and The effects of occupation-based social position on mortality in a large American cohort.)

On the other hand, the evolutionary standard for “fitness” isn’t strength or longevity, but reproduction, and on this scale the high-IQ don’t seem to do as well:

Smart teens don’t have sex (or kiss much either): (h/t Gene Expression)

Controlling for age, physical maturity, and mother’s education, a significant curvilinear relationship between intelligence and coital status was demonstrated; adolescents at the upper and lower ends of the intelligence distribution were less likely to have sex. Higher intelligence was also associated with postponement of the initiation of the full range of partnered sexual activities. … Higher intelligence operates as a protective factor against early sexual activity during adolescence, and lower intelligence, to a point, is a risk factor.


Here we see the issue plainly: males at 120 and 130 IQ are less likely to get laid than clinically retarded men with IQs in the 60s and 70s. The right side of the graph shows the “nerds”; the left side, the “jocks.” Of course, the high-IQ females are even less likely to get laid than the high-IQ males, but males tend to judge themselves against other men, not women, when it comes to dating success. Since the low-IQ females are much less likely to get laid than the low-IQ males, this implies that most of these “popular” guys are dating girls who are smarter than themselves–a fact not lost on the nerds, who would also like to date those girls.

In 2001, the MIT/Wellesley magazine Counterpoint (Wellesley is MIT’s “sister school” and the two campuses allow cross-enrollment in each other’s courses) published a sex survey that provides a more detailed picture of nerd virginity:

I’m guessing that computer scientists invented polyamory, and neuroscientists are the chads of STEM. The results are otherwise pretty predictable.

Unfortunately, Counterpoint appears to be defunct due to lack of funding/interest and I can no longer find the original survey, but here is Jason Malloy’s summary from Gene Expression:

By the age of 19, 80% of US males and 75% of women have lost their virginity, and 87% of college students have had sex. But this number appears to be much lower at elite (i.e. more intelligent) colleges. According to the article, only 56% of Princeton undergraduates have had intercourse. At Harvard 59% of the undergraduates are non-virgins, and at MIT, only a slight majority, 51%, have had intercourse. Further, only 65% of MIT graduate students have had sex.

The student surveys at MIT and Wellesley also compared virginity by academic major. The chart for Wellesley displayed below shows that 0% of studio art majors were virgins, but 72% of biology majors were virgins, and 83% of biochem and math majors were virgins! Similarly, at MIT 20% of ‘humanities’ majors were virgins, but 73% of biology majors. (Apparently those most likely to read Darwin are also the least Darwinian!)

College Confidential has one paragraph from the study:

How Rolling Stone-ish are the few lucky souls who are doing the horizontal mambo? Well, not very. Considering all the non-virgins on campus, 41% of Wellesley and 32% of MIT students have only had one partner (figure 5). It seems that many Wellesley and MIT students are comfortingly monogamous. Only 9% of those who have gotten it on at MIT have been with more than 10 people and the number is 7% at Wellesley.

Someone needs to find the original study and PUT IT BACK ON THE INTERNET.

But this lack of early sexual success seems to translate into long-term marital happiness, once nerds find “the one.” Lex Fridman’s Divorce Rates by Profession offers a thorough list. The average divorce rate was 16.35%, with a high of 43% (Dancers) and a low of 0% (“Media and communication equipment workers.”)

I’m not sure exactly what all of these jobs are or exactly which ones should count as STEM (veterinarians? anthropologists?), nor do I know how many people are employed in each field, but I count 49 STEM professions that have lower than average divorce rates (including computer scientists, economists, mathematical science, statisticians, engineers, biologists, chemists, aerospace engineers, astronomers and physicists, physicians, and nuclear engineers,) and only 23 with higher than average divorce rates (including electricians, water treatment plant operators, radio and telecommunication installers, broadcast engineers, and similar professions.) The purer sciences obviously had lower rates than the more practical applied tech fields.

The big outliers were mathematicians (19.15%), psychologists (19.26%), and sociologists (23.53%), though I’m not sure they count (if so, there were only 22 professions with higher than average divorce rates.)

I’m not sure which professions count as “jock” or “chad,” but athletes had lower than average rates of divorce (14.05%), as did firefighters, soldiers, and farmers. Financial examiners, hunters, and dancers (presumably an athletic, largely female occupation), however, had very high rates of divorce.

Medical Daily has an article on Who is Most Likely to Cheat? The Top 9 Jobs Unfaithful People Have (according to survey):

According to the survey recently taken by the “infidelity dating website,” Victoria Milan, individuals working in the finance field, such as brokers, bankers, and analysts, are more likely to cheat than those in any other profession. However, following those in finance comes those in the aviation field, healthcare, business, and sports.

With the exception of healthcare and maybe aviation, these are pretty typical Chad occupations, not STEM.

The Mirror has a similar list of jobs where people are most and least likely to be married. Most likely: Dentist, Chief Executive, Sales Engineer, Physician, Podiatrist, Optometrist, Farm product buyer, Precision grinder, Religious worker, Tool and die maker.

Least likely: Paper-hanger, Drilling machine operator, Knitter textile operator, Forge operator, Mail handler, Science technician, Practical nurse, Social welfare clerk, Winding machine operative, Postal clerk.

I struggled to find data on male fertility by profession/education/IQ, but there’s plenty on female fertility, eg the deceptively titled High-Fliers have more Babies:

…American women without any form of high-school diploma have a fertility rate of 2.24 children. Among women with a high-school diploma the fertility rate falls to 2.09 and for women with some form of college education it drops to 1.78.

However, among women with college degrees, the economists found the fertility rate rises to 1.88 and among women with advanced degrees to 1.96. In 1980 women who had studied for 16 years or more had a fertility rate of just 1.2.

As the economists prosaically explain: “The relationship between fertility and women’s education in the US has recently become U-shaped.”

Here is another article about the difference in fertility rates between high and low-IQ women.

But female fertility and male fertility may not be the same–I recall data elsewhere indicating that high-IQ men have more children than low IQ men, which implies those men are having their children with low-IQ women. (For example, while Bill and Hillary seem about matched on IQ, and have only one child, Melania Trump does not seem as intelligent as Trump, who has five children.)

Amusingly, I did find data on fertility rate by father’s profession for 1920, in the Birth Statistics for the Birth Registration Area of the US:

Of the 1,508,874 children born in 1920 in the birth registration area of the United States, occupations of fathers are stated for … 96.9%… The average number of children ever born to the present wives of these occupied fathers is 3.3 and the average number of children living 2.9.

The average number of children ever born ranges from 4.6 for foremen, overseers, and inspectors engaged in the extraction of minerals to 1.8 for soldiers, sailors, and marines. Both of these extreme averages are easily explained, for soldier, sailors and marines are usually young, while such foremen, overseers, and inspectors are usually in middle life. For many occupations, however, the ages of the fathers are presumably about the same and differences shown indicate real differences in the size of families. For example, the low figure for dentists, (2), architects, (2.1), and artists, sculptors, and teachers of art (2.2) are in striking contrast with the figure for mine operatives (4.3), quarry operatives (4.1) bootblacks, and brick and stone masons (each 3.9). …

As a rule the occupations credited with the highest number of children born are also credited with the highest number of children living, the highest number of children living appearing for foremen, overseers, and inspectors engaged in the extraction of minerals (3.9) and for steam and street railroad foremen and overseer (3.8), while if we exclude groups plainly affected by the age of fathers, the highest number of children living appear for mine and quarry operatives (each 3.6).

Obviously the job market was very different in 1920–no one was majoring in computer science. Perhaps some of those folks who became mine and quarry operatives back then would become engineers today–or perhaps not. Here are the average numbers of surviving children for the most obviously STEM professions (remember average for 1920 was 2.9):

Electricians 2.1, electrotypers 2.2, telegraph operators 2.2, actors 1.9, chemists 1.8, inventors 1.8, photographers and physicians 2.1, technical engineers 1.9, veterinarians 2.2.

I don’t know what paper hangers do, but the Mirror said they were among the least likely to be married, and in 1920, they had an average of 3.1 children–above average.

What about athletes? How smart are they?

“Athletes Show Huge Gaps on SAT Scores” is not a promising title for the “nerds are athletic” crew.

The Journal-Constitution studied 54 public universities, “including the members of the six major Bowl Championship Series conferences and other schools whose teams finished the 2007-08 season ranked among the football or men’s basketball top 25.”…

  • Football players average 220 points lower on the SAT than their classmates. Men’s basketball was 227 points lower.
  • University of Florida won the prize for biggest gap between football players and the student body, with players scoring 346 points lower than their peers.
  • Georgia Tech had the nation’s best average SAT score for football players, 1028 of a possible 1600, and best average high school GPA, 3.39 of a possible 4.0. But because its student body is apparently very smart, Tech’s football players still scored 315 SAT points lower than their classmates.
  • UCLA, which has won more NCAA championships in all sports than any other school, had the biggest gap between the average SAT scores of athletes in all sports and its overall student body, at 247 points.

From the original article, which no longer seems to be up on the Journal-Constitution website:

All 53 schools for which football SAT scores were available had at least an 88-point gap between team members’ average score and the average for the student body. …

Football players performed 115 points worse on the SAT than male athletes in other sports.

The differences between athletes’ and non-athletes’ SAT scores were less than half as big for women (73 points) as for men (170).

Many schools routinely used a special admissions process to admit athletes who did not meet the normal entrance requirements. … At Georgia, for instance, 73.5 percent of athletes were special admits compared with 6.6 percent of the student body as a whole.

On the other hand, as Discover Magazine discusses in “The Brain: Why Athletes are Geniuses,” athletic tasks–like catching a fly ball or slapping a hockey puck–require exceptionally fast and accurate brain signals to trigger the correct muscle movements.

Ryan Stegal studied the GPAs of high school student athletes vs. non-athletes and found that the athletes had higher average GPAs than the non-athletes, but he also notes that the athletes were required to meet certain minimum GPA requirements in order to play.

But within athletics, it looks like the smarter athletes perform better than dumber ones, which is why the NFL uses the Wonderlic Intelligence Test:

NFL draft picks have taken the Wonderlic test for years because team owners need to know if their million dollar player has the cognitive skills to be a star on the field.

What does the NFL know about hiring that most companies don’t? They know that regardless of the position, proof of intelligence plays a profound role in the success of every individual on the team. It’s not enough to have physical ability. The coaches understand that players have to be smart and think quickly to succeed on the field, and the closer they are to the ball the smarter they need to be. That’s why, every potential draft pick takes the Wonderlic Personnel Test at the combine to prove he does–or doesn’t—have the brains to win the game. …

The first use of the WPT in the NFL was by Tom Landry of the Dallas Cowboys in the early 70s, who took a scientific approach to finding players. He believed players who could use their minds where it counted had a strategic advantage over the other teams. He was right, and the test has been used at the combine ever since.

For the NFL, years of testing shows that the higher a player scores on the Wonderlic, the more likely he is to be in the starting lineup—for any position. “There is no other reasonable explanation for the difference in test scores between starting players and those that sit on the bench,” Callans says. “Intelligence plays a role in how well they play the game.”

Let’s look at Exercising Intelligence: How Research Shows a Link Between Physical Activity and Smarts:

A large study conducted at the Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden, reveals that young adults who regularly exercise have higher IQ scores and are more likely to go on to university.

The study was published in the Proceedings of the National Academy of Sciences (PNAS), and involved more than 1.2 million Swedish men. The men were performing military service and were born between the years 1950 and 1976. Both their physical and IQ test scores were reviewed by the research team. …

The researchers also looked at data for twins and determined that primarily environmental factors are responsible for the association between IQ and fitness, and not genetic makeup. “We have also shown that those youngsters who improve their physical fitness between the ages of 15 and 18 increase their cognitive performance.”…

I have seen similar studies before, some involving mice and some, IIRC, the elderly. It appears that exercise is probably good for you.

I have a few more studies I’d like to mention quickly before moving on to discussion.

Here’s Grip Strength and Physical Demand of Previous Occupation in a Well-Functioning Cohort of Chinese Older Adults (h/t prius_1995), which found that participants who had previously worked in construction had greater grip strength than former office workers.

Age and Gender-Specific Normative Data of Grip and Pinch Strength in a Healthy Adult Swiss Population (h/t prius_1995).


If the nerds are in the sedentary cohort, then they may be just as athletic as, if not more athletic than, all of the other cohorts except the heavy-work group.

However, in Revised normative values for grip strength with the Jamar dynamometer, the authors found no effect of profession on grip strength.

And Isometric muscle strength and anthropometric characteristics of a Chinese sample (h/t prius_1995).

And Pumpkin Person has an interesting post about brain size vs. body size.


Discussion: Are nerds real?

Overall, it looks like smarter people are more athletic, more athletic people are smarter, smarter athletes are better athletes, and exercise may make you smarter. For most people, the nerd/jock dichotomy is wrong.

However, there is very little overlap at the very highest end of the athletic and intelligence curves–most college (and thus professional) athletes are less intelligent than the average college student, and most college students are less athletic than the average college (and professional) athlete.

Additionally, while people with STEM degrees make excellent spouses (except for mathematicians, apparently,) their reproductive success is below average: they have sex later than their peers and, as far as the data I’ve been able to find shows, have fewer children.

Stephen Hawking

Even if there is a large overlap between smart people and athletes, they are still separate categories selecting for different things: a cripple can still be a genius, but can’t play football; a dumb person can play sports, but not do well at math. Stephen Hawking can barely move, but he’s still one of the smartest people in the world. So the set of all smart people will always include more “stereotypical nerds” than the set of all athletes, and the set of all athletes will always include more “stereotypical jocks” than the set of all smart people.

In my experience, nerds aren’t socially awkward (aside from their shyness around women.) The myth that they are stems from the fact that they have different interests and communicate in a different way than non-nerds. Let nerds talk to other nerds, and they are perfectly normal, communicative, socially functional people. Put them in a room full of non-nerds, and suddenly the nerds are “awkward.”

Unfortunately, the vast majority of people are not nerds, so many nerds have to spend the majority of their time in the company of lots of people who are very different than themselves. By contrast, very few people of normal IQ and interests ever have to spend time surrounded by the very small population of nerds. If you did put them in a room full of nerds, however, you’d find that suddenly they don’t fit in. The perception that nerds are socially awkward is therefore just normie bias.

Why did the nerd/jock dichotomy become so popular in the 70s? Probably in part because science and technology were really taking off as fields normal people could aspire to major in: man had just landed on the moon, and the Intel 4004 was released in 1971. Very few people went to college or were employed in the sciences back in 1920; by 1970, colleges were everywhere and science was booming.

And at the same time, colleges and highschools were ramping up their athletics programs. I’d wager that the average school in the 1800s had neither PE nor athletics of any sort. To find those, you’d probably have to attend private academies like Andover or Exeter. By the 70s, though, schools were taking their athletics programs–even athletic recruitment–seriously.

How strong you felt the dichotomy probably depends on the nature of your school. I have attended schools where all of the students were fairly smart and there was no anti-nerd sentiment, and I have attended schools where my classmates were fiercely anti-nerd and made sure I knew it.

But the dichotomy predates the terminology. Take Superman, who first appeared in 1938. His disguise is a pair of glasses, because no one can believe that the bookish, mild-mannered Clark Kent is actually the super-strong Superman. Batman is based on the character of El Zorro, created in 1919. Zorro is an effete, weak, foppish nobleman by day and a dashing, sword-fighting hero of the poor by night. Of course these characters are both smart and athletic, but their disguises only work because others do not expect them to be. As fantasies, the characters are powerful because they provide a vehicle for our own desires: for our everyday normal failings to be just a cover for how secretly amazing we are.

But for the most part, most smart people are perfectly fit, healthy, and coordinated–even the ones who like math.


Parsis, Travellers, and Human Niches

Irish Travellers, 1954


Why are there many kinds of plants and animals? Why doesn’t the best out-compete, eat, and destroy the others, rising to be the sole dominant species on Earth?

In ecology, a niche is an organism’s specific place within the environment. Some animals eat plants; some eat dung. Some live in the sea; others in trees. Different plants flower and grow in different seasons; some are pollinated by bees and some by flies. Every species has its specific niche.

The Competitive Exclusion Principle (aka Gause’s Law) states that ‘no two species can occupy the same niche’ (or positively, ‘two species coexisting must have different niches.’) For example, if squirrels and chipmunks both want to nest in the treetops and eat nuts, (and there are limited treetops and nuts,) then over time, whichever species is better at finding nuts and controlling the treetops will dominate the niche, and the other, less successful species will have to find a new niche.

If squirrels are dominating the treetops and nuts, this leaves plenty of room for rabbits to eat grass and owls to eat squirrels.
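
To make the principle concrete, here is a minimal sketch of the textbook Lotka-Volterra competition model, the standard way Gause’s Law is usually illustrated. The species labels and every parameter value below are my own hypothetical choices, picked so that both species compete for exactly the same resources and one is slightly better at it; this is an illustration, not anything from the sources quoted here.

```python
# A toy Lotka-Volterra competition model illustrating competitive exclusion.
# All numbers (growth rates, carrying capacities, competition coefficients)
# are hypothetical values chosen so the two species occupy the same niche
# and species 1 wins.

def simulate(steps=20000, dt=0.01):
    n1, n2 = 10.0, 10.0      # starting populations: "squirrels" and "chipmunks"
    r1, r2 = 0.5, 0.5        # intrinsic growth rates
    k1, k2 = 100.0, 100.0    # same carrying capacity: same nuts, same treetops
    a12, a21 = 0.6, 1.4      # competition: species 1 hurts species 2 more

    for _ in range(steps):
        dn1 = r1 * n1 * (1 - (n1 + a12 * n2) / k1)
        dn2 = r2 * n2 * (1 - (n2 + a21 * n1) / k2)
        n1 += dn1 * dt
        n2 += dn2 * dt
    return n1, n2

squirrels, chipmunks = simulate()
# Species 1 settles near its carrying capacity; species 2 is driven toward
# zero -- the weaker competitor has to find a different niche or vanish.
print(f"squirrels: {squirrels:.1f}  chipmunks: {chipmunks:.3f}")
```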

II. So I was reading recently about the Parsis and the Travellers. The Parsis, as we discussed on Monday, are Zoroastrians, originally from Persia (modern-day Iran,) who settled in India about a thousand years ago. They’re often referred to as the “Jews of India” because they played a similar role in Indian society to that historically played by Jews in Europe.*

*Yes I know there are actual Jews in India.

The Travellers are an Irish group that’s functionally similar to Gypsies, but in fact genetically Irish:

In 2011 an analysis of DNA from 40 Travellers was undertaken at the Royal College of Surgeons in Dublin and the University of Edinburgh. The study provided evidence that Irish Travellers are a distinct Irish ethnic minority, who separated from the settled Irish community at least 1000 years ago; the claim was made that they are as distinct from the settled community as Icelanders are from Norwegians.[36]

It appears that Ireland did not have enough Gypsies of Indian extraction and so had to invent its own.

And though I originally meant that only in jest, why not? Gypsies occupy a particular niche, and if there are Gypsies around, I doubt anyone else is going to out-compete them for that niche. But if there aren’t any, then surely someone else could.

According to Wikipedia, the Travellers traditionally worked as tinkers, mending tinware (like pots) and acquiring dead/old horses for slaughter.

The Gypsies appear to have been originally itinerant musicians/entertainers, but have also worked as tinkers, smiths, peddlers, miners, and horse traders (today, car salesmen.)

These are not glorious jobs, but they are jobs, and peripatetic people have done them.

Jews (and Parsis, presumably) also filled a social niche, using their network of family/religious ties to other Jews throughout the diaspora as the basis of a high-trust business/trading network at a time when trade was difficult and routes were dangerous.

On the subject of “Magdeburg rights” or law in Eastern Europe, Wikipedia notes:

In medieval Poland, Jews were invited along with German merchants to settle in cities as part of the royal city development policy.

Jews and Germans were sometimes competitors in those cities. Jews lived under privileges that they carefully negotiated with the king or emperor. They were not subject to city jurisdiction. These privileges guaranteed that they could maintain communal autonomy, live according to their laws, and be subjected directly to the royal jurisdiction in matters concerning Jews and Christians. One of the provisions granted to Jews was that a Jew could not be made Gewährsmann, that is, he could not be compelled to tell from whom he acquired any object which had been sold or pledged to him and which was found in his possession. Other provisions frequently mentioned were a permission to sell meat to Christians, or employ Christian servants.

External merchants coming into the city were not allowed to trade on their own, but instead forced to sell the goods they had brought into the city to local traders, if any wished to buy them.

Note that this situation is immensely better if you already know the guy you’re selling to inside the city and he’s not inclined to cheat you because you both come from the same small, tight-knit group.


Under Bolesław III (1102–1139), the Jews, encouraged by the tolerant regime of this ruler, settled throughout Poland, including over the border in Lithuanian territory as far as Kiev.[32] Bolesław III recognized the utility of Jews in the development of the commercial interests of his country. … Mieszko III employed Jews in his mint as engravers and technical supervisors, and the coins minted during that period even bear Hebraic markings.[30] … Jews enjoyed undisturbed peace and prosperity in the many principalities into which the country was then divided; they formed the middle class in a country where the general population consisted of landlords (developing into szlachta, the unique Polish nobility) and peasants, and they were instrumental in promoting the commercial interests of the land.

If you need merchants and goldsmiths, someone will become merchants and goldsmiths. If it’s useful for those merchants and goldsmiths to all be part of one small, close-knit group, then a small, close-knit group is likely to step into that niche and out-compete anyone else trying to occupy it.

The similarity of the Parsis to the Jews probably has less to do with them both being monotheists (after all, Christians, Muslims, and Sikhs are also monotheists,) and more to do with them both being small but widely-flung diasporic communities united by a common religion that allows them to use their group as a source of trustworthy business partners.

Over hundreds or thousands of years, humans might not just move into social niches, but actually become adapted to them–Jews and Parsis are both reported to be very smart, for example.

III. I can think of several other cases of ethnic groups moving into a particular niche. In the US, the gambling and bootleg alcohol trade were long dominated by ethnic Sicilians, while the crack and heroin trades have been dominated by black and Hispanic gangs.

Note that, while these activities are (often) illegal, they are still things that people want to do, and the mafia/gangs are basically providing goods and services to their customers. As they see it, they’re just businessmen. They’re out to make money, not commit random violence.

That said, these guys do commit lots of violence, including murder, blackmail and extortion. Even violent crime can be its own niche, if it pays well enough.

(Ironically, police crackdown on ethnic Sicilian control in NYC coincided with a massive increase in crime–did the mafia, by controlling a particular territory, keep out competing bands of criminals?)

On a more pleasant note, society is now rich enough that many people can make a living as professional sports stars, marry other professional sports stars, and have children who go on to also be professional sports stars. It’s not quite at the level of “a caste of professional athletes genetically optimized for particular sports,” but if this kept up for a few hundred years, it could be.

Similarly, over in Nepal, “Sherpa” isn’t just a job, it’s an ethnic group. Sherpas, due to their high elevation adaptation, have an advantage over the rest of us when it comes to scaling Mt. Everest, and I hear the global mountain-climbing industry pays them well for their services. A Sherpa who can successfully scale Mt. Everest many times, make lots of money, and raise lots of children in an otherwise impoverished nation is thus a successful Sherpa–and contributing to the group’s further genetic and cultural specialization in the “climbing Mt. Everest” niche.

India, of course, is the ultimate case of ethnic groups specializing into specific jobs–it’d be interesting to see what adaptations different groups have acquired over the years.

I also wonder if the caste system is an effective way to minimize competition between groups in a multi-ethnic society, or if it leads to more resentment and instability in the long run.

Zoroastrian (Parsi) DNA

Farvahar. Persepolis, Iran.

Zoroastrianism is one of the world’s oldest surviving religions and possibly its first monotheistic one. It emerged in what is now Iran about 3,000 years ago, but following the Arab (Islamic) conquest of Persia, many Zoroastrians migrated to India, where they became known as the Parsi (from the word for “Persian.”) To be clear, where this post refers to “Parsis” it means the specific Zoroastrian community in India, and where it refers to “Iranian Zoroastrians” it means the Zoroastrians currently living in Iran.

Although Zoroastrianism was once the official state religion of Persia, today only about 190,000 believers remain (according to Wikipedia,) and their numbers are declining.

If you’re thinking that a diasporic community of monotheists sounds familiar, you’re in good company. According to Wikipedia:

Portuguese physician Garcia de Orta observed in 1563 that “there are merchants … in the kingdom of Cambaia … known as Esparcis. We Portuguese call them Jews, but they are not so. They are Gentios.”

Another parallel: Ashkenazi Jews and Parsis are both reported to be very smart. Famous Parsis include Queen frontman Freddie Mercury, nuclear physicist Homi J. Bhabha, and our Harvard-employed friend, Homi K. Bhabha.

Lopez et al have recently carried out a very interesting study of Zoroastrian DNA, The Genetic Legacy of Zoroastrianism in Iran and India: Insights into Population Structure, Gene Flow, and Selection:

Historical records indicate that migrants from Persia brought Zoroastrianism to India, but there is debate over the timing of these migrations. Here we present genome-wide autosomal, Y chromosome, and mitochondrial DNA data from Iranian and Indian Zoroastrians and neighboring modern-day Indian and Iranian populations and conduct a comprehensive genome-wide genetic analysis in these groups. … we find that Zoroastrians in Iran and India have increased genetic homogeneity relative to other sampled groups in their respective countries, consistent with their current practices of endogamy. Despite this, we infer that Indian Zoroastrians (Parsis) intermixed with local groups sometime after their arrival in India, dating this mixture to 690–1390 CE and providing strong evidence that Iranian Zoroastrian ancestry was maintained primarily through the male line.

Note that all diasporic–that is, migrant–groups appear to be heavily male. Women tend to stay put while men move and take new wives in their new homelands.

By making use of the rich information in DNA from ancient human remains, we also highlight admixture in the ancestors of Iranian Zoroastrians dated to 570 BCE–746 CE, older than admixture seen in any other sampled Iranian group, consistent with a long-standing isolation of Zoroastrians from outside groups. …

Admixture with whom? (Let’s just read the paper and see if it answers the question):

Furthermore, a recent study using genome-wide autosomal DNA found that haplotype patterns in Iranian Zoroastrians matched more than other modern Iranian groups to a high-coverage early Neolithic farmer genome from Iran …

A study of four restriction fragment length polymorphisms (RFLPs) suggested a closer genetic affinity of Parsis to Southern Europeans than to non-Parsis from Bombay. Furthermore, NRY haplotype analysis and patterns of variation at the HLA locus in the Parsis of Pakistan support a predominately Iranian origin. …

In (1) and (2), we detected admixture in the Parsis dated to 27 (range: 17–38) and 32 (19–44) generations ago, respectively, in each case between one predominantly Indian-like source and one predominantly Iranian-like source. This large contribution from an Iranian-like source (∼64%–76%) is not seen in any of our other 7 Indian clusters, though we detect admixture in each of these 7 groups from wide-ranging sources related to modern day individuals from Bangladesh, Cambodia, Europe, Pakistan, or of Jewish heritage (Figures 2 and S7, Tables S5–S7). For Iranian Zoroastrians, we detect admixture only under analysis (2), occurring 66 (42–89) generations ago between a source best genetically explained as a mixture of modern-day Croatian and Cypriot samples, and a second source matching to the Neolithic Iranian farmer WC1. … The two Iranian Zoroastrians that had been excluded as outliers exhibited admixture patterns more similar to the Lebanese, Turkish Jews, or Iranian Bandari individuals than to Zoroastrians (Table S8).

Parsi Wedding, 1905

If I assume a generation is about 25 years long, 27 generations was about 675 years ago; 32 was about 800 years ago. (Though given the wide range on these dates, perhaps we should estimate between 425 and 1,100 years ago.) This sounds consistent with Parsis taking local wives after they arrived in India between the 8th and 10th century CE (after the Arab conquest of Persia.) Also consistently, this admixture isn’t found in Iranian Zoroastrians.
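
For anyone who wants to redo the arithmetic, here is a minimal sketch of the conversion; the 25-year generation length is my assumption for illustration, not a figure from the paper.

```python
# Convert the paper's admixture estimates, given in generations ago
# (point estimate plus range), into approximate years before present.
GENERATION_YEARS = 25  # assumed average generation length

parsi_estimates = [(27, 17, 38), (32, 19, 44)]  # generations ago, from Lopez et al

for point, low, high in parsi_estimates:
    years = point * GENERATION_YEARS
    years_low = low * GENERATION_YEARS
    years_high = high * GENERATION_YEARS
    print(f"{point} generations ~ {years} years ago (range {years_low}-{years_high})")

# Prints roughly 675 (425-950) and 800 (475-1,100) years ago, matching the
# figures in the paragraph above.
```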

The Iranians’ admixture occurred between about 1,050 and 2,225 years ago, which is an awfully broad time range. Could Croatian or Cypriot migrants have arrived due to the Greek/Roman/Byzantine Empires? Were they incorporated into the Persian Empire as a result of its territorial conquests or the Arab conquest? Or were they just long-distance merchants who happened to wander into the area?

The Fire Temple of Baku

The authors found that Parsi priests had “the lowest gene diversity values of all population samples studied for both Y and mtDNA,” though they didn’t have enough Iranian Zoroastrian priest samples to compare them to Parsi priests. (I bet this is similar to what you’d find if you sampled Orthodox rabbis.)

Finally, in the genetic selection and diseases section, the authors write:

In the case of the Iranian Zoroastrians, … some of the most significant SNPs… are located upstream of gene SLC39A10 … with an important role in humoral immunity61 or in CALB2 … which plays a major role in the cerebellar physiology.62

With regard to the positive selection tests on Parsis versus India Hindu/Gujarati groups, the most significant SNPs were embedded in WWOX … associated with neurological disorders like early epilepsy … and in a region in chromosome 20 … (see Table S11 for a complete list). …

Genetic isolation and endogamous practices can be associated with higher frequencies of disease prevalence. For example, there are reports claiming a high recurrence of diseases such as diabetes among the Iranian Zoroastrians, and Parkinson, colon cancer, or the deficiency of G6PD, an enzyme that triggers the sudden reduction of red blood cells, among the Parsis.

However, the authors warn that these results are weak (these are rare conditions in an already small population) and cannot be depended upon.

Navigation and the Wealth of Nations

Global Determinants of Navigational Ability, by Coutrot et al:

Using a mobile-based virtual reality navigation task, we measured spatial navigation ability in more than 2.5 million people globally. Using a clustering approach, we find that navigation ability is not smoothly distributed globally but clustered into five distinct yet geographically related groups of countries. Furthermore, the economic wealth of a nation (Gross Domestic Product per capita) was predictive of the average navigation ability of its inhabitants and gender inequality (Gender Gap Index) was predictive of the size of performance difference between males and females. Thus, cognitive abilities, at least for spatial navigation, are clustered according to economic wealth and gender inequalities globally.

This is an incredible study. They got 2.5 million people from all over the world to participate.

If you’ve been following any of the myriad debates about intelligence, IQ, and education, you’re probably familiar with the concept of “multiple intelligences” and the fact that there’s rather little evidence that people actually have “different intelligences” that operate separately from each other. In general, it looks like people who have brains that are good at working out how to do one kind of task tend to be good at working out other sorts of tasks.

I’ve long held navigational ability as a possible exception to this: perhaps people in, say, Polynesian societies depended historically far more on navigational abilities than the rest of us, even though math and literacy were nearly absent.

Unfortunately, it doesn’t look like the authors got enough samples from Polynesia to include it in the study, but they did get data from Indonesia and the Philippines, which I’ll return to in a moment.

Frankly, I don’t see what the authors mean by “five distinct yet geographically related groups of countries.” South Korea is ranked between the UK and Belgium; Russia is next to Malaysia; Indonesia is next to Portugal and Hungary.

GDP per capita appears to be a stronger predictor than geography:

Some people will say these results merely reflect experience playing video games–people in wealthier countries have probably spent more time and money on computers and games. But assuming that the people who are participating in the study in the first place are people who have access to smartphones, computers, video games, etc., the results are not good for the multiple-intelligences hypothesis.

In the GDP per Capita vs. Conditional Modes (ie how well a nation scored overall, with low scores better than high scores) graph, countries above the trend line are under-performing relative to their GDPs, and countries below the line are over-performing relative to their GDPs.
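
For readers who want to see what “above or below the trend line” means mechanically, here is a minimal sketch of the usual calculation: fit a least-squares line of navigation score on log GDP per capita, then read off each country’s residual. The country names and all the numbers below are invented placeholders, not values from Coutrot et al, and for simplicity higher scores here mean better navigation (the opposite of the paper’s conditional-mode convention).

```python
import math

# Hypothetical (country -> (GDP per capita in USD, navigation score)) data;
# placeholder values for illustration only.
data = {
    "CountryA": (55000, 0.85),
    "CountryB": (40000, 0.78),
    "CountryC": (12000, 0.55),
    "CountryD": (9000, 0.70),   # scores well despite low GDP
}

# Ordinary least-squares fit of score on log(GDP per capita).
xs = [math.log(gdp) for gdp, _ in data.values()]
ys = [score for _, score in data.values()]
n = len(xs)
mean_x, mean_y = sum(xs) / n, sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

# A positive residual means the country navigates better than its GDP
# predicts (over-performing); a negative residual means the opposite.
for country, (gdp, score) in data.items():
    residual = score - (intercept + slope * math.log(gdp))
    verdict = "over-performs" if residual > 0 else "under-performs"
    print(f"{country}: residual {residual:+.3f} ({verdict} relative to GDP)")
```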

South Africa, for example, significantly over-performs relative to its GDP, probably due to sampling bias: white South Africans with smartphones and computers were probably more likely to participate in the study than the nation’s 90% black population, but the GDP reflects the entire population. Finland and New Zealand are also under-performing economically, perhaps because Finland is really cold and NZ is isolated.

On the other side of the line, the UAE, Saudi Arabia, and Greece over-perform relative to GDP. Two of these are oil states that would be much poorer if not for geographic chance, and as far as I can tell, the whole Greek economy is being propped up by German loans. (There is also evidence that Greek IQ is falling, though this may be a near universal problem in developed nations.)

Three other nations stand out in the “scoring better than GDP predicts” category: Ukraine, (which suffered under Communism–Communism seems to do bad things to countries,) Indonesia and the Philippines. While we could be looking at selection bias similar to South Africa, the latter two are island nations in which navigational ability surely had some historical effect on people’s ability to survive.

Indonesia and the Philippines still didn’t do as well as first-world nations like Norway and Canada, but they outperformed other nations with similar GDPs like Egypt, India, and Macedonia. This is the best evidence I know of for independent selection for navigational ability in some populations.

The study’s other interesting findings were that women performed consistently worse than men, both across countries and age groups (except for the post-90 cohort, but that might just be an error in the data.) Navigational ability declines steeply for everyone post-23 years old until about 75 years; the authors suggest the subsequent increase in abilities post-70s might be sampling error due to old people who are good at video games being disproportionately likely to seek out video game related challenges.

The authors note that people who drive more (eg, the US and Canada) might do better on navigational tasks than people who use public transportation more (eg, Europeans) but also that Finno-Scandians are among the world’s best navigators despite heavy use of public transport in those countries. The authors write:

We speculate that this specificity may be linked to Nordic countries sharing a culture of participating in a sport related to navigation: orienteering. Invented as an official sport in the late 19th century in Sweden, the first orienteering competition open to the public was held in Norway in 1897. Since then, it has been more popular in Nordic countries than anywhere else in the world, and is taught in many schools [26]. We found that ‘orienteering world championship’ country results significantly correlated with countries’ CM (Pearson’s correlation ρ = .55, p = .01), even after correcting for GDP per capita (see Extended Data Fig. 15). Future targeted research will be required to evaluate the impact of cultural activities on navigation skill.

I suggest a different causal relationship: people make hobbies out of things they’re already good at and enjoy doing, rather than things they’re bad at.



Please note that the study doesn’t look at a big chunk of countries, like most of Africa. Being at the bottom in navigational abilities in this study by no means indicates that a country is at the bottom globally–given the trends already present in the data, it is likely that the poorer countries that weren’t included in the study would do even worse.