Anthropology Friday: Crackers pt 2

From JayMan’s post on the American Nations

I am frequently frustrated by our culture’s lack of good ethnonyms. Take “Hispanic.” It just means “someone who speaks Spanish or whose ancestors spoke Spanish.” It includes everyone from Lebanese-Mexican billionaire Carlos Slim to Japanese-Peruvian Alberto Fujimori, from Sephardi Jews to native Bolivians, from white Argentinians to black Cubans, but doesn’t include Brazilians because speaking Portuguese instead of Spanish is a really critical ethnic difference.*

*In conversation, most people use “Hispanic” to mean “Mexican or Central American who’s at least partially Native American,” but the legal definition is what colleges and government agencies are using when determining who gets affirmative action. People think “Oh, those programs are to help poor, brown people,” when in reality the beneficiaries are mostly well-off and light-skinned–people who were well-off back in their home countries.

This is the danger of using euphemisms instead of saying what you actually mean.

Our ethnonyms for other groups are equally terrible. All non-whites are often lumped together under a single “POC” label, as though Nigerian Igbo and Han Chinese were totally equivalent and fungible peoples. Whites are similarly lumped, as if a poor white from the backwoods of Georgia and a wealthy Boston Puritan had anything in common. There are technical names for these groups, used in historical or academic contexts, but if you tell the average person you hail from a mix of “Cavalier-Yeoman and Cracker ancestors,” they’re just going to be confused.

map of the American Nations

With the exception of Cajuns and recent immigrants who retain an old-world ethnic identity (e.g., Irish or Jewish), we simply lack common vernacular ethnonyms for the different white groups that settled the US–even though they are actually different.

The map at left comes from Colin Woodard’s American Nations: A History of the 11 Rival Regional Cultures of North America. 

As Woodard himself has noted, DNA studies have confirmed his map to an amazing degree.

American ethnic groups are not just Old World ethnic groups that happen to live in America. They’re real ethnicities that have developed over here during the past 500 years, but we have failed to adopt common names for them.

Woodard’s map implies a level of ethnic separation that is probably not entirely accurate, as these groups settled the American frontier in waves, creating layers of ethnicity that are thicker or thinner in different places. Today, we call these social classes, which is not entirely inaccurate.

Take the South. The area is dominated by two main ethnic blocs, Appalachians (in the mountains) and Cavalier-Plantation owners in the flatter areas. But the Cavalier area was never majority wealthy, elite plantation owners; it has always had a large contingent of middling-class whites, poor whites, and of course poor blacks. In areas of the “Deep South” where soils were poor or otherwise unsuited to cultivation, elite planters never penetrated, leaving the heartier backwoods whites–the Crackers–to their own devices.

If their ancestors spoke French, we recognize them as different, but if not, they’re just “poor”–or worse, “trash.”

Southern identity is a curious thing. Though I was born in the South (and my ancestors have lived there for over 400 years,) I have no meaningful “Southern identity” to speak of–nor do, I think, most southerners. It’s just a place; the core historical event of going to war to protect the interests of rich elites in perpetuating slavery doesn’t seem to resonate with most people I’ve met.

My interest in the region and its peoples stems not from Southern Pride, but the conventional curiosity adoptees tend to feel about their birth families: Where did I come from? What were they like? Were they good people? and Can I find a place where I feel comfortable and fit in? (No.)

My immediate biological family hails from parts of the South that never had any plantations (I had ancestors in Georgia in the 1800s, and ancestors in Virginia in the 1700s, but they’ve been dead for a while; my father lives within walking distance of his great-grandparents’ homestead.)

Dust Storm, Tulsa, Oklahoma, 1935 “This was a bad idea.”–Grandma

As previously discussed, I don’t exactly feel at home in cities; perhaps this is because calling my ancestors “farmers” is a rather generous description for folks who thought it was a good idea to move to Oklahoma during the Dust Bowl.

(By the way, the only reason the prairies are consistently farmed today is due to irrigation, drawing water up from the Ogallala and other aquifers, and we are drawing water from those aquifers much faster than it is being replenished. If we keep using water at this rate–or faster, due to population growth–WE WILL RUN OUT. The prairies will go dry and dust storms will rage again.)
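The arithmetic behind that warning is simple: if withdrawal exceeds recharge, the stored water declines every year and must eventually hit zero. A minimal sketch, with entirely hypothetical placeholder numbers (real Ogallala figures vary enormously by region):

```python
# Toy aquifer-depletion model. All numbers are hypothetical placeholders,
# chosen only to illustrate the arithmetic: when withdrawal > recharge,
# the stock falls linearly and eventually runs out.
def years_until_dry(stock, recharge, withdrawal):
    """Years until the aquifer is exhausted at constant annual rates."""
    net_loss = withdrawal - recharge
    if net_loss <= 0:
        return float("inf")  # sustainable: never runs dry
    return stock / net_loss

# Hypothetical units: cubic kilometers (rates are per year).
print(years_until_dry(stock=3000, recharge=5, withdrawal=25))  # 150.0
```

Population growth would make the withdrawal rate itself grow over time, shortening that horizon further.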

To be fair, some of my kin were successful farmers when it actually rained, but some were never so sedentary. Pastoralists, ranchers, hoe-farmers–they were the sorts of people who settled frontiers and moved on when places got too crowded, who drank hard and didn’t always raise their children. They closely match Richard Sapp’s description of the Florida Crackers.


From a genetic standpoint, the Crackers are either descended from borderlanders and Scotch-Irish (the pink region on the map at the top of the post,) or from folks who got along well with borderlanders and decided to move alongside them. I find it amazing that a relatively small place like Britain could produce such temperamentally different peoples as Puritans and Crackers–the former hard working, domesticated, stiff, and proper; the latter loud, liberty-loving, and more violent.

Peter Frost (evo and proud) has a theory that “core” Europe managed to decrease its homicide rates by executing criminals, thus removing them from the gene pool; the borderlands of Scotland and Ireland were perhaps beyond the reach of the hangman’s noose, or hopping the border allowed criminals to escape the police.

from HBD Chick’s big summary post on the Hajnal Line

HBD Chick’s work focuses primarily on the effects of manorialism and outbreeding within the Hajnal line. Of the Crackers, she writes:

“The third American Revolution reached its climax in the years from 1779 to 1781. This was a rising of British borderers in the southern backcountry against American loyalists and British regulars who invaded the region. The result was a savage struggle which resembled many earlier conflicts in North Britain, with much family feuding and terrible atrocities committed on both sides. Prisoners were slaughtered, homes were burned, women were raped and even small children were put to the sword.” …

i’ve got a couple of posts related to those rambunctious folks from the backcountry whose ancestors came from the borderlands between england and scotland. libertarian crackers takes a quick look at why this group tends to love being independent and is distrustful of big gubmint — to make a long story short, the border folks married closely for much longer than the southern english — and they didn’t experience much manorialism, either (the lowland scots did, but not so much the border groups). did i mention that they’re a bit hot-headed? (not that there’s anything wrong with that! (~_^) ) see also: hatfields and mccoys. not surprising that this group’s war of independence involved “much family feuding.”

Less manorialism, less government control, less executing criminals, more cousin-marriage, more clannishness.

And the differences here aren’t merely cultural. As Nisbett and Cohen found (PDF; h/t HBD Chick):

“During the experiment, a confederate bumped some subjects and muttered ‘asshole’ at them. Cortisol (a stress hormone) and testosterone (rises in preparation for violence) were measured before and after the insult. Insulted Southerners showed big jumps in both cortisol and testosterone compared to uninsulted Southerners and insulted Northerners. The difference in psychological and physiological responses to insults was manifest in behavior. Nisbett and Cohen recruited a 6’3” 250 lb (190 cm, 115 kg) American style football player whose task was to walk down the middle of a narrow hall as subjects came the other direction. The experimenters measured how close subjects came to the football player before stepping aside. Northerners stepped aside at around 6 feet regardless of whether they had been insulted. Un-insulted Southerners stepped aside at an average distance of 9 feet, whereas insulted Southerners approached to an average of about 3 feet. Polite but prepared to be violent, un-insulted Southerners take more care, presumably because they attribute a sense of honor to the football player and are normally respectful of others’ honor. When their honor is challenged, they are prepared and willing to challenge someone at considerable risk to their own safety.”

It’s genetic.

(The bit about honor is… not right. I witnessed a lot of football games as a child, and no one ever referred to the players as “honorable.” Southerners just don’t like to get close to each other, which is very sensible if people in your area get aggressive and angry easily. The South also has a lower population density than the North, so people are used to more space.)

As my grandmother says, “You don’t get to pick your ancestors.” I don’t know what I would think of my relatives had I actually grown up with them. They have their sins, like everyone else. But from a distance, as an adult, they’re fine people and they always have entertaining stories.

“Oh, yes, yet another time I almost died…”

As for racial attitudes, if you’re curious, they vary between “probably marched for Civil Rights back in the 50s” and “has never spoken a word, good or bad, generalizing about any ethnic group.” (I have met vocally anti-black people in the South; just not in my family.) I think my relatives are more interested in various strains of Charismatic Christianity than race.

It seems rather unfortunate that Southern identity is so heavily linked to the historical interests of the Plantation Elites. After all, it did the poor whites no good to die in a war fought to protect the interests of the rich. I think the desire to take pride in your ancestors and group is normal, healthy, and instinctive, but Southerners are in an unfortunate place where that identity is heavily infused with a racial ideology most Southerners don’t even agree with.

> Be white
> Be from the south
> Not into Confederacy
> Want an identity of some sort

> Now what?

In my case, I identify with nerds. This past is not an active source of ethnic identity, nor is the Cracker lifestyle even practical in the modern day. But my ancestors have still contributed (mostly genetically) to who I am.

Well, this was going to just be an introduction to today’s anthropology selection, but it turned out rather longer than expected, so let’s just save the real anthropology for next week.

Neanderthal Skull for 3D Printing

Meet Nandy the Neanderthal. You can download him at Thingiverse.

This is my first creation, Nandy the Neanderthal, based on the Chapelle-aux-Saints 1 skull and this side view. Note that he is based on two different skulls, but still very much a Neanderthal.

Since this is my very first creation and I don’t have a 3D printer yet, (I expect to receive one soon and am planning ahead,) I am still learning all of the ins and outs of this technology and so would appreciate any technical feedback.

The Neanderthal lineage split from our own around 600,000-800,000 years ago, and Neanderthals spread into the Middle East, Europe, and central Asia. They made stone tools, controlled fire, and hunted. They survived in a cold and difficult climate, but likely could make no more than the simplest of clothes. As a result, they may have been, unlike modern humans, hairy.

Cochran and Harpending of West Hunter write in The 10,000 Year Explosion: 

 Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mothers’ fur as infants. Modern humans don’t have these ridges, but Neanderthals do.

Hoffecker, in The Spread of Modern Humans in Europe writes:

Neanderthal sites show no evidence of tools for making tailored clothing. There are only hide scrapers, which might have been used to make blankets or ponchos. This is in contrast to Upper Paleolithic (modern human) sites, which have an abundance of eyed bone needles and bone awls.

Their skulls were, on average, larger than ours, with wide noses, round eyes, and an elongated braincase. Their facial features were robust–that is, strong, thick, and heavy.

The Chapelle-aux-Saints 1 Neanderthal lived to about 40 years old. He had lost most of his teeth years before his death, (I gave Nandy some teeth, though,) suffered arthritis, and must have been cared for in his old age by the rest of his tribe. At his death he was most likely buried in a pit dug by his family, which preserved his skeleton in nearly complete condition for 60,000 years.

Anatomically modern humans, Homo sapiens, encountered and interbred with Neanderthals around 40,000 years ago. (Neanderthals are also humans–Homo neanderthalensis.) Today, about 1-5% of the DNA in non-Sub-Saharan Africans hails originally from a Neanderthal ancestor. (Melanesians also have DNA from a cousin of the Neanderthals, the Denisovans, and Sub-Saharan Africans may have their own archaic ancestors.)

Unfortunately for Nandy and his relations, the Neanderthals also began to disappear around 40,000 years ago. Perhaps it was the weather, or Homo sapiens outcompeted them, or their enormous skulls just caused too much trouble in childbirth. Whatever happened, the Neanderthals remain a mystery, evidence of the past when we weren’t the only human species in town.

Denny: the Neanderthal-Denisovan Hybrid

Carte_Neandertaliens
Neanderthal Sites (source: Wikipedia)

Our species, Homo sapiens–that is, modern humans–is about 200,000-300,000 years old. Our ancestor, Homo heidelbergensis, lived in Africa around 700,000-300,000 years ago.

Around 700,000 years ago, another group of humans split off from the main group. By 400,000 years ago, their descendants, Homo neanderthalensis–Neanderthals–had arrived in Europe, and another band of their descendants, the enigmatic Denisovans, arrived in Asia.

While we have found quite a few Neanderthal remains and archaeological sites with tools, hearths, and other artifacts, we’ve uncovered very few Denisovan remains–a couple of teeth, a finger bone, and part of an arm in Denisova Cave, Russia. (Perhaps a few other remains I am unaware of.)

Yet from these paltry remains scientists have extracted enough DNA to ascertain not only that Denisovans were a distinct species, but also that Melanesians, Papuans, and Aborigines derive about 3-6% of their DNA from Denisovan ancestors. (All non-African populations also have a small amount of Neanderthal DNA, derived from Neanderthal ancestors.)

If Neanderthals and Homo sapiens interbred, and Denisovans and Homo sapiens interbred, did Neanderthals and Denisovans ever mate?

The slightly more complicated family tree, not including Denny

Yes.

The girl, affectionately nicknamed Denny, lived and died about 90,000 years ago in Siberia. The remains of an arm, found in Denisova Cave, reveal that her mother was a Neanderthal, her father a Denisovan.

We don’t yet know what Denisovans looked like, because we don’t have any complete skeletons of them, much less good skulls to examine, so we don’t know what a Neanderthal-Denisovan hybrid like Denny looked like.

But the fact that we can extract so much information from a single bone–or fragment of bone–preserved in a Siberian cave for 90,000 years–is amazing.

We are still far from truly understanding what sorts of people our evolutionary cousins were, but we are gaining new insights all the time.

Why are there no Han Chinese Fields Medalists?

IQ by country

I am specifically referring to Han Chinese from the People’s Republic of China (hereafter simply called “China,”) but wanted to keep the title to a reasonable length.

There are about a billion Han Chinese. They make up about 90% of the PRC, and they have some of the highest average IQs on the planet, with particularly good math scores.

Of the 56 Fields Medals (essentially, the Nobel for Math) awarded since 1936, 12 (21%) have been French. 14 or 15 have been Jewish, or 25%-27%.

By contrast, 0 have been Han Chinese from China itself.

France is a country of 67.15 million people, of whom about 51 million are native French. The world has about 14-17.5 million Jews. China has about 1.37 billion people, of whom 91.51% are Han, or about 1.25 billion.
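To make the per-capita gap concrete, here is a quick back-of-the-envelope calculation using the figures above (a rough sketch only: the populations are current, while the medals accumulated since 1936; midpoints are used where the post gives ranges):

```python
# Rough Fields-Medals-per-capita comparison, using the population and
# medal figures quoted in the post. Illustrative arithmetic, not rigorous.
groups = {
    # group: (population, Fields Medals)
    "native French":     (51e6,    12),
    "Jews (worldwide)":  (15.75e6, 14.5),  # midpoints of 14-17.5M and 14-15
    "Han Chinese (PRC)": (1.25e9,  0),
}

for name, (pop, medals) in groups.items():
    per_10m = medals / pop * 1e7  # medals per 10 million people
    print(f"{name}: {per_10m:.2f} medals per 10 million people")
```

Even one or two Han Chinese medalists from the PRC would still leave a gap of two orders of magnitude against the French rate.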

Two ethnically Chinese people have won Fields Medals:

Shing-Tung Yau was born in China, but is of Hakka ancestry (the Hakka are an Asian “market-dominant minority,”) not Han. His parents moved to Hong Kong when he was a baby; after graduating from the Chinese University of Hong Kong, he moved to the US, where he received his PhD from Berkeley. Yau was a citizen of British-owned Hong Kong (not the People’s Republic of China) when he won the Fields Medal in 1982; today he holds American citizenship.

Terence Tao, the 2006 recipient, is probably Han (Wikipedia does not list his ethnicity.) His father hailed from Shanghai, China, but moved to Hong Kong, where he graduated from medical school and met Tao’s mother, who was also from Hong Kong. Tao himself was born in Australia and later moved to the US. (Tao appears to be a dual Australian-American citizen.)

(With only 7.4 million people, Hong Kong is doing pretty well for itself in terms of Fields Medalists with some form of HK ancestry or citizenship.)

Since not many Fields Medals have been awarded, it is understandable why the citizens of small countries, even very bright ones, like Singapore, might not have any. It’s also understandable why top talent often migrates to places like Hong Kong, Australia, or the US. But China is a huge country with a massive pool of incredibly smart people–just look at Shanghai’s PISA scores. Surely Beijing has at least a dozen universities filled with math geniuses.

So where are they?

Is it a matter of funding? Has China chosen to funnel its best mathematicians into applied work? A matter of translation? Does the Fields Medal Committee have trouble reading papers written in Chinese? A matter of time? Did China’s citizens simply spend too much of the past century struggling at the edge of starvation to send a bunch of kids off to university to study math, and only recently achieve the level of mass prosperity necessary to start on the Fields path?

Whatever the causes of current under-representation, I have no doubt the next century will show an explosion in Han Chinese mathematical accomplishments.

Further thoughts on Epigenetics and Public Policy

Shea Robison of Epigenetics and Public Policy has kindly replied to my previous comment there with a post of his own, Why Epigenetics and Politics?

To briefly quote Robison:

In general, there are two main emphases for my interests in epigenetics: the scientific side, and the political/philosophical aspects. These are necessarily related to each other in many different ways (e.g., the political and philosophical aspects would not exist without the scientific work being done in epigenetics, just as politics has had a substantial influence on the development of the science—again, a major focus of my book), but they can also be quite disconnected from each other in different contexts (e.g., political uses can be made of the science which ignore important findings or critical assumptions).

He points out that there is in fact a fair amount of legislation criminalizing drug use during pregnancy:

… just after I read this comment someone posted an extensive thread on my Twitter feed loaded with references on just this issue of ‘crack babies,’ and particularly about the differences in framing and policy narratives due to politically salient issues such as race (here: http://bit.ly/2KQHWFh). I also did a quick Google search on “criminalization of drug use during pregnancy” (http://bit.ly/2wrpvE4), which came up with 31,900,000 results about all the different ways that this kind of thing is actually a substantial focus of public policy and government action. …

An interesting subject in its own right, but I shan’t quote the whole post; you can read it on Robison’s blog.

For those of you following along, here is my response:

Disclaimer: a friend of mine was a crack baby. She’s a lovely person, but is constantly in pain. I don’t want people to end up like her, but I’m glad she exists. My first impulse is that someone who would do such a horrible thing to a baby deserves to be punished, but would this kind of legislation simply have encouraged her biological mother to have an abortion? My friend is not suicidal; she wants to be alive, even if life is difficult.

In general, such legislation strikes me as misguided. Getting the police involved is highly punitive (what are the effects on a developing fetus of being arrested and involuntarily committed?) and has a major negative effect on people’s ability to engage in the kinds of productive behaviors associated with getting off drugs, like holding down a job or maintaining stable housing.

We can look at the gov’t’s history with public health programs. Some turned out better than others. Prohibition went quite badly. Everything about the “Opioid Epidemic” looks like it’s spiraling out of control. On the other hand, regulation on cigarette advertising has probably been beneficial; I suppose the jury is still out on the long-term effects of marijuana deregulation in some states.

Gov’t nutrition policy was probably good when it gave people food stamps but quite bad when it promoted trans fats (based on nutrition “research” that was not nearly as sound as people thought it was–which should be a sobering lesson about the urge to let politicians make public policy off what they think the science says.)

Not that my opinions count for much, but I think programs that promote healthy behaviors would be more effective and beneficial in the long run.

But let’s look at another example. Drug use is generally limited to individual behavior, often of very poor or otherwise marginalized people. What about mining disasters, toxic waste spills, wars, etc? Perhaps the guilty parties should be made to pay–but mining companies already routinely declare bankruptcy to avoid paying for their mistakes–I doubt this behavior is going to change. Any legislation in this direction, while well-intended, seems likely to be ineffective at best.

Of course the gov’t itself has certain obligations. What about the children of the men marched into atomic bomb blasts, or exposed to agent orange? (eg https://www.youtube.com/watch?v=ZWSMoE3A5DI) But the gov’t has so far been really bad about paying up for the damage it caused directly, much less the claims of the children of those hurt, so I’m not holding my breath on this. [“I got tortured in a Nazi POW camp and all I got for it was an accusation of going AWOL.”]

There seems also a risk of discounting people’s present abilities based on epigenetic claims. Right now the left likes to claim things like “Native Americans suffer epigenetic trauma that makes them do badly in school and continue the cycle of violence,” but it is easy to see how this can morph into “Native Americans are helplessly destined to be dumb and violent.”

I think the Left wants epigenetics to be a “get out of genetics free” card, but in the process they’re replicating genetic determinism, just at a different point in the organism. They’re going to be very disappointed if the results of their advocacy are the police arresting poor black women because they couldn’t afford prenatal vitamins.

Philosophy:
The question of what exactly the Founding Fathers (and common man) thought in 1776 is fascinating, but not necessarily relevant to current policies. Those Americans believed a wide variety of things, from Puritans (predestination) to Southern aristocrats (slavery) to affable Quakers. I suspect equality was not so much a philosophical position for the average man as a practical one–the man who survived by his wits in the wilderness, clearing the land, building his farm, etc., far from the civilizing and protective umbrella of the cities, was a law unto himself, enforced by violence or not at all. In essence, man was free not because he had read Locke, but because he had a gun and would shoot anyone who said otherwise.

Regardless, no matter how, erm, clean your genome is, you don’t get from genes to Lockean blank slates. You still have genes, some good, some bad.

Most people (even libs) acknowledge a combination of “nature and nurture” in shaping the individual. It’s only at the extremes that people start wholesale denying that genetic differences exist (“Women only do worse than men at sports because they’ve internalized norms of femininity that make them lose;” “Dog breeds are identical in temperament; any differences in behavior are entirely due to training,”) but there is always pressure on moderates not to contradict the extremes; not to mention a fear among libs that any acknowledgement of genetic differences between people or groups will embolden the conservatives.

In this they are simply wrong–genetic differences exist; the Blank Slate is nonsense; and people who think genes influence behavior tend to be more tolerant, not less (eg https://mobile.twitter.com/SteveStuWill/status/995978801518559234/photo/1#tweet_995978801518559234 ).

Modern liberalism cannot be saved so long as it rests on incorrect factual statements about the world. Sooner or later the results are either mass suffering (eg, the Soviet Union) or mass discrediting of the idea and the rise of a new one.

>”We already have ways of describing and discussing inheritance of behaviors from environmental exposures (e.g., psychology),”

Unfortunately, much of psychology is terribly broken. Priming, stereotype threat, implicit bias, Freudianism… they’re all either nonsense or have failed to replicate. The most reliable results, imo, are drug-related. Prozac works pretty well for depression; lithium works for bipolar; risperidone lets schizophrenics lead relatively normal lives. Any drug can be abused or mis-prescribed, but the results with some of the most effective psychiatric drugs are really quite amazing. Psychology, by contrast, is at best talking with people about their problems, giving them a supportive space to vent and think things through.

In the end, I suppose my thoughts summarize to caution. It is easy to over-estimate how much we know and toss together legislation that ends up having unexpected effects. There might be some areas where better knowledge of epigenetic effects can lead to superior policy making (various dietary/nutrition supplementation programs like the promotion of folic acid for expectant mothers, fluoridated water, iodized salt, etc., have already IMO caused great improvement in public health.)

I hope people can be on the lookout for ways to improve life, rather than merely punish.

 

North Africa in Genetics and History

detailed map of African and Middle Eastern ethnicities in Haak et al’s dataset

North Africa is an often misunderstood region in human genetics. Since it is in Africa, people often assume that it contains the same variety of people referenced in terms like “African Americans,” “black Africans,” or even just “Africans.” In reality, the African continent contains members of all three of the great human clades–Sub-Saharan Africans in the south, Austronesians (Asian clade) in Madagascar, and Caucasians in the north.

The North African Middle Stone Age and its place in recent human evolution provides an overview of the first 275,000 years of humanity’s history in the region (300,000-25,000 years ago, more or less), including the development of symbolic culture and early human dispersal. Unfortunately the paper is paywalled.

Throughout most of human history, the Sahara–not the Mediterranean or Red seas–has been the biggest local impediment to human migration–thus North Africans are much closer, genetically, to their neighbors in Europe and the Middle East than their neighbors across the desert (and before the domestication of the camel, about 3,000 years ago, the Sahara was even harder to cross.)

But from time to time, global weather patterns change and the Sahara becomes a garden: the Green Sahara. The last time we had a Green Sahara was about 9,000-7,000 years ago; during this time, people lived, hunted, fished, herded, and perhaps farmed throughout areas that are today nearly uninhabited wastes.

The Peopling of the last Green Sahara revealed by high-coverage resequencing of trans-Saharan patrilineages sheds light on how the Green (and subsequently brown) Sahara affected the spread (and separation) of African groups into northern and sub-Saharan populations:

In order to investigate the role of the last Green Sahara in the peopling of Africa, we deep-sequence the whole non-repetitive portion of the Y chromosome in 104 males selected as representative of haplogroups which are currently found to the north and to the south of the Sahara. … We find that the coalescence age of the trans-Saharan haplogroups dates back to the last Green Sahara, while most northern African or sub-Saharan clades expanded locally in the subsequent arid phase. …

Our findings suggest that the Green Sahara promoted human movements and demographic expansions, possibly linked to the adoption of pastoralism. Comparing our results with previously reported genome-wide data, we also find evidence for a sex-biased sub-Saharan contribution to northern Africans, suggesting that historical events such as the trans-Saharan slave trade mainly contributed to the mtDNA and autosomal gene pool, whereas the northern African paternal gene pool was mainly shaped by more ancient events.

In other words, modern North Africans have some maternal (female) Sub-Saharan DNA that arrived recently via the Islamic slave trade, but most of their Sub-Saharan Y-DNA (male) is much older, hailing from the last time the Sahara was easy to cross.

Note that not much DNA is shared across the Sahara:

After the African humid period, the climatic conditions became rapidly hyper-arid and the Green Sahara was replaced by the desert, which acted as a strong geographic barrier against human movements between northern and sub-Saharan Africa.

A consequence of this is that there is a strong differentiation in the Y chromosome haplogroup composition between the northern and sub-Saharan regions of the African continent. In the northern area, the predominant Y lineages are J-M267 and E-M81, with the former being linked to the Neolithic expansion in the Near East and the latter reaching frequencies as high as 80 % in some north-western populations as a consequence of a very recent local demographic expansion [8-10]. On the contrary, sub-Saharan Africa is characterised by a completely different genetic landscape, with lineages within E-M2 and haplogroup B comprising most of the Y chromosomes. In most regions of sub-Saharan Africa, the observed haplogroup distribution has been linked to the recent (~ 3 kya) demic diffusion of Bantu agriculturalists, which brought E-M2 sub-clades from central Africa to the East and to the South [11-17]. On the contrary, the sub-Saharan distribution of B-M150 seems to have more ancient origins, since its internal lineages are present in both Bantu farmers and non-Bantu hunter-gatherers and coalesce long before the Bantu expansion [18-20].

In spite of their genetic differentiation, however, northern and sub-Saharan Africa share at least four patrilineages at different frequencies, namely A3-M13, E-M2, E-M78 and R-V88.

A recent article in Nature, “Whole Y-chromosome sequences reveal an extremely recent origin of the most common North African paternal lineage E-M183 (M81),” tells some of North Africa’s fascinating story:

Here, by using whole Y chromosome sequences, we intend to shed some light on the historical and demographic processes that modelled the genetic landscape of North Africa. Previous studies suggested that the strategic location of North Africa, separated from Europe by the Mediterranean Sea, from the rest of the African continent by the Sahara Desert and limited to the East by the Arabian Peninsula, has shaped the genetic complexity of current North Africans15,16,17. Early modern humans arrived in North Africa 190–140 kya (thousand years ago)18, and several cultures settled in the area before the Holocene. In fact, a previous study by Henn et al.19 identified a gradient of likely autochthonous North African ancestry, probably derived from an ancient “back-to-Africa” gene flow prior to the Holocene (12 kya). In historic times, North Africa has been populated successively by different groups, including Phoenicians, Romans, Vandals and Byzantines. The most important human settlement in North Africa was conducted by the Arabs by the end of the 7th century. Recent studies have demonstrated the complexity of human migrations in the area, resulting from an amalgam of ancestral components in North African groups15,20.

According to the article, E-M81 is dominant in Northwest Africa and absent almost everywhere else in the world.

The authors tested various men across North Africa in order to draw up a phylogenetic tree of the branching of E-M183:

The distribution of each subhaplogroup within E-M183 can be observed in Table 1 and Fig. 2. Indeed, different populations present different subhaplogroup compositions. For example, whereas in Morocco almost all subhaplogroups are present, Western Sahara shows a very homogeneous pattern with only E-SM001 and E-Z5009 being represented. A similar picture to that of Western Sahara is shown by the Reguibates from Algeria, which contrast sharply with the Algerians from Oran, which showed a high diversity of haplogroups. It is also worth to notice that a slightly different pattern could be appreciated in coastal populations when compared with more inland territories (Western Sahara, Algerian Reguibates).

Overall, the authors found that the haplotypes were “strikingly similar” to each other and showed little geographic structure besides the coastal/inland differences:

As proposed by Larmuseau et al.25, the scenario that better explains Y-STR haplotype similarity within a particular haplogroup is a recent and rapid radiation of subhaplogroups. Although the dating of this lineage has been controversial, with dates proposed ranging from Paleolithic to Neolithic and to more recent times17,22,28, our results suggested that the origin of E-M183 is much more recent than was previously thought. … In addition to the recent radiation suggested by the high haplotype resemblance, the pattern showed by E-M183 imply that subhaplogroups originated within a relatively short time period, in a burst similar to those happening in many Y-chromosome haplogroups23.

In other words, someone went a-conquering.

Alternatively, given the high frequency of E-M183 in the Maghreb, a local origin of E-M183 in NW Africa could be envisaged, which would fit the clear pattern of longitudinal isolation by distance reported in genome-wide studies15,20. Moreover, the presence of autochthonous North African E-M81 lineages in the indigenous population of the Canary Islands, strongly points to North Africa as the most probable origin of the Guanche ancestors29. This, together with the fact that the oldest indigenous individuals have been dated 2210 ± 60 ya, supports a local origin of E-M183 in NW Africa. Within this scenario, it is also worth to mention that the paternal lineage of an early Neolithic Moroccan individual appeared to be distantly related to the typically North African E-M81 haplogroup30, suggesting again a NW African origin of E-M183. A local origin of E-M183 in NW Africa > 2200 ya is supported by our TMRCA estimates, which can be taken as 2,000–3,000, depending on the data, methods, and mutation rates used.

However, the authors also note that they can’t rule out a Middle Eastern origin for the haplogroup since their study simply doesn’t include genomes from Middle Eastern individuals. They rule out a spread during the Neolithic expansion (too early) but not the Islamic expansion (“an extensive, male-biased Near Eastern admixture event is registered ~1300 ya, coincidental with the Arab expansion20.”) Alternatively, they suggest E-M183 might have expanded near the end of the Third Punic War. Sure, Carthage (in Tunisia) was defeated by the Romans, but the era was otherwise one of great North African wealth and prosperity.

 

Interesting papers! My hat’s off to the authors. I hope you enjoyed them and get a chance to RTWT.

How to Minimize “Emotional Labor” and “Mental Load”: A Guide for Frazzled Women

A comic strip in the Guardian recently alerted me to the fact that many women are exhausted from the “Mental Load” of thinking about things and need their husbands to pitch in and help. Go ahead and read it.

Whew. There’s a lot to unpack here:

  1. Yes, you have to talk to men. DO NOT EXPECT OTHER PEOPLE TO KNOW WHAT YOU ARE THINKING. Look, if I can get my husband to help me when I need it, you certainly can too. That or you married the wrong man.
  2. Get a dayplanner and write things like “grocery lists” and doctors appointments in it. There’s probably one built into your phone.

There, I solved your problems.

That said, female anxiety (at least in our modern world) appears to be a real thing:

(though American Indians are the real untold story in this graph.)

According to the America’s State of Mind Report (PDF):

Medco data shows that antidepressants are the most commonly used mental health medications and that women have the highest utilization rates.  In 2010, 21 percent of women ages 20 and older were using an antidepressant.  … Men’s use of antidepressants is almost half that of women, but has also been on the rise with a 28 percent increase over the past decade. …

Anxiety disorders are the most common psychiatric illnesses affecting children and adults. … Although anxiety disorders are highly treatable, only about one‐third of sufferers receive treatment. …

Medco data shows that women have the highest utilization rate of anti‐anxiety medications; in fact, 11 percent of middle‐aged women (ages 45‐64) were on an anti‐anxiety drug treatment in 2010, nearly twice the rate of their male counterparts (5.7 percent).

And based on the age group data, women in their prime working years (but waning childbearing years) have even higher rates of mental illness. (Adult women even take ADHD medicine at slightly higher rates than adult men.)

What causes this? Surely 20% of us–one in 5–can’t actually be mentally ill, can we? Is it biology or culture? Or perhaps a mismatch between biology and culture?

Or perhaps we should just scale back a little, and when we have friends over for dinner, just order a pizza instead of trying to cook two separate meals?

But if you think that berating your husband for merely taking a bottle out of the dishwasher when you asked him to get a bottle out of the dishwasher (instead of realizing this was code for “empty the entire dishwasher”) will make you happier, think again. “Couples who share the workload are more likely to divorce, study finds“:

Divorce rates are far higher among “modern” couples who share the housework than in those where the woman does the lion’s share of the chores, a Norwegian study has found. …

Norway has a long tradition of gender equality and childrearing is shared equally between mothers and fathers in 70 per cent of cases. But when it comes to housework, women in Norway still account for most of it in seven out of 10 couples. The study emphasised women who did most of the chores did so of their own volition and were found to be as “happy” as those in “modern” couples. …

The researchers expected to find that where men shouldered more of the burden, women’s happiness levels were higher. In fact they found that it was the men who were happier while their wives and girlfriends appeared to be largely unmoved.

Those men who did more housework generally reported less work-life conflict and were scored slightly higher for wellbeing overall.

Theory: well-adjusted people who love each other are happy to do what it takes to keep the household running and don’t waste time passive-aggressively trying to convince their spouse that he’s a bad person for not reading her mind.

Now let’s talk about biology. The author claims,

Of course, there’s nothing genetic or innate about this behavior. We’re not born with an all-consuming passion for clearing tables, just like boys aren’t born with an utter disinterest for things lying around.

Of course, the author doesn’t cite any papers from the fields of genetics or behavioral psychology to back up her claims. Just as she feels entitled to expect other people to read her mind (and absurdly imagines that a good project manager at work doesn’t bother to tell the team what needs to be done), she feels no compulsion to cite any proof. Science says so. We know because some cartoonist on the internet claimed it did.

Over in reality-land, when we make scientific claims about things like genetics, we cite our sources. And women absolutely have an instinct for cleaning things: the Nesting Instinct. No, it isn’t present when we’re born. It kicks in when we’re pregnant–often shortly before going into labor. Here’s an actual scientific paper on the Nesting Instinct published in the scientific journal Evolution and Human Behavior:

In altricial mammals, “nesting” refers to a suite of primarily maternal behaviours including nest-site selection, nest building and nest defense, and the many ways that nonhuman animals prepare themselves for parturition are well studied. In contrast, little research has considered pre-parturient preparation behaviours in women from a functional perspective.

According to the university’s press release about the study:

The overwhelming urge that drives many pregnant women to clean, organize and get life in order—otherwise known as nesting—is not irrational, but an adaptive behaviour stemming from humans’ evolutionary past.

Researchers from McMaster University suggest that these behaviours—characterized by unusual bursts of energy and a compulsion to organize the household—are a result of a mechanism to protect and prepare for the unborn baby.

Women also become more selective about the company they keep, preferring to spend time only with people they trust, say researchers.

In short, having control over the environment is a key feature of preparing for childbirth, including decisions about where the birth will take place and who will be welcome.

“Nesting is not a frivolous activity,” says Marla Anderson, lead author of the study and a graduate student in the Department of Psychology, Neuroscience & Behaviour.  “We have found that it peaks in the third trimester as the birth of the baby draws near and is an important task that probably serves the same purpose in women as it does in other animals.”

Even Wikipedia cites a number of sources on the subject:

Nesting behaviour refers to an instinct or urge in pregnant animals caused by the increase of estradiol (E2) [1] to prepare a home for the upcoming newborn(s). It is found in a variety of animals such as birds, fish, squirrels, mice and pigs as well as humans.[2][3]

Nesting is pretty much impossible to miss if you’ve ever been pregnant or around pregnant women.

Of course, this doesn’t prove the instinct persists beyond pregnancy (though in my personal case it definitely did.)

By the way, estradiol is the main form of estrogen, which is found in much higher levels in women than men. (Just to be rigorous, here’s data on estrogen levels in normal men and women.)

So if high estradiol levels make a variety of mammals–including humans–want to clean things, and women between puberty and menopause consistently have higher levels of estrogen than men, then it seems fairly likely that women actually do have, on average, a higher innate, biological, instinctual, even genetic urge to clean and organize their homes than men do.

But returning to the comic, the author claims:

But we’re born into a society in which very early on, we’re given dolls and miniature vacuum cleaners, and in which it seems shameful for boys to like those same toys.

What bollocks. I used to work at a toystore. Yes, we stocked toy vacuum cleaners and the like in a “Little Helpers” set. We never sold a single one, and I worked there over Christmas. (Great times.)

I am always on the lookout for toys my kids would enjoy and receive constant feedback on whether they like my choices. (“A book? Why did Santa bring me a book? Books are boring!”)

I don’t spend money getting more of stuff my kids aren’t interested in. A child who doesn’t like dolls isn’t going to get a bunch of dolls and be ordered to sit and play with them and nothing else. A child who doesn’t like trucks isn’t going to get a bunch of trucks.

Assuming that other parents are neither stupid (unable to tell which toys their children like) nor evil (forcing their children to play with specific toys even though they know they don’t like them,) I conclude that children’s toys reflect the children’s actual preferences, not the parents’ (for goodness’s sake, if it were up to me, I’d socialize my children to be super-geniuses who spend all of their time reading textbooks and whose toys are all science and math manipulatives, not toy dump trucks!)

Even young rhesus monkeys–who cannot talk and obviously have not been socialized into human gender norms–have the same gendered toy preferences as humans:

We compared the interactions of 34 rhesus monkeys, living within a 135 monkey troop, with human wheeled toys and plush toys. Male monkeys, like boys, showed consistent and strong preferences for wheeled toys, while female monkeys, like girls, showed greater variability in preferences. Thus, the magnitude of preference for wheeled over plush toys differed significantly between males and females. The similarities to human findings demonstrate that such preferences can develop without explicit gendered socialization.

Young female chimps also make their own dolls:

Now new research suggests that such gender-driven desires are also seen in young female chimpanzees in the wild—a behavior that possibly evolved to make the animals better mothers, experts say.

Young females of the Kanyawara chimpanzee community in Kibale National Park, Uganda, use sticks as rudimentary dolls and care for them like the group’s mother chimps tend to their real offspring. The behavior, which was very rarely observed in males, has been witnessed more than a hundred times over 14 years of study.

In Jane Goodall’s revolutionary research on the Gombe Chimps, she noted the behavior of young females who often played with or held their infant siblings, in contrast to young males who generally preferred not to.

And just as estradiol levels have an effect on how much cleaning women want to do, so androgen levels have an effect on which toys children prefer to play with:

Gonadal hormones, particularly androgens, direct certain aspects of brain development and exert permanent influences on sex-typical behavior in nonhuman mammals. Androgens also influence human behavioral development, with the most convincing evidence coming from studies of sex-typical play. Girls exposed to unusually high levels of androgens prenatally, because they have the genetic disorder, congenital adrenal hyperplasia (CAH), show increased preferences for toys and activities usually preferred by boys, and for male playmates, and decreased preferences for toys and activities usually preferred by girls. Normal variability in androgen prenatally also has been related to subsequent sex-typed play behavior in girls, and nonhuman primates have been observed to show sex-typed preferences for human toys. These findings suggest that androgen during early development influences childhood play behavior in humans at least in part by altering brain development.

But the author of the comic strip would like us to believe that gender roles are a result of watching the wrong stuff on TV:

And in which culture and media essentially portray women as mothers and wives, while men are heroes who go on fascinating adventures away from home.

I don’t know about you, but I grew up in the Bad Old Days of the 80s when She-Ra, Princess of Power, was kicking butt on TV; little girls were being magically transported to Ponyland to fight evil monsters; and Rainbow Brite defeated the evil King of Shadows and saved the Color Kids.

 

If you’re older than me, perhaps you grew up watching Wonder Woman (first invented in 1941) and Leia Skywalker; and if you’re younger, Dora the Explorer and Katniss Everdeen.

If you can’t find adventurous female characters in movies or TV, YOU AREN’T LOOKING.

I mentioned this recently: it’s like the Left has no idea what the past–anytime before last Tuesday–actually contained. Somehow the 60s, 70s, 80s, 90s, and 2000s have entirely disappeared, and they live in a timewarp where we are connected directly to the media and gender norms of over half a century ago.

Enough. The Guardian comic is a load of entitled whining from someone who actually thinks that other people are morally obligated to try to read her mind. She has the maturity of a bratty teenager (“You should have known I hate this band!”) and needs to learn how to actually communicate with others instead of complaining that it’s everyone else who has a problem.

/fin.

Review: Numbers and the Making of Us, by Caleb Everett

I’m about halfway through Caleb Everett’s Numbers and the Making of Us: Counting and the Course of Human Cultures. Everett begins the book with a lengthy clarification that he thinks everyone in the world has equal math abilities, some of us just happen to have been exposed to more number ideas than others. Once that’s out of the way, the book gets interesting.

When did humans invent numbers? It’s hard to say. We have notched sticks from the Paleolithic, but no way to tell if these notches were meant to signify numbers or were just decorative.

The slightly more recent Ishango, Lebombo, and Wolf bones seem more likely to indicate that someone was at least counting–if not keeping track–of something.

The Ishango bone (estimated 20,000 years old, found in the Democratic Republic of the Congo near the headwaters of the Nile,) has three sets of notches–two sets total to 60, the third to 48. Interestingly, the notches are grouped: one set of sixty is composed entirely of primes, 19 + 17 + 13 + 11, while the other sums 9 + 19 + 21 + 11. The set of 48 contains groups of 3, 6, 4, 8, 10, 5, 5, and 7. Aside from the stray seven, the sequence tantalizingly suggests that someone was doubling numbers.

Ishango Bone

The Ishango bone also has a quartz point set into the end, which perhaps allowed it to be used for scraping, drawing, or etching–or perhaps it just looked nice atop someone’s decorated bone.
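The notch groupings reported above are easy to check with a few lines of Python:

```python
# Notch groups on the Ishango bone, as reported above.
first_60 = [19, 17, 13, 11]            # all primes
second_60 = [9, 19, 21, 11]
set_48 = [3, 6, 4, 8, 10, 5, 5, 7]     # 3->6, 4->8, 5->10 hint at doubling

print(sum(first_60), sum(second_60), sum(set_48))  # 60 60 48
```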

The Lebombo bone, (estimated 43,000–44,200 years old, found near the border between South Africa and Swaziland,) is quite similar to the Ishango bone, but only contains 29 notches (as far as we can tell–it’s broken.)

I’ve seen a lot of people proclaiming “Scientists think it was used to keep track of menstrual cycles. Menstruating African women were the first mathematicians!” so I’m just going to let you in on a little secret: scientists have no idea what it was for. Maybe someone was just having fun putting notches on a bone. Maybe someone was trying to count all of their relatives. Maybe someone was counting days between new and full moons, or counting down to an important date.

Without a far richer archaeological assemblage than one bone, we have no idea what this particular person might have wanted to count or keep track of. (Also, why would anyone want to keep track of menstrual cycles? You’ll know when they happen.)

The Wolf bone (30,000 years old, Czech Republic,) has received far less interest from folks interested in proclaiming that menstruating African women were the first mathematicians, but is a nice looking artifact with 60 notches–notches 30 and 31 are significantly longer than the others, as though marking a significant place in the counting (or perhaps just the middle of the pattern.)

Everett cites another, more satisfying tally stick: a 10,000 year old piece of antler found in the anoxic waters of Little Salt Spring, Florida. The antler contains two sets of marks: 28 (or possibly 29–the top is broken in a way that suggests another notch might have been a weak point contributing to the break) large, regular, evenly spaced notches running up the antler, and a much smaller set of notches set beside and just slightly beneath the first. It definitely looks like someone was ticking off quantities of something they wanted to keep track of.

Here’s an article with more information on Little Salt Spring and a good photograph of the antler.

I consider the bones “maybes” and the Little Salt Spring antler a definite for counting/keeping track of quantities.

Inca Quipu

Everett also mentions a much more recent and highly inventive tally system: the Incan quipu.

A quipu is made of knotted strings attached to one central string. A series of knots along the length of each string denotes numbers–one knot for 1, two for 2, etc. The knots are grouped in clusters, allowing place value–first cluster for the ones, second for the tens, third for hundreds, etc. (And a blank space for a zero.)

Thus a sequence of 2 knots (ones), 4 knots (tens), a space (hundreds), and 5 knots (thousands) = 5,042

The Incas, you see, had an empire to administer, no paper, but plenty of lovely alpaca wool. So being inventive people, they made do.
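Since the scheme is just positional base ten written in knots, decoding it is nearly a one-liner. Here’s a minimal sketch (the function name `quipu_to_int` is mine):

```python
def quipu_to_int(clusters):
    """Decode quipu knot clusters into an integer.

    clusters: knot counts read from the ones place upward,
    with 0 standing in for a blank space on the string.
    """
    total = 0
    for place, knots in enumerate(clusters):
        total += knots * 10 ** place
    return total

print(quipu_to_int([3, 0, 2]))  # 3 ones, no tens, 2 hundreds -> 203
```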

Everett then discusses the construction of names for numbers/base systems in different languages. Many languages use a combination of different bases, eg, “two twos” for four, (base 2,) “two hands” to signify 10 (base 5,) and from there, words for multiples of 10 or 20, (base 10 or 20,) can all appear in the same language. He argues convincingly that most languages derived their counting words from our original tally sticks: fingers and toes, found in quantities of 5, 10, and 20. So the number for 5 in a language might be “one hand”, the number for 10, “Two hands,” and the number for 20 “one person” (two hands + two feet.) We could express the number 200 in such a language by saying “two hands of one person”= 10 x 20.

(If you’re wondering how anyone could come up with a base 60 system, such as we inherited from the Babylonians for telling time, try using the knuckles of the four fingers on one hand [12] times the fingers of the other hand [5] to get 60.)
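A body-part system like the one described amounts to a mixed base-5/base-20 decomposition. A sketch in Python (the function and its English glosses are hypothetical, purely for illustration):

```python
def body_parts(n):
    """Express n in 'persons' (20), 'hands' (5), and leftover fingers."""
    persons, rest = divmod(n, 20)
    hands, fingers = divmod(rest, 5)
    parts = []
    if persons:
        parts.append(f"{persons} person(s)")
    if hands:
        parts.append(f"{hands} hand(s)")
    if fingers:
        parts.append(f"{fingers} finger(s)")
    return " and ".join(parts) if parts else "nothing"

print(body_parts(17))   # 3 hand(s) and 2 finger(s)
print(body_parts(200))  # 10 person(s)
```

Note that this additive sketch renders 200 as “10 persons,” whereas real counting systems often go multiplicative for large quantities, as in “two hands of one person” = 10 × 20.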

This raises the question of what counts as a “number” word (numeral). Some languages, it is claimed, don’t have words for numbers higher than 3–but put out an array of 6 objects, and their speakers can construct numbers like “three twos.” Is this a number? What about the English numbers after twelve, like four-teen–really just a longstanding mispronunciation of four and ten?

Perhaps a better question than “Do they have a word for it,” is “Do they have a common, easy to use word for it?” English contains the word nonillion, but you probably don’t use it very often (and according to the dictionary, a nonillion is much bigger in Britain than in the US, which makes it especially useless.) By contrast, you probably use quantities like a hundred or a thousand all the time, especially when thinking about household budgets.

Roman Numerals are really just an advanced tally system with two bases: 5 and 10. IIII are clearly regular tally marks. V (5) is similar to our practice of crossing through four tally marks. X (10) is two Vs set together. L (50) is a rotated V. C (100) is an abbreviation for the Roman word Centum, hundred. (I, V, X, and L are not abbreviations.) I’m not sure why 500 is D; maybe just because D follows C and it looks like a C with an extra line. M is short for Mille, or thousand. Roman numerals are also unusual in their use of subtraction in writing numbers, which few systems do because it makes addition horrible. Eg, IV and VI are not the same number, nor do they equal 15 and 51. No, they equal 4 (V-1) and 6 (V+1,) respectively. Adding or multiplying large Roman numerals quickly becomes cumbersome; if you don’t believe me, try XLVII times XVIII with only a pencil and paper.
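If you’d rather skip the pencil, here’s a short Python sketch that does XLVII × XVIII by cheating: converting through our positional system and back (both converter functions are mine):

```python
VALUES = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    # A symbol placed before a larger one is subtracted (IV = 4).
    total = 0
    for i, ch in enumerate(s):
        if i + 1 < len(s) and VALUES[s[i + 1]] > VALUES[ch]:
            total -= VALUES[ch]
        else:
            total += VALUES[ch]
    return total

def int_to_roman(n):
    pairs = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
             (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
             (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    out = []
    for value, symbol in pairs:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

product = roman_to_int("XLVII") * roman_to_int("XVIII")  # 47 * 18 = 846
print(int_to_roman(product))  # DCCCXLVI
```

Doing the same multiplication natively in Roman notation, with no place value to line up, is exactly the chore the text describes.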

Now imagine you’re trying to run an empire this way.

You’re probably thinking, “At least those quipus had a zero and were reliably base ten,” about now.

Interestingly, the Mayans (and possibly the Olmecs) already had a proper symbol that they used for zero in their combination base-5/base-20 system with pretty functional place value at a time when the Greeks and Romans did not (the ancient Greeks were philosophically unsure about this concept of a “number that isn’t there.”)

(Note: given the level of sophistication of Native American civilizations like the Inca, Aztec, and Maya, and the fact that these developed in near total isolation, they must have been pretty smart. Their current populations appear to be under-performing relative to their ancestors.)

But let’s let Everett have a chance to speak:

Our increasingly refined means of survival and adaptation are the result of a cultural ratchet. This term, popularized by Duke University psychologist and primatologist Michael Tomasello, refers to the fact that humans cooperatively lock in knowledge from one generation to the next, like the clicking of a ratchet. In other words, our species’ success is due in large measure to individual members’ ability to learn from and emulate the advantageous behavior of their predecessors and contemporaries in their community. What makes humans special is not simply that we are so smart, it is that we do not have to continually come up with new solutions to the same old problems. …

Now this is eminently reasonable; I did not invent the calculus, nor could I have done so had it not already existed. Luckily for me, Newton and Leibniz already invented it and I live in a society that goes to great lengths to encode math in textbooks and teach it to students.

I call this “cultural knowledge” or “cultural memory,” and without it we’d still be monkeys with rocks.

The importance of gradually acquired knowledge stored in the community, culturally reified but not housed in the mind of any one individual, crystallizes when we consider cases in which entire cultures have nearly gone extinct because some of their stored knowledge dissipated due to the death of individuals who served as crucial nodes in their community’s knowledge network. In the case of the Polar Inuit of Northwest Greenland, population declined in the mid-nineteenth century after an epidemic killed several elders of the community. These elders were buried along with their tools and weapons, in accordance with local tradition, and the Inuits’ ability to manufacture the tools and weapons in question was severely compromised. … As a result, their population did not recover until about 40 years later, when contact with another Inuit group allowed for the restoration of the communal knowledge base.

The first big advance, the one that separates us from the rest of the animal kingdom, was language itself. Yes, other animals can communicate–whales and birds sing; bees do their waggle dance–but only humans have full-fledged, generative language which allows us to both encode and decode new ideas with relative ease. Language lets different people in a tribe learn different things and then pool their ideas far more efficiently than mere imitation.

The next big leap was the development of visual symbols we could record–and read–on wood, clay, wax, bones, cloth, cave walls, etc. Everett suggests that the first of these symbols were likely tally marks such as those found on the Lebombo bone, though of course the ability to encode a buffalo on the wall of the Lascaux cave, France, was also significant. From these first symbols we developed both numbers and letters, which eventually evolved into books.

Books are incredible. Books are like external hard drives for your brain, letting you store, access, and transfer information to other people well beyond your own limits of memorization and well beyond a human lifetime. Books reach across the ages, allowing us to read what philosophers, poets, priests and sages were thinking about a thousand years ago.

Recently we invented an even more incredible information storage/transfer device: computers/the internet. To be fair, they aren’t as sturdy as clay tablets, (fired clay is practically immortal,) but they can handle immense quantities of data–and make it searchable, an incredibly important task.

But Everett tries to claim that cultural ratchet is all there is to human mathematical ability. If you live in a society with calculus textbooks, then you can learn calculus, and if you don’t, you can’t. Everett does not want to imply that Amazonian tribesmen with no words for numbers bigger than three are in any way less able to do math than the Mayans with their place value system and fancy zero.

But this seems unlikely for two reasons. First, we know very well that even in societies with calculus textbooks, not everyone can make use of them. Even among my own children, who have been raised with about as similar an environment as a human can make and have very similar genetics, there’s a striking difference in intellectual strengths and weaknesses. Humans are not identical in their abilities.

Moreover, we know that different mental tasks are performed in different, specialized parts of the brain. For example, we decode letters in the “visual word form area” of the brain; people whose VWAs have been damaged can still read, but they have to use different parts of their brains to work out the letters and they end up reading more slowly than they did before.

Memorably, before he died, the late Henry Harpending (of West Hunter) had a stroke while in Germany. He initially didn’t notice the stroke because it was located in the part of the brain that decodes letters into words, but since he was in Germany, he didn’t expect to read the words, anyway. It was only when he looked at something written in English later that day that he realized he couldn’t read it, and soon after I believe he passed out and was taken to the hospital.

Why should our brains have a VWA at all? It’s not like our primate ancestors did a whole lot of reading. It turns out that the VWA is repurposed from the part of our brain that recognizes faces :)

Likewise, there are specific regions of the brain that handle mathematical tasks. People who are better at math not only have more gray matter in these regions, but they also have stronger connections between them, letting them work together in harmony to solve different problems. We don’t do math by just throwing all of our mental power at a problem, but by routing it through specific regions of our brain.

Interestingly, humans and chimps differ in their ability to recognize faces and perceive emotions. (For anatomical reasons, chimps are more inclined to identify each other’s bottoms than each other’s faces.) We evolved the ability to recognize faces–using the region of our brain that now also decodes letters–when we began walking upright and interacting with each other face to face, though we do have some vestigial interest in butts and butt-like regions (“My eyes are up here.”) Our brains have evolved over the millennia to get better at specific tasks–in this case, face reading, a precursor to decoding symbolic language.

And there is a tremendous quantity of evidence that intelligence is at least partly genetic–estimates for the heritability of intelligence range between 60 and 80%. The rest of the variation–the environmental part–looks to be essentially random chance, such as accidents, nutrition, or perhaps your third grade teacher.
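To make the arithmetic behind numbers like that concrete, here is a minimal sketch of how twin studies back out a heritability estimate, using Falconer’s formula. The correlations plugged in are hypothetical round numbers, not figures from any particular study:

```python
# Toy illustration of heritability estimation via Falconer's formula.
# The twin correlations below are hypothetical round numbers, not data
# from any particular study.

def falconer_h2(r_mz: float, r_dz: float) -> float:
    """Estimate heritability from twin correlations.

    h^2 = 2 * (r_MZ - r_DZ): identical twins share ~100% of their DNA,
    fraternal twins ~50%, so doubling the gap between their phenotypic
    correlations estimates the genetic share of the variance.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical IQ correlations in the general range twin studies report:
r_mz, r_dz = 0.85, 0.50
h2 = falconer_h2(r_mz, r_dz)
print(f"estimated heritability: {h2:.0%}")  # 70%, inside the 60-80% range
```

The leftover share (1 − h², minus anything attributable to shared family environment) is the “essentially random chance” part.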

So, yes, we absolutely can breed people for mathematical or linguistic ability, if that’s what the environment is selecting for. By contrast, if there have been no particular mathematical or linguistic selection pressures in an environment (a culture with no written language, no mathematical notation, and very few words for numbers is clearly not experiencing much pressure to use them), then you won’t select for such abilities. The question is not whether we can all be Newtons, (or Leibnizes,) but how many Newtons a society produces and how many people in that society have the potential to understand calculus, given the chance.

I do wonder why he made the graph so much bigger than the relevant part
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article

Just looking at the state of different societies around the world (including many indigenous groups that live within and have access to modern industrial or post-industrial technologies), there is clear variation in the average abilities of different groups to build and maintain complex societies. Japanese cities are technologically advanced, clean, and violence-free. Brazil, (which hasn’t even been nuked,) is full of incredibly violent, unsanitary, poorly-constructed favelas. Some of this variation is cultural, (Venezuela is doing particularly badly because communism doesn’t work,) or random chance, (Saudi Arabia has oil,) but some of it, by necessity, is genetic.
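The “smart fraction” logic behind La Griffe Du Lion’s graph is easy to sketch: assume IQ is normally distributed and count the share of the population above some technical threshold. The threshold of 106 and SD of 15 below are illustrative assumptions, not La Griffe’s fitted parameters:

```python
# A minimal sketch of the "smart fraction" idea: if IQ is normally
# distributed, the share of a population above a fixed technical
# threshold falls off steeply as the population mean drops.
# Threshold and SD here are illustrative assumptions.
from math import erf, sqrt

def smart_fraction(mean_iq: float, threshold: float = 106.0,
                   sd: float = 15.0) -> float:
    """Fraction of a Normal(mean_iq, sd) population above `threshold`."""
    z = (threshold - mean_iq) / sd
    # 1 - CDF(z) for the standard normal, via the error function
    return 0.5 * (1 - erf(z / sqrt(2)))

for mean in (105, 100, 90, 80):
    print(mean, f"{smart_fraction(mean):.1%}")
```

The point of the model is the nonlinearity: a one-SD difference in the mean cuts the above-threshold fraction by far more than half, which is why modest average differences can correspond to large differences in national outcomes.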

But if you find that a depressing thought, take heart: selective pressures can be changed. Start selecting for mathematical and verbal ability (and let everyone have a shot at developing those abilities) and you’ll get more mathematical and verbal abilities.

But this is getting long, so let’s continue our discussion next week.

2 Interesting studies: Early Humans in SE Asia and Genetics, Relationships, and Mental Illness

Ancient Teeth Push Back Early Arrival of Humans in Southeast Asia :

New tests on two ancient teeth found in a cave in Indonesia more than 120 years ago have established that early modern humans arrived in Southeast Asia at least 20,000 years earlier than scientists previously thought, according to a new study. …

The findings push back the date of the earliest known modern human presence in tropical Southeast Asia to between 63,000 and 73,000 years ago. The new study also suggests that early modern humans could have made the crossing to Australia much earlier than the commonly accepted time frame of 60,000 to 65,000 years ago.

I would like to emphasize that nothing based on a couple of teeth is conclusive, “settled,” or “proven” science. Samples can get contaminated, machines make errors, people play tricks–in the end, we’re looking for the weight of the evidence.

I am personally of the opinion that there were (at least) two ancient human migrations into south east Asia, but only time will tell if I am correct.

Genome-wide association study of social relationship satisfaction: significant loci and correlations with psychiatric conditions, by Varun Warrier, Thomas Bourgeron, Simon Baron-Cohen:

We investigated the genetic architecture of family relationship satisfaction and friendship satisfaction in the UK Biobank. …

In the DSM-5, difficulties in social functioning is one of the criteria for diagnosing conditions such as autism, anorexia nervosa, schizophrenia, and bipolar disorder. However, little is known about the genetic architecture of social relationship satisfaction, and if social relationship dissatisfaction genetically contributes to risk for psychiatric conditions. …

We present the results of a large-scale genome-wide association study of social relationship satisfaction in the UK Biobank measured using family relationship satisfaction and friendship satisfaction. Despite the modest phenotypic correlations, there was a significant and high genetic correlation between the two phenotypes, suggesting a similar genetic architecture between the two phenotypes.

Note: the two “phenotypes” here are “family relationship satisfaction” and “friendship satisfaction.”
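If it seems odd that two phenotypes can be only modestly correlated with each other yet share nearly all of their genetic influences, here is a toy simulation (with made-up effect sizes, not anything estimated from the Biobank) of how that happens:

```python
# Toy simulation: two phenotypes built from the SAME genetic factor plus
# large independent environmental noise. Genetic correlation is 1.0 by
# construction, yet the phenotypic correlation stays modest (~0.2).
# All effect sizes are invented for illustration.
import random

random.seed(0)
n = 50_000
fam, friend = [], []
for _ in range(n):
    g = random.gauss(0, 1)                 # shared genetic factor
    fam.append(0.5 * g + random.gauss(0, 1))    # + own environment
    friend.append(0.5 * g + random.gauss(0, 1))  # + own environment

def corr(x, y):
    """Pearson correlation, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    sx = (sum((a - mx) ** 2 for a in x) / len(x)) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / len(y)) ** 0.5
    return cov / (sx * sy)

print(f"phenotypic correlation: {corr(fam, friend):.2f}")
```

Here the genetic correlation is 1.0 by construction–both phenotypes use the identical genetic factor–but the large independent environmental terms drag the phenotypic correlation down to about 0.2, which is the “modest phenotypic correlations, high genetic correlation” pattern the authors describe.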

We first investigated if the two phenotypes were genetically correlated with psychiatric conditions. As predicted, most if not all psychiatric conditions had a significant negative correlation for the two phenotypes. … We observed significant negative genetic correlation between the two phenotypes and a large cross-condition psychiatric GWAS.[38] This underscores the importance of social relationship dissatisfaction in psychiatric conditions. …

In other words, people with mental illnesses generally don’t have a lot of friends nor get along with their families.

One notable exception is the negative genetic correlation between measures of cognition and the two phenotypes. Whilst subjective wellbeing is positively genetically correlated with measures of cognition, we identify a small but statistically significant negative correlation between measures of cognition and the two phenotypes.

Are they saying that smart people have fewer friends? Or that dumber people are happier with their friends and families? I think they are clouding this finding in intentionally obtuse language.

A recent study highlighted that people with very high IQ scores tend to report lower satisfaction with life with more frequent socialization.

Oh, I think I read that one. It’s not the socialization per se that’s the problem, but spending time away from the smart person’s intellectual activities. For example, I enjoy discussing the latest genetics findings with friends, but I don’t enjoy going on family vacations because they are a lot of work that does not involve genetics. (This is actually something my relatives complain about.)

…alleles that increase the risk for schizophrenia are in the same haplotype as alleles that decrease friendship satisfaction. The functional consequences of this locus must be formally tested. …

Loss of function mutations in these genes lead to severe biochemical consequences, and are implicated in several neuropsychiatric conditions. For example, de novo loss of function mutations in pLI intolerant genes confers significant risk for autism. Our results suggest that pLI > 0.9 genes contribute to psychiatric risk through both common and rare genetic variation.

When Did Black People Evolve?

In previous posts, we discussed the evolution of Whites and Asians, so today we’re taking a look at people from Sub-Saharan Africa.

Modern humans only left Africa about 100,000 to 70,000 years ago, and split into Asians and Caucasians around 40,000 years ago. Their modern appearances came later–white skin, light hair, and light eyes, for example, only evolved in the past 20,000, and possibly within the past 10,000, years.

What about the Africans, or specifically, Sub-Saharans? (North Africans, like Tunisians and Moroccans, are in the Caucasian clade.) When did their phenotypes evolve?

The Sahara, an enormous desert about the size of the United States, is one of the world’s biggest, most ancient barriers to human travel. The genetic split between SSAs and non-SSAs, therefore, is one of the oldest and most substantial among human populations. But there are even older splits within Africa–some of the ancestors of today’s Pygmies and Bushmen may have split off from other Africans 200,000-300,000 years ago. We’re not sure, because the study of archaic African DNA is still in its infancy.

Some anthropologists refer to Bushmen as “gracile,” which means they are a little shorter than average Europeans and not stockily built

The Bushmen present an interesting case, because their skin is quite light (for Africans.) I prefer to call it golden. The nearby Damara of Namibia, by contrast, are one of the world’s darkest peoples. (The peoples of South Sudan, eg Malik Agar, may be darker, though.) The Pygmies are the world’s shortest peoples; the peoples of South Sudan, such as the Dinka and Shiluk, are among the world’s tallest.

Sub-Saharan Africa’s ethnic groups can be grouped, very broadly, into Bushmen, Pygmies, Bantus (aka Niger-Congo), Nilotics, and Afro-Asiatics. Bushmen and Pygmies are extremely small groups, while Bantus dominate the continent–about 85% of Sub Saharan Africans speak a language from the Niger-Congo family. The Afro-Asiatic groups, as their name implies, have had extensive contact with North Africa and the Middle East.

Most of America’s black population hails from West Africa–that is, the primarily Bantu region. The Bantus and similar-looking groups among the Nilotics and Afro-Asiatics (like the Hausa) therefore have both Africa’s most iconic and most common phenotypes.

For the sake of this post, we are not interested in the evolution of traits common to all humans, such as bipedalism. We are only interested in those traits generally shared by most Sub-Saharans and generally not shared by people outside of Africa.

detailed map of African and Middle Eastern ethnicities in Haaks et al’s dataset

One striking trait is black hair: it is distinctively “curly” or “frizzy.” Chimps and gorillas do not have curly hair. Neither do whites and Asians. (Whites and Asians, therefore, more closely resemble chimps in this regard.) Only Africans and a smattering of other equatorial peoples like Melanesians have frizzy hair.

Black skin is similarly distinct. Chimps, who live in the shaded forest and have fur, do not have high levels of melanin all over their bodies. While chimps naturally vary in skin tone, an unfortunate, hairless chimp is practically “white.”

Humans therefore probably evolved both black skin and frizzy hair at about the same time–when we came out of the shady forests and began running around on the much sunnier savannahs. Frizzy hair seems well-adapted to cooling–by standing on end, it lets air flow between the follicles–and of course melanin is protective from the sun’s rays. (And apparently, many of the lighter-skinned Bushmen suffer from skin cancer.)

Steatopygia also comes to mind, though I don’t know if anyone has studied its origins.

According to Wikipedia, additional traits common to Sub-Saharan Africans include:

In modern craniofacial anthropometry, Negroid describes features that typify skulls of black people. These include a broad and round nasal cavity; no dam or nasal sill; Quonset hut-shaped nasal bones; notable facial projection in the jaw and mouth area (prognathism); a rectangular-shaped palate; a square or rectangular eye orbit shape;[21] a large interorbital distance; a more undulating supraorbital ridge;[22] and large, megadontic teeth.[23] …

Modern cross-analysis of osteological variables and genome-wide SNPs has identified specific genes, which control this craniofacial development. Of these genes, DCHS2, RUNX2, GLI3, PAX1 and PAX3 were found to determine nasal morphology, whereas EDAR impacts chin protrusion.[27] …

Ashley Montagu lists “neotenous structural traits in which…Negroids [generally] differ from Caucasoids… flattish nose, flat root of the nose, narrower ears, narrower joints, frontal skull eminences, later closure of premaxillary sutures, less hairy, longer eyelashes, [and] cruciform pattern of second and third molars.”[28]

The Wikipedia page on Dark Skin states:

As hominids gradually lost their fur (between 4.5 and 2 million years ago) to allow for better cooling through sweating, their naked and lightly pigmented skin was exposed to sunlight. In the tropics, natural selection favoured dark-skinned human populations as high levels of skin pigmentation protected against the harmful effects of sunlight. Indigenous populations’ skin reflectance (the amount of sunlight the skin reflects) and the actual UV radiation in a particular geographic area is highly correlated, which supports this idea. Genetic evidence also supports this notion, demonstrating that around 1.2 million years ago there was a strong evolutionary pressure which acted on the development of dark skin pigmentation in early members of the genus Homo.[25]

About 7 million years ago human and chimpanzee lineages diverged, and between 4.5 and 2 million years ago early humans moved out of rainforests to the savannas of East Africa.[23][28] They not only had to cope with more intense sunlight but had to develop a better cooling system. …

Skin colour is a polygenic trait, which means that several different genes are involved in determining a specific phenotype. …

Data collected from studies on the MC1R gene has shown that there is a lack of diversity in dark-skinned African samples in the allele of the gene compared to non-African populations. This is remarkable given that the number of polymorphisms for almost all genes in the human gene pool is greater in African samples than in any other geographic region. So, while the MC1R gene does not significantly contribute to variation in skin colour around the world, the allele found in high levels in African populations probably protects against UV radiation and was probably important in the evolution of dark skin.[57][58]

Skin colour seems to vary mostly due to variations in a number of genes of large effect as well as several other genes of small effect (TYR, TYRP1, OCA2, SLC45A2, SLC24A5, MC1R, KITLG and SLC24A4). This does not take into account the effects of epistasis, which would probably increase the number of related genes.[59] Variations in the SLC24A5 gene account for 20–25% of the variation between dark and light skinned populations of Africa,[60] and appear to have arisen as recently as within the last 10,000 years.[61] The Ala111Thr or rs1426654 polymorphism in the coding region of the SLC24A5 gene reaches fixation in Europe, and is also common among populations in North Africa, the Horn of Africa, West Asia, Central Asia and South Asia.[62][63][64]

That’s rather interesting about MC1R. It could imply that the difference in skin tone between SSAs and non-SSAs is due to active selection in Blacks for dark skin and relaxed selection in non-Blacks, rather than active selection for light skin in non-Blacks.

The page on MC1R states:

MC1R is one of the key proteins involved in regulating mammalian skin and hair color. …It works by controlling the type of melanin being produced, and its activation causes the melanocyte to switch from generating the yellow or red phaeomelanin by default to the brown or black eumelanin in replacement. …

This is consistent with active selection being necessary to produce dark skin, and relaxed selection producing lighter tones.

Studies show the MC1R Arg163Gln allele has a high frequency in East Asia and may be part of the evolution of light skin in East Asian populations.[40] No evidence is known for positive selection of MC1R alleles in Europe[41] and there is no evidence of an association between MC1R and the evolution of light skin in European populations.[42] The lightening of skin color in Europeans and East Asians is an example of convergent evolution.

However, we should also note:

Dark-skinned people living in low sunlight environments have been recorded to be very susceptible to vitamin D deficiency due to reduced vitamin D synthesis. A dark-skinned person requires about six times as much UVB as lightly pigmented persons.

PCA graph and map of sampling locations. Modern people are indicated with gray circles.

Unfortunately, most of the work on human skin tones has been done among Europeans (and, oddly, zebra fish,) limiting our knowledge about the evolution of African skin tones, which is why this post has been sitting in my draft file for months. Luckily, though, two recent studies–Loci Associated with Skin Pigmentation Identified in African Populations and Reconstructing Prehistoric African Population Structure–have shed new light on African evolution.

In Reconstructing Prehistoric African Population Structure, Skoglund et al assembled genetic data from 16 prehistoric Africans and compared them to DNA from nearby present-day Africans. They found:

  1. The ancestors of the Bushmen (aka the San/KhoiSan) once occupied a much wider area.
  2. They contributed about two-thirds of the ancestry of ancient Malawi hunter-gatherers (around 8,100-2,500 YA).
  3. They contributed about one-third of the ancestry of ancient Tanzanian hunter-gatherers (around 1,400 YA).
  4. Farmers (Bantus) spread from west Africa, completely replacing hunter-gatherers in some areas.
  5. Modern Malawians are almost entirely Bantu.
  6. A Tanzanian pastoralist population from 3,100 YA spread out across east Africa and into southern Africa.
  7. Bushmen ancestry was not found in modern Hadza, even though they are hunter-gatherers and speak a click language like the Bushmen.
  8. The Hadza more likely derive most of their ancestry from ancient Ethiopians.
  9. Modern Bantu-speakers in Kenya derive from a mix between western Africans and Nilotics around 800-400 years ago.
  10. Middle Eastern (Levant) ancestry is found across eastern Africa from an admixture event that occurred around 3,000 YA, or around the same time as the Bronze Age Collapse.
  11. A small amount of Iranian DNA arrived more recently in the Horn of Africa.
  12. Ancient Bushmen were more closely related to modern eastern Africans like the Dinka (Nilotics) and Hadza than to modern west Africans (Bantus).
  13. This suggests either complex relationships between the groups or that some Bantus may have had ancestors from an unknown group of humans more ancient than the Bushmen.
  14. Modern Bushmen have been evolving darker skins.
  15. Pygmies have been evolving shorter stature.
Automated clustering of ancient and modern populations (moderns in gray)

I missed #12-13 on my previous post about this paper, though I did note that the more data we get on ancient African groups, the more likely I think we are to find ancient admixture events. If humans can mix with Neanderthals and Denisovans, then surely our ancestors could have mixed with Ergaster, Erectus, or whomever else was wandering around.

Distribution of ancient Bushmen and Ethiopian DNA in south and east Africa

#14 is interesting, and consistent with the claim that Bushmen suffer from a lot of skin cancer–before the Bantu expansion, they lived in far more forgiving climates than the Kalahari desert. But since Bushmen are already lighter than their neighbors, this raises the question of how light their ancestors–who had no Levantine admixture–were. Could the Bantus’ and Nilotics’ darker skins have evolved after the Bushmen/everyone else split?

Meanwhile, in Loci Associated with Skin Pigmentation Identified in African Populations, Crawford et al used genetic samples from 1,570 people from across Africa to find six genetic areas–SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2–which account for almost 30% of the local variation in skin color.
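To make “accounts for almost 30% of the variation” concrete, here is a toy additive model: pigmentation is simulated as the sum of allele counts at six loci plus environmental noise, and the variance explained is just the genetic variance over the total. The allele frequencies and effect sizes are invented; only the ~30% target mirrors the paper’s figure:

```python
# Toy additive model of "variance explained" by a handful of loci.
# Allele frequencies and effect sizes are invented for illustration;
# the noise level is tuned so the loci explain roughly 30% of the
# phenotypic variance, mirroring the Crawford et al. figure.
import random

random.seed(1)
n = 20_000
freqs   = [0.30, 0.20, 0.40, 0.25, 0.35, 0.15]  # hypothetical frequencies
effects = [0.30, 0.25, 0.20, 0.20, 0.15, 0.15]  # hypothetical effect sizes

genetic, phenotype = [], []
for _ in range(n):
    # Two independent allele draws per locus (diploid), weighted by effect
    g = sum(e * (random.random() < f) + e * (random.random() < f)
            for f, e in zip(freqs, effects))
    genetic.append(g)
    phenotype.append(g + random.gauss(0, 0.5))  # + environmental noise

def variance(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

# Genetic and environmental terms are independent, so the ratio of
# variances is the fraction of phenotypic variance the loci explain.
r2 = variance(genetic) / variance(phenotype)
print(f"variance explained by the six loci: {r2:.0%}")
```

The remaining ~70% in the real data is some mix of undiscovered loci, small-effect variants, and environment–which is why “six loci explain 30%” is a remarkably large number for a complex trait.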

Bantu (green) and Levantine/pastoralist DNA in modern peoples

SLC24A5 is a light pigment introduced to east Africa from the Levant, probably around 3,000 years ago. Today, it is common in Ethiopia and Tanzania.

Interestingly, according to the article, “At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations.”

These are the world’s other darkest peoples, such as the Jarawas of the Andaman Islands or the Melanesians of Bougainville, PNG. (And, I assume, some groups from India such as the Tamils.) This implies that these groups 1. had dark skin already when they left Africa, and 2. never lost it on their way to their current homes. (If they had gotten lighter during their journey and then darkened again upon arrival, they would likely have different skin color variants than their African cousins.)

This implies that even if the Bushmen split off (around 200,000-300,000 YA) before dark skin evolved, it had evolved by the time people left Africa and headed toward Australia (around 100,000-70,000 YA.) This gives us a minimum threshold: it most likely evolved before 70,000 YA.

(But as always, we should be careful, because perhaps there are even more skin color variants that we don’t know about yet in these populations.)

MFSD12 is common among Nilotics and is related to darker skin.

And according to the abstract, which Razib Khan posted:

Further, the alleles associated with skin pigmentation at all loci but SLC24A5 are ancient, predating the origin of modern humans. The ancestral alleles at the majority of predicted causal SNPs are associated with light skin, raising the possibility that the ancestors of modern humans could have had relatively light skin color, as is observed in the San population today.

The full article is not out yet, so I still don’t know when all of these light and dark alleles emerged, but the order is absolutely intriguing. For now, it looks like this mystery will still have to wait.