If epigenetics does introduce scientific novelties to the conventional understanding of biology, then according to the model it also has equally significant ethical and political implications.
What responsibility do I–as an egg-bearing person–have to ensure the health of my children and grandchildren’s epigenomes? Society affirms my right to smoke cigarettes, even though they may give me cancer down the road–it’s my body and I am allowed to do what I wish with it. But what if my smoking cigarettes today causes cancer in a future, as yet unborn grandchild whom I never meet? What about her right to choose not to be exposed to carcinogens? Who am I to take that from her–and what right has society, the government, or anyone else to tell me what I may or may not do with my own body in the interests of some future people who may never come into existence?
I am summarizing, perhaps badly; you may read the whole post over on Dr. Robison’s blog. (Of course Robison is himself trying to summarize an argument I am sure he lays out in much more detail in his book.)
Here is my hastily written response, in the interest of clear conversational threading:
I’m not sure epigenetics constitutes such a fundamental shift in our understandings of genetics and inheritance as to actually warrant much change in our present policies. For example, you question whether policies should be enacted to restrict a 12 yr old girl’s right to eat what she wishes in defense of her unborn grandchild’s epigenome, but we today don’t even restrict a pregnant woman’s right to drink or smoke. Cocaine is illegal, but last time I checked, women didn’t go to prison for giving birth to crack babies. For that matter, women are allowed to kill unborn babies. I’m not commenting for or against abortion, just noting that it is legal and most people consider death kind of a big deal. So I don’t think society is about to start outlawing stuff because of its negative effects two generations down the road.
On the other hand, if you look at the data on smoking, rates have definitely been falling ever since the tobacco-cancer link became news. The gov’t didn’t have to outlaw smoking for a lot of women to stop smoking for their children’s health.
But let’s return to the philosophical argument. All men are created equal… or are they? I do not think the Founding Fathers ever meant equality in a genetic sense. They could see with their own eyes that some men were tall and others short, some wise and others foolish, some virtuous and others criminal. They could see that sons and daughters took after their parents and that a great many people started life in horribly unfair circumstances while others lived in luxury. They could see the cruel unfairness of disease, disability, and early death. Their rejection was not of biological or factual inequalities but of spiritual inequality. They rejected the notion that some men are created special by God to rule over others, and some men are created inferior by God, to be ruled over.
You state, “However, the evidence emerging from epigenetics suggests this is not the case. Instead of individuals of each generation being born with a pristine copy of their biological essence, they are inheriting a genetic endowment riddled with markers of the experiences of their parents and grandparents and great-grandparents, and so on. And these inherited epigenetic markers, as more and more research is showing, are having direct effects on the physical and mental health of individuals from causes not actually experienced by these individuals.”
I think there is a mistake here in regarding genetics as “pristine” in some form. What if my mother is an anxious person, and I, through environmental exposure, grow into a similarly anxious person? What if my mother has a gene for anxiety, and I inherit it? What if I possess a de novo genetic mutation that causes me to be anxious? And what if I suffer a genetic deletion in one of my chromosomes that causes anxiety? How is any of this different, functionally, from some trauma my mother suffered (say, a car accident) causing epigenetic changes that are subsequently passed on to me?
What is pristine about Down’s Syndrome, Williams’, or Klinefelter’s? Or just having the random bad luck to get genes for short, dumb, and ugly?
“For example, research in epigenetics shows that the choices and experiences of individuals in one generation are conditioning the basic nature of individuals of subsequent generations, which indelibly affects how those new individuals will exercise their own rights. ”
It can’t be indelible. For starters, you only inherit half of each parent’s genome–thus half their epigenome. So right there’s a 50% chance you won’t inherit any particular epigenetic marker. By gen two we’re talking 25% chance, and that’s not counting the constant re-writing of our epigenomes. However, I don’t think the policy implications for countries are all that different from our current thinking. We can say, for example, “If we have X level of pollution in the water, then Y number of people will get cancer,” and it’s a public health problem even if we don’t know “they’ll get cancer because of epigenetics.”
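To put rough numbers on that dilution, here is a toy calculation of my own (assuming an independent 50% transmission chance at each meiosis, and ignoring the constant re-writing of the epigenome, which would shrink the odds even further):

```python
# Toy model: the chance that one specific epigenetic marker on one
# parental chromosome survives n generations of meiosis, assuming an
# independent 50% transmission chance per generation and NO epigenetic
# re-writing (which would lower the odds even further).

def marker_survival_probability(generations: int) -> float:
    return 0.5 ** generations

for n in range(1, 4):
    print(f"generation {n}: {marker_survival_probability(n):.3f}")
# generation 1: 0.500
# generation 2: 0.250
# generation 3: 0.125
```

Under these (simplified) assumptions, any particular marker is more likely gone than present after a single generation, and vanishingly unlikely to persist for the century-plus spans sometimes claimed.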
So let’s broaden the inquiry a bit. Not how does epigenetics impact classical liberalism (which is behind us, anyway,) but how do genetics, epigenetics, heritability, and the rest influence our modern sensibilities? Modern liberalism is built almost as a reaction against former racialist notions of “blood”, with a consequent belief that people are, on average, about genetically equal. This butts up against the realization that some people are gifted and talented from birth, which many people quietly rationalize away while knowing they are being a bit dishonest, perhaps on the grounds that this is tantamount to statistical noise.
But the whole notion of “meritocracy” becomes more problematic if we admit that there’s a large genetic (or accidental, or environmental, or anything outside of free will,) contribution to IQ, educational attainment, mental illness, your chances of getting a good job, how other people treat you (because of attractiveness,) etc. Should a person who is dumb through no fault of their own suffer poverty? Should an ugly person be denied a job or a date? There’s an essential unfairness to it, after all.
But by the same token, what are you going to do about it? Declare that everyone under a certain IQ gets free money? What sort of incentives does that set up for society? And what does it do to someone’s self-image if they are Officially Declared Stupid?
But this is all focused on the negative. What if we find ways to make people smarter, healthier, stronger? I think we’d take them. Sure, we’d have a few hold-outs who worry about “playing god,” (much as today we have people who worry about vaccines despite the massive health improvements public vaccination campaigns have caused.) But in the end we’d take them. Similarly, in the end, I think most people would try to avoid damaging their descendants’ epigenomes–even if not through direct public policy.
Addendum: while I am skeptical of most claims about epigenetics, eg, people claiming that epigenetic trauma can be transmitted for over a century, there do seem to be some things that cause what we can here characterize as multi-generational epigenetic effects. For example, the drug diethylstilbestrol (DES), given to pregnant women to prevent miscarriages back in the 70s, not only causes cancer in the women it was given to, but also in their daughters. (It also results in intersex disorders in male fetuses.) In the third generation (that is, the sons and daughters of the fetuses exposed to the DES their mothers took during pregnancy,) there are still effects, like an increased risk of irregular periods. This is not necessarily “epigenetic” but similar enough to include in the conversation.
The smartest non-human primates, like Kanzi the bonobo and Koko the gorilla, understand about 2,000 to 4,000 words. Koko can make about 1,000 signs in sign language and Kanzi can use about 450 lexigrams (pictures that stand for words.) Koko can also make some onomatopoetic words–that is, she can make and use imitative sounds in conversation.
A four-year-old human knows about 4,000 words, similar to an exceptional gorilla. An adult knows about 20,000-35,000 words. (Another study puts the upper bound at 42,000.)
Somewhere along our journey from ape-like hominins to homo sapiens sapiens, our ancestors began talking, but exactly when remains a mystery. The origins of writing have been amusingly easy to discover, because early writers were fond of very durable surfaces, like clay, stone, and bone. Speech, by contrast, evaporates as soon as it is heard–leaving no trace for archaeologists to uncover.
But we can find the things necessary for speech and the things for which speech, in turn, is necessary.
The main reason why chimps and gorillas, even those taught human language, must rely on lexigrams or gestures to communicate is that their voiceboxes, lungs, and throats work differently than ours. Their semi-arboreal lifestyle requires using the ribs as a rigid base for the arm and shoulder muscles while climbing, which in turn requires closing the lungs while climbing to provide support for the ribs.
Full bipedalism released our early ancestors from the constraints on airway design imposed by climbing, freeing us to make a wider variety of vocalizations.
Now is the perfect time to break out my file of relevant human evolution illustrations:
We humans split from our nearest living ape relatives about 7-8 million years ago, but true bipedalism may not have evolved for a few more million years. Since there are many different named hominins, here is a quick guide:
Australopithecines (light blue in the graph,) such as the famous Lucy, are believed to have been the first fully bipedal hominins, although, based on the shape of their toes, they may have still occasionally retreated into the trees. They lived between 4 and 2 million years ago.
Without delving into the myriad classification debates along the lines of “should we count this set of skulls as a separate species or are they all part of the natural variation within one species,” by the time the homo genus arises with H Habilis or H. Rudolfensis around 2.8 million years ago, humans were much worse at climbing trees.
Interestingly, one direction humans have continued evolving in is up.
The reliable production of stone tools represents an enormous leap forward in human cognition. The first known stone tools–Oldowan–are about 2.5-2.6 million years old and were probably made by homo Habilis. These simple tools are typically shaped on only one side.
By the Acheulean–1.75 million-100,000 years ago–tool making had become much more sophisticated. Not only did knappers shape both sides of both the tops and bottoms of stones, but they also made tools by first shaping a core stone and then flaking derivative pieces from it.
The first Acheulean tools were fashioned by h Erectus; by 100,000 years ago, h Sapiens had presumably taken over the technology.
Flint knapping is surprisingly difficult, as many an archaeology student has discovered.
These technological advances were accompanied by steadily increasing brain sizes.
I propose that the complexities of the Acheulean tool complex required some form of language to facilitate learning and teaching; this gives us a potential lower bound on language around 1.75 million years ago. Bipedalism gives us an upper bound around 4 million years ago, before which our voice boxes were likely more restricted in the sounds they could make.
A Different View
Even though “homo Sapiens” has been around for about 300,000 years (or at least, that is where we have chosen to draw the line between our species and the previous one,) “behavioral modernity” only emerged around 50,000 years ago (very awkward timing if you know anything about human dispersal.)
Everything about behavioral modernity is heavily contested (including when it began,) but no matter how and when you date it, compared to the million years or so it took humans to figure out how to knap the back side of a rock, human technological advance has accelerated significantly over the past 100,000 years, and even more so over the past 50,000 and the past 10,000.
Fire was another of humanity’s early technologies:
Claims for the earliest definitive evidence of control of fire by a member of Homo range from 1.7 to 0.2 million years ago (Mya). Evidence for the controlled use of fire by Homo erectus, beginning some 600,000 years ago, has wide scholarly support. Flint blades burned in fires roughly 300,000 years ago were found near fossils of early but not entirely modern Homo sapiens in Morocco. Evidence of widespread control of fire by anatomically modern humans dates to approximately 125,000 years ago.
What prompted this sudden acceleration? Noam Chomsky suggests that it was triggered by the evolution of our ability to use and understand language:
Noam Chomsky, a prominent proponent of discontinuity theory, argues that a single chance mutation occurred in one individual in the order of 100,000 years ago, installing the language faculty (a component of the mind–brain) in “perfect” or “near-perfect” form.
More specifically, we might say that this single chance mutation created the capacity for figurative or symbolic language, as clearly apes already have the capacity for very simple language. It was this ability to convey abstract ideas, then, that allowed humans to begin expressing themselves in other abstract ways, like cave painting.
I disagree with this view on the grounds that human groups were already pretty widely dispersed by 100,000 years ago. For example, Pygmies and Bushmen are descended from groups of humans who had already split off from the rest of us by then, but they still have symbolic language, art, and everything else contained in the behavioral modernity toolkit. Of course, if a trait is particularly useful or otherwise successful, it can spread extremely quickly (think lactose tolerance,) and neither Bushmen nor Pygmies were 100% genetically isolated for the past 250,000 years, but I simply think the math here doesn’t work out.
However, that doesn’t mean Chomsky isn’t on to something. For example, Johanna Nichols (another linguist,) used statistical models of language differentiation to argue that modern languages split around 100,000 years ago. This coincides neatly with the upper bound on the Out of Africa theory, suggesting that Nichols may actually have found the point when language began differentiating because humans left Africa, or perhaps she found the origin of the linguistic skills necessary to accomplish humanity’s cross-continental trek.
In normal adults these two portions of the SVT form a right angle to one another and are approximately equal in length—in a 1:1 proportion. Movements of the tongue within this space, at its midpoint, are capable of producing tenfold changes in the diameter of the SVT. These tongue maneuvers produce the abrupt diameter changes needed to produce the formant frequencies of the vowels found most frequently among the world’s languages—the “quantal” vowels [i], [u], and [a] of the words “see,” “do,” and “ma.” In contrast, the vocal tracts of other living primates are physiologically incapable of producing such vowels.
(Since juvenile humans are shaped differently than adults, they pronounce sounds slightly differently until their voiceboxes fully develop.)
…Neanderthal necks were too short and their faces too long to have accommodated equally proportioned SVTs. Although we could not reconstruct the shape of the SVT in the Homo erectus fossil because it does not preserve any cervical vertebrae, it is clear that its face (and underlying horizontal SVT) would have been too long for a 1:1 SVT to fit into its head and neck. Likewise, in order to fit a 1:1 SVT into the reconstructed Neanderthal anatomy, the larynx would have had to be positioned in the Neanderthal’s thorax, behind the sternum and clavicles, much too low for effective swallowing. …
Surprisingly, our reconstruction of the 100,000-year-old specimen from Israel, which is anatomically modern in most respects, also would not have been able to accommodate a SVT with a 1:1 ratio, albeit for a different reason. … Again, like its Neanderthal relatives, this early modern human probably had an SVT with a horizontal dimension longer than its vertical one, translating into an inability to reproduce the full range of today’s human speech.
It was only in our reconstruction of the most recent fossil specimens—the modern humans postdating 50,000 years— that we identified an anatomy that could have accommodated a fully modern, equally proportioned vocal tract.
Just as small children who can’t yet pronounce the letter “r” can nevertheless make and understand language, I don’t think early humans needed to have all of the same sounds as we have in order to communicate with each other. They would have just used fewer sounds.
The change in our voiceboxes may not have triggered the evolution of language, but been triggered by language itself. As humans began transmitting more knowledge via language, humans who could make more sounds and utter a greater range of words perhaps had an edge over their peers–maybe they were seen as particularly clever, or perhaps they had an easier time organizing bands of hunters and warriors.
One of the interesting things about human language is that it is clearly simultaneously cultural–which language you speak is entirely determined by culture–and genetic–only humans can produce language in the way we do. Even the smartest chimps and dolphins cannot match our vocabularies, nor imitate our sounds. Human infants–unless they have some form of brain damage–learn language instinctually, without conscious teaching. (Insert reference to Steven Pinker.)
Some kind of genetic changes were obviously necessary to get from apes to human language use, but exactly what remains unclear.
A variety of genes are associated with language use, eg FOXP2. H Sapiens and chimps have different versions of the FOXP2 gene, (and Neanderthals have a third, but more similar to the H Sapiens version than the chimp,) but to my knowledge we have yet to discover exactly when the necessary mutations arose.
Despite their impressive skulls and survival in a harsh, novel climate, Neanderthals seem not to have engaged in much symbolic activity, (though to be fair, they were wiped out right about the time Sapiens really got going with its symbolic activity.) Homo Sapiens and Homo Neanderthalensis split around 800,000-400,000 years ago–perhaps the difference in our language genes ultimately gave Sapiens the upper hand.
Just as farming appears to have emerged relatively independently in several different locations around the world at about the same time, so behavioral modernity seems to have taken off in several different groups around the same time. Of course we can’t rule out the possibility that these groups had some form of contact with each other–peaceful or otherwise–but it seems more likely to me that similar behaviors emerged in disparate groups around the same time because the cognitive precursors necessary for those behaviors had already begun before they split.
Based on genetics, the shape of their larynges, and their cultural toolkits, Neanderthals probably did not have modern speech, but they may have had something similar to it. This suggests that at the time of the Sapiens-Neanderthal split, our common ancestor possessed some primitive speech capacity.
By the time Sapiens and Neanderthals encountered each other again, nearly half a million years later, Sapiens’ language ability had advanced, possibly due to further modification of FOXP2 and other genes like it, plus our newly modified voiceboxes, while Neanderthals’ had lagged. Sapiens achieved behavioral modernity and took over the planet, while Neanderthals disappeared.
When my kids don’t want to do their work (typically word problems in math,) they start coming up with all kinds of crazy scenarios to try to evade the question. “What if Susan cloned herself?” “What if Joe is actually the one driving the car, and he only saw the car pass by because he was looking at himself in a mirror?” “What if John used a wormhole to travel backwards in time and so all of the people at the table were actually Joe and so I only need to divide by one?” “What if Susan is actually a boy but her parents accidentally gave him the wrong name?” “What if ALIENS?”
After banging my head on the wall, I started asking, “Which is more likely: Sally and Susan are two different people, or Sally cloned herself, something no human has ever done before in the 300,000 years of homo Sapiens’ existence?” And sometimes they will, grudgingly, admit that their scenarios are slightly less likely than the assumptions the book is making.*
I forgive my kids, because they’re children. When adults do the same thing, I am much less sympathetic.
Folks on all sides of the political spectrum are probably guilty of this, but my inclinations/bubble lead me to encounter certain ones more often. Sex/gender is a huge one (even I have been led astray by sophistry on this subject, for which I apologize.)
Over in biology, sex is simply defined: Females produce large gametes. Males produce small gametes. It doesn’t matter how gametes are produced. It doesn’t matter what determines male or femaleness. All that matters is gamete size. There is no such thing (at least in humans) as a sex “spectrum”: reproduction requires one small gamete and one large gamete. Medium-sized gametes are not part of the process.
About 99.9% of people fit into the biological categories of “male” and “female.” An extremely small minority (<1%) have rare biological issues that interfere with gamete formation–people with Klinefelter’s, for example, are genetically XXY instead of XX or XY. People with Klinefelter’s are also infertile–unlike large gametes and small gametes, XXY isn’t part of a biological reproduction strategy. Like trisomy 21, it’s just an unfortunate accident in cell division.
In a mysterious twist, the vast majority of people have a “gender” identity that matches their biological sex. Even female athletes–women who excel at a stereotypically masculine pursuit–tend to identify as “women,” not men. Even male fashion designers tend to self-identify as men. There are a few people who identify as transgender, but in my personal experience, most of them are actually intersex in some way (eg, a woman who has autism, a condition characterized as “extreme male brain,” may legitimately feel like she thinks more like a guy than a girl.) Again, this is an extremely small percent of the population. For 99% of people you meet, normal gender assumptions apply.
So jumping into a conversation about “men” and “women” with “Well actually, ‘men’ and ‘women’ are just social constructs and gender is actually a spectrum and there are many different valid gender expressions–” is a great big NO.
Jumping into a discussion of women’s issues (like childbirth) with “Actually, men can give birth, too,” or the Women’s March with “Pussyhats are transphobic because some women have penises; vaginas don’t define what it means to be female,” is an even bigger NO, and I’m not even a fan of pussyhats.
Only biological females can give birth. That’s how the species works. When it comes to biology, leave things that you admit aren’t biology at the door. If a transgender man with a uterus gives birth to a child, he is still a biological female and we don’t need to confuse things by implying that someone gestated a fetus in his testicles. Over the millennia that humans have existed, a handful of people with some form of biological chimerism (basically, an internalized conjoined twin who never fully developed but ended up contributing an organ or two) who thought of themselves as male may have nonetheless given birth. These cases are so rare that you will probably never meet someone with them in your entire life.
Having lost a leg due to an accident (or 4 legs, due to being a pair of conjoined twins,) does not make “number of legs in humans” a spectrum ranging from 0-4. Humans have 2 legs; a few people have unfortunate accidents. Saying so doesn’t imply that people with 0 legs are somehow less human. They just had an accident.
In a conversation I read recently, Person A asserted that if two blue-eyed parents had a brown-eyed baby, the mother would be suspected of infidelity. A whole bunch of people immediately jumped on Person A, claiming he was scientifically ignorant and hadn’t paid attention in school–sadly, these overconfident people are actually the ones who don’t understand genetics, because blue eyes are recessive and thus two blue-eyed people can’t make a brown-eyed biological child. A few people, however, asserted that Person A was scientifically illiterate because there is an extremely rare brown-eyed gene that two blue-eyed people can carry, resulting in a brown-eyed child.
But this is not scientific illiteracy. The recessive brown-eyed gene is extremely rare, and both parents would have to have it. Infidelity, by contrast, is much more common. It’s not that common, but it’s more common than two parents both having recessive brown-eyed genes. Insisting that Person A is scientifically illiterate because of an extremely rare exception to the rule is ignoring statistics–statistically, the child is more likely to be not biological than to have an extremely rare variant. Statistically, men and women are far more likely to match in gender and sex than to not.
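The statistical point can be made concrete with a back-of-the-envelope comparison. Every rate below is a hypothetical placeholder I have made up for illustration, not a measured frequency:

```python
# Back-of-the-envelope comparison of the two explanations for a
# brown-eyed child of two blue-eyed parents. All rates here are
# hypothetical placeholders, chosen only to illustrate the reasoning.

P_CARRIER = 0.001     # assumed chance a blue-eyed person carries the
                      # rare brown-eye variant
P_EXPRESSED = 0.25    # assumed chance the child inherits and expresses
                      # it, given both parents are carriers
P_INFIDELITY = 0.02   # assumed rate of misattributed paternity

# Rare-variant explanation: both parents carry it AND the child
# inherits/expresses it.
p_rare_variant = P_CARRIER * P_CARRIER * P_EXPRESSED

print(f"rare-variant explanation: {p_rare_variant:.2e}")
print(f"infidelity explanation:   {P_INFIDELITY:.2e}")
print(f"ratio: {P_INFIDELITY / p_rare_variant:,.0f}x")
```

Even if you move these made-up numbers around by an order of magnitude in either direction, the conjunction of two rare events stays far less likely than the single common one, which is all Person A’s reasoning requires.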
Let’s look at immigration, another topic near and dear to everyone’s hearts. After Trump’s comments about Haiti came out (and let’s be honest, Haiti’s capital, Port au Prince, is one of the world’s largest cities without a functioning sewer system, so “shithole” is actually true,) people began popping up with statements like “I’d rather a Ugandan immigrant who believes in American values than a socialist Norwegian.”
I, too, would rather a Ugandan with American values than a socialist Norwegian. However, what percentage of Ugandans actually have American values? Just a wild guess, but I suspect most Ugandans have Ugandan values. Most Ugandans probably think Ugandan culture is pretty nice and that Ugandan norms and values are the right ones to have; otherwise, they’d hold different values, and we’d call those Ugandan values instead.
While we’re at it, I suspect most Chinese people have Chinese values, most Australians have Australian values, most Brazilians hold Brazilian values, and most people from Vatican City have Catholic values.
I don’t support blindly taking people from any country, because some people are violent criminals just trying to escape conviction. But some countries are clearly closer to each other, culturally, than others, and thus have a larger pool of people who hold each other’s values.
(Even when people hold very different values, some values conflict more than others.)
To be clear: I’ve been picking on one side, but I’m sure both sides do this.
What’s the point? None of this is very complicated. Most people can figure out if a person they have just met is male or female instantly and without fail. It takes a very smart person to get confused by a few extremely rare exceptions into thinking that the broad categories don’t functionally exist.
Sometimes this obfuscation is compulsive–the person just wants to show how smart they are, or maybe everyone around them is saying it so they start repeating it–but since most people seem capable of understanding probabilities in everyday life (“Sometimes the stoplight is glitched but usually it isn’t, so I’ll assume the stoplight is functioning properly and obey it,”) if someone suddenly seems incapable of distinguishing between extremely rare and extremely common events in the political realm, then they are doing so on purpose or suffering severe cognitive dissonance.
*Oddly, I solved the problem by giving the kids harder problems. It appears that when their brains are actively engaged with trying to solve the problem, they don’t have time/energy left to come up with alternatives. When the material is too easy (or, perhaps, way too hard) they start trying to get creative to make things more interesting.
North Africa is an often misunderstood region in human genetics. Since it is in Africa, people often assume that it contains the same variety of people referenced in terms like “African Americans,” “black Africans,” or even just “Africans.” In reality, the African continent contains members of all three of the great human clades–Sub-Saharan Africans in the south, Austronesians (Asian clade) in Madagascar, and Caucasians in the north.
Throughout most of human history, the Sahara–not the Mediterranean or Red seas–has been the biggest local impediment to human migration–thus North Africans are much closer, genetically, to their neighbors in Europe and the Middle East than their neighbors across the desert (and before the domestication of the camel, about 3,000 years ago, the Sahara was even harder to cross.)
But from time to time, global weather patterns change and the Sahara becomes a garden: the Green Sahara. The last time we had a Green Sahara was about 9,000-7,000 years ago; during this time, people lived, hunted, fished, herded and perhaps farmed throughout areas that are today nearly uninhabited wastes.
In order to investigate the role of the last Green Sahara in the peopling of Africa, we deep-sequence the whole non-repetitive portion of the Y chromosome in 104 males selected as representative of haplogroups which are currently found to the north and to the south of the Sahara. … We find that the coalescence age of the trans-Saharan haplogroups dates back to the last Green Sahara, while most northern African or sub-Saharan clades expanded locally in the subsequent arid phase. …
Our findings suggest that the Green Sahara promoted human movements and demographic expansions, possibly linked to the adoption of pastoralism. Comparing our results with previously reported genome-wide data, we also find evidence for a sex-biased sub-Saharan contribution to northern Africans, suggesting that historical events such as the trans-Saharan slave trade mainly contributed to the mtDNA and autosomal gene pool, whereas the northern African paternal gene pool was mainly shaped by more ancient events.
In other words, modern North Africans have some maternal (female) Sub-Saharan DNA that arrived recently via the Islamic slave trade, but most of their Sub-Saharan Y-DNA (male) is much older, hailing from the last time the Sahara was easy to cross.
Note that not much DNA is shared across the Sahara:
After the African humid period, the climatic conditions became rapidly hyper-arid and the Green Sahara was replaced by the desert, which acted as a strong geographic barrier against human movements between northern and sub-Saharan Africa.
A consequence of this is that there is a strong differentiation in the Y chromosome haplogroup composition between the northern and sub-Saharan regions of the African continent. In the northern area, the predominant Y lineages are J-M267 and E-M81, with the former being linked to the Neolithic expansion in the Near East and the latter reaching frequencies as high as 80 % in some north-western populations as a consequence of a very recent local demographic expansion [8–10]. On the contrary, sub-Saharan Africa is characterised by a completely different genetic landscape, with lineages within E-M2 and haplogroup B comprising most of the Y chromosomes. In most regions of sub-Saharan Africa, the observed haplogroup distribution has been linked to the recent (~ 3 kya) demic diffusion of Bantu agriculturalists, which brought E-M2 sub-clades from central Africa to the East and to the South [11–17]. On the contrary, the sub-Saharan distribution of B-M150 seems to have more ancient origins, since its internal lineages are present in both Bantu farmers and non-Bantu hunter-gatherers and coalesce long before the Bantu expansion [18–20].
In spite of their genetic differentiation, however, northern and sub-Saharan Africa share at least four patrilineages at different frequencies, namely A3-M13, E-M2, E-M78 and R-V88.
Here, by using whole Y chromosome sequences, we intend to shed some light on the historical and demographic processes that modelled the genetic landscape of North Africa. Previous studies suggested that the strategic location of North Africa, separated from Europe by the Mediterranean Sea, from the rest of the African continent by the Sahara Desert and limited to the East by the Arabian Peninsula, has shaped the genetic complexity of current North Africans15,16,17. Early modern humans arrived in North Africa 190–140 kya (thousand years ago)18, and several cultures settled in the area before the Holocene. In fact, a previous study by Henn et al.19 identified a gradient of likely autochthonous North African ancestry, probably derived from an ancient “back-to-Africa” gene flow prior to the Holocene (12 kya). In historic times, North Africa has been populated successively by different groups, including Phoenicians, Romans, Vandals and Byzantines. The most important human settlement in North Africa was conducted by the Arabs by the end of the 7th century. Recent studies have demonstrated the complexity of human migrations in the area, resulting from an amalgam of ancestral components in North African groups15,20.
According to the article, E-M81 is dominant in Northwest Africa and absent almost everywhere else in the world.
The authors tested various men across north Africa in order to draw up a phylogenetic tree of the branching of E-M183:
The distribution of each subhaplogroup within E-M183 can be observed in Table 1 and Fig. 2. Indeed, different populations present different subhaplogroup compositions. For example, whereas in Morocco almost all subhaplogroups are present, Western Sahara shows a very homogeneous pattern with only E-SM001 and E-Z5009 being represented. A similar picture to that of Western Sahara is shown by the Reguibates from Algeria, which contrast sharply with the Algerians from Oran, which showed a high diversity of haplogroups. It is also worth to notice that a slightly different pattern could be appreciated in coastal populations when compared with more inland territories (Western Sahara, Algerian Reguibates).
Overall, the authors found that the haplotypes were “strikingly similar” to each other and showed little geographic structure besides the coastal/inland differences:
As proposed by Larmuseau et al.25, the scenario that better explains Y-STR haplotype similarity within a particular haplogroup is a recent and rapid radiation of subhaplogroups. Although the dating of this lineage has been controversial, with dates proposed ranging from Paleolithic to Neolithic and to more recent times17,22,28, our results suggested that the origin of E-M183 is much more recent than was previously thought. … In addition to the recent radiation suggested by the high haplotype resemblance, the pattern showed by E-M183 implies that subhaplogroups originated within a relatively short time period, in a burst similar to those happening in many Y-chromosome haplogroups23.
In other words, someone went a-conquering.
Alternatively, given the high frequency of E-M183 in the Maghreb, a local origin of E-M183 in NW Africa could be envisaged, which would fit the clear pattern of longitudinal isolation by distance reported in genome-wide studies15,20. Moreover, the presence of autochthonous North African E-M81 lineages in the indigenous population of the Canary Islands, strongly points to North Africa as the most probable origin of the Guanche ancestors29. This, together with the fact that the oldest indigenous individuals have been dated 2210 ± 60 ya, supports a local origin of E-M183 in NW Africa. Within this scenario, it is also worth to mention that the paternal lineage of an early Neolithic Moroccan individual appeared to be distantly related to the typically North African E-M81 haplogroup30, suggesting again a NW African origin of E-M183. A local origin of E-M183 in NW Africa > 2200 ya is supported by our TMRCA estimates, which can be taken as 2,000–3,000, depending on the data, methods, and mutation rates used.
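For readers curious where TMRCA figures like "2,000–3,000 years" come from, here is a toy calculation using the average squared distance (ASD) method on Y-STR repeat counts. Everything in it — the haplotypes, the assumed founder, the mutation rate, and the generation time — is invented for illustration, not data from the paper; it only sketches the general logic.

```python
# Toy TMRCA estimate via the ASD (average squared distance) method.
# Haplotypes are invented repeat counts at 5 Y-STR loci; the assumed
# ancestral haplotype, mutation rate, and generation time are all
# illustrative assumptions, not values from the study.

haplotypes = [
    [13, 24, 14, 11, 12],
    [13, 25, 14, 11, 12],
    [14, 24, 14, 10, 12],
    [13, 24, 15, 11, 13],
    [13, 24, 14, 12, 12],
]
ancestral = [13, 24, 14, 11, 12]   # assumed founder haplotype
mu = 2.0e-3                        # mutations per locus per generation (assumed)
gen_years = 30                     # years per generation (assumed)

# ASD: squared repeat-count difference from the founder, averaged
# over all loci and all sampled haplotypes.
n_loci = len(ancestral)
asd = sum((h[i] - ancestral[i]) ** 2
          for h in haplotypes for i in range(n_loci)) / (len(haplotypes) * n_loci)

# Under a simple stepwise-mutation model, ASD grows roughly as mu * t,
# so t (in generations) is approximately ASD / mu.
t_gen = asd / mu
print(f"ASD = {asd:.2f}; TMRCA ~ {t_gen:.0f} generations ~ {t_gen * gen_years:.0f} years")
```

With these made-up numbers the estimate lands around 3,600 years; real studies differ mainly in which mutation rate they plug in, which is exactly why the authors report a range.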
However, the authors also note that they can’t rule out a Middle Eastern origin for the haplogroup since their study simply doesn’t include genomes from Middle Eastern individuals. They rule out a spread during the Neolithic expansion (too early) but not the Islamic expansion (“an extensive, male-biased Near Eastern admixture event is registered ~1300 ya, coincidental with the Arab expansion20.”) Alternatively, they suggest E-M183 might have expanded near the end of the third Punic War. Sure, Carthage (in Tunisia) was defeated by the Romans, but the era was otherwise one of great North African wealth and prosperity.
Interesting papers! My hat’s off to the authors. I hope you enjoyed them and get a chance to RTWT.
A comic strip in the Guardian recently alerted me to the fact that many women are exhausted from the “Mental Load” of thinking about things and need their husbands to pitch in and help. Go ahead and read it.
Whew. There’s a lot to unpack here:
Yes, you have to talk to men. DO NOT EXPECT OTHER PEOPLE TO KNOW WHAT YOU ARE THINKING. Look, if I can get my husband to help me when I need it, you certainly can too. That or you married the wrong man.
Get a dayplanner and write things like grocery lists and doctor's appointments in it. There's probably one built into your phone.
There, I solved your problems.
That said, female anxiety (at least in our modern world) appears to be a real thing:
(though American Indians are the real untold story in this graph.)
Medco data shows that antidepressants are the most commonly used mental health medications and that women have the highest utilization rates. In 2010, 21 percent of women ages 20 and older were using an antidepressant. … Men’s use of antidepressants is almost half that of women, but has also been on the rise with a 28 percent increase over the past decade. …
Anxiety disorders are the most common psychiatric illnesses affecting children and adults. … Although anxiety disorders are highly treatable, only about one‐third of sufferers receive treatment. …
Medco data shows that women have the highest utilization rate of anti‐anxiety medications; in fact, 11 percent of middle‐aged women (ages 45‐64) were on an anti‐anxiety drug treatment in 2010, nearly twice the rate of their male counterparts (5.7 percent).
And based on the age group data, women in their prime working years (but waning childbearing years) have even higher rates of mental illness. (Adult women even take ADHD medicine at slightly higher rates than adult men.)
What causes this? Surely 20% of us–one in 5–can’t actually be mentally ill, can we? Is it biology or culture? Or perhaps a mismatch between biology and culture?
Or perhaps we should just scale back a little, and when we have friends over for dinner, just order a pizza instead of trying to cook two separate meals?
Divorce rates are far higher among “modern” couples who share the housework than in those where the woman does the lion’s share of the chores, a Norwegian study has found. …
Norway has a long tradition of gender equality and childrearing is shared equally between mothers and fathers in 70 per cent of cases. But when it comes to housework, women in Norway still account for most of it in seven out of 10 couples. The study emphasised women who did most of the chores did so of their own volition and were found to be as “happy” as those in “modern” couples. …
The researchers expected to find that where men shouldered more of the burden, women’s happiness levels were higher. In fact they found that it was the men who were happier while their wives and girlfriends appeared to be largely unmoved.
Those men who did more housework generally reported less work-life conflict and were scored slightly higher for wellbeing overall.
Theory: well-adjusted people who love each other are happy to do what it takes to keep the household running and don’t waste time passive-aggressively trying to convince their spouse that he’s a bad person for not reading her mind.
Now let’s talk about biology. The author claims,
Of course, there’s nothing genetic or innate about this behavior. We’re not born with an all-consuming passion for clearing tables, just like boys aren’t born with an utter disinterest in things lying around.
Of course, the author doesn’t cite any papers from the fields of genetics or behavioral psychology to back up her claims. Just as she feels entitled to insist that other people should read her mind (and absurdly thinks a good project manager at work doesn’t bother to tell the team what needs to be done), she feels no compulsion to offer any proof. Science says so. We know because some cartoonist on the internet claimed it did.
Over in reality-land, when we make scientific claims about things like genetics, we cite our sources. And women absolutely have an instinct for cleaning things: the Nesting Instinct. No, it isn’t present when we’re born. It kicks in when we’re pregnant–often shortly before going into labor. Here’s an actual scientific paper on the Nesting Instinct published in the scientific journal Evolution and Human Behavior:
In altricial mammals, “nesting” refers to a suite of primarily maternal behaviours including nest-site selection, nest building and nest defense, and the many ways that nonhuman animals prepare themselves for parturition are well studied. In contrast, little research has considered pre-parturient preparation behaviours in women from a functional perspective.
The overwhelming urge that drives many pregnant women to clean, organize and get life in order—otherwise known as nesting—is not irrational, but an adaptive behaviour stemming from humans’ evolutionary past.
Researchers from McMaster University suggest that these behaviours—characterized by unusual bursts of energy and a compulsion to organize the household—are a result of a mechanism to protect and prepare for the unborn baby.
Women also become more selective about the company they keep, preferring to spend time only with people they trust, say researchers.
In short, having control over the environment is a key feature of preparing for childbirth, including decisions about where the birth will take place and who will be welcome.
“Nesting is not a frivolous activity,” says Marla Anderson, lead author of the study and a graduate student in the Department of Psychology, Neuroscience & Behaviour. “We have found that it peaks in the third trimester as the birth of the baby draws near and is an important task that probably serves the same purpose in women as it does in other animals.”
Even Wikipedia cites a number of sources on the subject:
Nesting behaviour refers to an instinct or urge in pregnant animals caused by the increase of estradiol (E2) to prepare a home for the upcoming newborn(s). It is found in a variety of animals such as birds, fish, squirrels, mice and pigs as well as humans.
Nesting is pretty much impossible to miss if you’ve ever been pregnant or around pregnant women.
Of course, this doesn’t prove the instinct persists after pregnancy (though in my personal case it definitely did.)
By the way, estradiol is the primary form of estrogen, which is found in much higher levels in women than men. (Just to be rigorous, here’s data on estrogen levels in normal men and women.)
So if high estradiol levels make a variety of mammals–including humans–want to clean things, and women between puberty and menopause consistently have higher levels of estrogen than men, then it seems fairly likely that women actually do have, on average, a higher innate, biological, instinctual, even genetic urge to clean and organize their homes than men do.
But returning to the comic, the author claims:
But we’re born into a society in which very early on, we’re given dolls and miniature vacuum cleaners, and in which it seems shameful for boys to like those same toys.
What bollocks. I used to work at a toystore. Yes, we stocked toy vacuum cleaners and the like in a “Little Helpers” set. We never sold a single one, and I worked there over Christmas. (Great times.)
I am always on the lookout for toys my kids would enjoy and receive constant feedback on whether they like my choices. (“A book? Why did Santa bring me a book? Books are boring!”)
I don’t spend money getting more of stuff my kids aren’t interested in. A child who doesn’t like dolls isn’t going to get a bunch of dolls and be ordered to sit and play with them and nothing else. A child who doesn’t like trucks isn’t going to get a bunch of trucks.
Assuming that other parents are neither stupid (unable to tell which toys their children like) nor evil (forcing their children to play with specific toys even though they know they don’t like them), I conclude that children’s toys reflect the children’s actual preferences, not the parents’ (for goodness’s sake, if it were up to me, I’d socialize my children to be super-geniuses who spend all of their time reading textbooks and whose toys are all science and math manipulatives, not toy dump trucks!)
We compared the interactions of 34 rhesus monkeys, living within a 135 monkey troop, with human wheeled toys and plush toys. Male monkeys, like boys, showed consistent and strong preferences for wheeled toys, while female monkeys, like girls, showed greater variability in preferences. Thus, the magnitude of preference for wheeled over plush toys differed significantly between males and females. The similarities to human findings demonstrate that such preferences can develop without explicit gendered socialization.
Now new research suggests that such gender-driven desires are also seen in young female chimpanzees in the wild—a behavior that possibly evolved to make the animals better mothers, experts say.
Young females of the Kanyawara chimpanzee community in Kibale National Park, Uganda, use sticks as rudimentary dolls and care for them like the group’s mother chimps tend to their real offspring. The behavior, which was very rarely observed in males, has been witnessed more than a hundred times over 14 years of study.
Gonadal hormones, particularly androgens, direct certain aspects of brain development and exert permanent influences on sex-typical behavior in nonhuman mammals. Androgens also influence human behavioral development, with the most convincing evidence coming from studies of sex-typical play. Girls exposed to unusually high levels of androgens prenatally, because they have the genetic disorder, congenital adrenal hyperplasia (CAH), show increased preferences for toys and activities usually preferred by boys, and for male playmates, and decreased preferences for toys and activities usually preferred by girls. Normal variability in androgen prenatally also has been related to subsequent sex-typed play behavior in girls, and nonhuman primates have been observed to show sex-typed preferences for human toys. These findings suggest that androgen during early development influences childhood play behavior in humans at least in part by altering brain development.
But the author of the comic strip would like us to believe that gender roles are a result of watching the wrong stuff on TV:
And in which culture and media essentially portray women as mothers and wives, while men are heroes who go on fascinating adventures away from home.
I don’t know about you, but I grew up in the Bad Old Days of the 80s when She-Ra, Princess of Power, was kicking butt on TV; little girls were being magically transported to Ponyland to fight evil monsters; and Rainbow Brite defeated the evil King of Shadows and saved the Color Kids.
If you’re older than me, perhaps you grew up watching Wonder Woman (first invented in 1941) and Princess Leia; and if you’re younger, Dora the Explorer and Katniss Everdeen.
If you can’t find adventurous female characters in movies or TV, YOU AREN’T LOOKING.
I mentioned this recently: it’s like the Left has no idea what the past–anytime before last Tuesday–actually contained. Somehow the 60s, 70s, 80s, 90s, and 2000s have entirely disappeared, and they live in a timewarp where we are connected directly to the media and gender norms of over half a century ago.
Enough. The Guardian comic is a load of entitled whining from someone who actually thinks that other people are morally obligated to try to read her mind. She has the maturity of a bratty teenager (“You should have known I hate this band!”) and needs to learn how to actually communicate with others instead of complaining that it’s everyone else who has a problem.
For example, there is about 25% overlap between the human genome and that of grapes. (And we have fewer genes than grapes!) So some caution should be exercised before reading too much into percentages of genomic correspondence across species. I doubt, after all, that you consider yourself one-quarter grape. … canine and bovine species generally exhibit about an 85% rate of genomic correspondence with humans. … small changes in genetic makeup can, among other influences, lead to large changes in brain size.
On the development of numbers:
After all, for the vast majority of our species’ existence, we lived as hunters and gatherers in Africa … A reasonable interpretation of the contemporary distribution of cultural and number-system types, then, is that humans did not rely on complex number system for the bulk of their history. We can also reasonably conclude that transitions to larger, more sedentary, and more trade-based cultures helped pressure various groups to develop more involved numerical technologies. … Written numerals, and writing more generally, were developed first in the Fertile Crescent after the agricultural revolution began there. … These pressures ultimately resulted in numerals and other written symbols, such as the clay-token based numerals … The numerals then enabled new forms of agriculture and trade that required the exact discrimination and representation of quantities. The ancient Mesopotamian case is suggestive, then, of the motivation for the present-day correlation between subsistence and number types: larger agricultural and trade-based economies require numerical elaboration to function. …
Intriguingly, though, the same may be true of Chinese writing, the earliest samples of which date to the Shang Dynasty and are 3,000 years old. The most ancient of these samples are oracle bones. These bones were inscribed with numerals quantifying such items as enemy prisoners, birds and animals hunted, and sacrificed animals. … Ancient writing around the world is numerically focused.
Changes in the Jungle as population growth makes competition for resources more intense and forces people out of their traditional livelihoods:
Consider the case of one of my good friends, a member of an indigenous group known as the Karitiana. … Paulo spent the majority of his childhood, in the 1980s and 1990s in the largest village of his people’s reservation. … While some Karitiana sought to make a living in nearby Porto Velho, many strived to maintain their traditional way of life on their reservation. At the time this was feasible, and their traditional subsistence strategies of hunting, gathering, and horticulture could be realistically practiced. Recently, however, maintaining their conventional way of life has become a less tenable proposition. … many Karitiana feel they have little choice but to seek employment in the local Brazilian economy… This is certainly true of Paulo. He has been enrolled in Brazilian schools for some time, has received some higher education, and is currently employed by a governmental organization. To do these things, of course, Paulo had to learn Portuguese grammar and writing. And he had to learn numbers and math, also. In short, the socioeconomic pressures he has felt to acquire the numbers of another culture are intense.
Everett cites a statistic that >90% of the world’s approximately 7,000 languages are endangered.
They are endangered primarily because people like Paulo are being conscripted into larger nation-states, gaining fluency in more economically viable languages. … From New Guinea to Australia to Amazonia and elsewhere, the mathematizing of people is happening.
On the advantages of different number systems:
Recent research also suggests that the complexity of some non-linguistic number systems has been underappreciated. Many counting boards and abaci that have been, and still are, in use across the world’s cultures present clear advantages to those using them … the abacus presents some cognitive advantages. That is because, research now suggests, children who are raised using the abacus develop a “mental abacus” with time. … According to recent cross-cultural findings, practitioners of abacus-based mathematical strategies outperform those unfamiliar with such strategies, at least in some mathematical tasks. The use of the Soroban abacus has, not coincidentally, now been adopted in many schools throughout Asia.
I suspect these higher math scores are more due to the mental abilities of the people using the abacus than the abacus itself. I have also just ordered an abacus.
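For anyone who, like me, had to look up how a soroban actually encodes numbers: each column holds one "heaven" bead worth five and four "earth" beads worth one, so every decimal digit decomposes as 5a + b with a in {0, 1} and b in {0..4}. A minimal sketch (the function names are my own, not from any abacus curriculum or software):

```python
# Minimal sketch of soroban (Japanese abacus) representation.
# Each column encodes one decimal digit as (heaven, earth):
# one heaven bead worth 5 (0 or 1 set) and four earth beads
# worth 1 each (0-4 set), so digit = 5*heaven + earth.

def soroban_digits(n: int) -> list[tuple[int, int]]:
    """Return (heaven, earth) bead counts per column, most significant first."""
    return [(d // 5, d % 5) for d in map(int, str(n))]

def soroban_value(columns: list[tuple[int, int]]) -> int:
    """Invert the representation: read the number back off the beads."""
    value = 0
    for heaven, earth in columns:
        value = value * 10 + 5 * heaven + earth
    return value

cols = soroban_digits(605)
print(cols)                 # [(1, 1), (0, 0), (1, 0)] -> digits 6, 0, 5
assert soroban_value(cols) == 605
```

The "mental abacus" research suggests that practiced children manipulate exactly this kind of bead configuration in their heads, which is plausibly why their arithmetic speeds up.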
… in 2015 the world’s oldest known unambiguous inscription of a circular zero was rediscovered in Cambodia. The zero in question, really a large dot, serves as a placeholder in the ancient Khmer numeral for 605. It is inscribed on a stone tablet, dating to 683 CE, that was found only kilometers from the faces of Bayon and other ruins of Angkor Wat and Angkor Thom. … the Maya also developed a written form for zero, and the Inca encoded the concept in their Quipu.
In 1202, Fibonacci wrote the Book of Calculation, which promoted the use of the superior Arabic (yes Hindu) numerals (zero included) over the old Roman ones. Just as the introduction of writing jump-started the Cherokee publishing industry, so the introduction of superior numerals probably helped jump-start the Renaissance.
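To see why Fibonacci’s case was so easy to make, compare the two systems: positional digits with a zero placeholder can be added column by column, while Roman numerals are additive/subtractive symbol strings with no place value at all. Here is a toy converter (my own sketch, not a historical algorithm) that makes the difference concrete:

```python
# Toy Roman-numeral converter, illustrating why positional notation
# with a zero placeholder is easier to compute with: Roman numerals
# have no place value, so even simple addition requires decoding
# the symbol strings back into quantities first.

ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    """Greedily spend the largest symbol values first."""
    out = []
    for value, symbol in ROMAN:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

# 1202, the year of Fibonacci's Book of Calculation:
print(to_roman(1202))   # MCCII

# There is no Roman symbol for zero at all; positional numerals need
# a placeholder just to tell 62 (LXII) apart from 602 (DCII).
```

Long division in the Roman system is left as an exercise for the reader, which is more or less Fibonacci’s point.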
Cities and the rise of organized religion:
…although creation myths, animistic practices, and other forms of spiritualism are universal or nearly universal, large-scale hierarchical religions are restricted to relatively few cultural lineages. Furthermore, these religions… developed only after people began living in larger groups and settlements because of their agricultural lifestyles. … A phalanx of scholars has recently suggested that the development of major hierarchical religions, like the development of hierarchical governments, resulted from the agglomeration of people in such places. …
Organized religious beliefs, with moral-enforcing deities and priest castes, were a by-product of the need for large groups of people to cooperate via shared morals and altruism. As the populations of cultures grew after the advent of agricultural centers… individuals were forced to rely on shared trust with many more individuals, including non-kin, than was or is the case in smaller groups like bands or tribes. … Since natural selection is predicated on the protection of one’s genes, in-group altruism and sacrifice are easier to make sense of in bands and tribes. But why would humans in much larger populations–humans who have no discernible genetic relationship… cooperate with these other individuals in their own culture? … some social mechanism had to evolve so that larger cultures would not disintegrate due to competition among individuals and so that many people would not freeload off the work of others. One social mechanism that fosters prosocial and cooperative behavior is an organized religion based on shared morals and omniscient deities capable of keeping track of the violation of such morals. …
When Moses descended from Mt. Sinai with his stone tablets, they were inscribed with ten divine moral imperatives. … Why ten? … Here is an eleventh commandment that could likely be uncontroversially adopted by many people: “thou shalt not torture.” … But then the list would appear to lose some of its rhetorical heft. “Eleven Commandments” almost hints of a satirical deity.
Technically there are 613 commandments, but that’s not nearly as catchy as the Ten Commandments–inadvertently proving Everett’s point.
Overall, I found this book frustrating and repetitive, but there were some good parts. I’ve left out most of the discussion of the Piraha and similar cultures, and the rather fascinating case of Nicaraguan homesigners (“homesigners” are deaf people who were never taught a formal sign language but made up their own.) If you’d like to learn more about them, you might want to look up the book at your local library.
Crohn’s is an inflammatory disease of the digestive tract involving diarrhea, vomiting, internal lesions, pain, and severe weight loss. Left untreated, Crohn’s can lead to death through direct starvation/malnutrition, infections caused by the intestinal walls breaking down and spilling feces into the rest of the body, or a whole host of other horrible symptoms, like pyoderma gangrenosum–basically your skin just rotting off.
Crohn’s disease has no known cause and no cure, though several treatments have proven effective at putting it into remission–at least temporarily.
The disease appears to be triggered by a combination of environmental, bacterial, and genetic factors–about 70 genes have been identified so far that appear to contribute to an individual’s chance of developing Crohn’s, but no gene has been found yet that definitely triggers it. (The siblings of people who have Crohn’s are more likely than non-siblings to also have it, and identical twins of Crohn’s patients have a 55% chance of developing it.) A variety of environmental factors, such as living in a first world country (parasites may be somewhat protective against the disease), smoking, or eating lots of animal protein also correlate with Crohn’s, but since only 3.2/1000 people even in the West have it, these obviously don’t trigger the disease in most people.
Crohn’s appears to be a kind of over-reaction of the immune system, though not specifically an auto-immune disorder, which suggests that a pathogen of some sort is probably involved. Most people are probably able to fight off this pathogen, but people with a variety of genetic issues may have more trouble–according to Wikipedia, “There is considerable overlap between susceptibility loci for IBD and mycobacterial infections.” Mycobacteria are a genus of bacteria that includes the species that cause tuberculosis and leprosy. A variety of bacteria–including specific strains of E. coli, Yersinia, Listeria, and Mycobacterium avium subspecies paratuberculosis–are found in the intestines of Crohn’s sufferers at higher rates than in the intestines of non-sufferers (intestines, of course, are full of all kinds of bacteria.)
Crohn’s treatment depends on the severity of the case and specific symptoms, but often includes a course of antibiotics, (especially if the patient has abscesses,) tube feeding (in acute cases where the sufferer is having trouble digesting food,) and long-term immune-system suppressants such as prednisone, methotrexate, or infliximab. In severe cases, damaged portions of the intestines may be cut out. Before the development of immunosuppressant treatments, sufferers often progressively lost more and more of their intestines, with predictably unpleasant results, like no longer having a functioning colon. (70% of Crohn’s sufferers eventually have surgery.)
A similar disease, Johne’s, infects cattle. Johne’s is caused by Mycobacterium avium subspecies paratuberculosis, (hereafter just MAP). MAP typically infects calves at birth, transmitted via infected feces from their mothers, incubates for two years, and then manifests as diarrhea, malnutrition, dehydration, wasting, starvation, and death. Luckily for cows, there’s a vaccine, though any infectious disease in a herd is a problem for farmers.
If you’re thinking that “paratuberculosis” sounds like “tuberculosis,” you’re correct. When scientists first isolated it, they thought the bacteria looked rather like tuberculosis, hence the name, “tuberculosis-like.” The scientists’ instincts were correct, and it turns out that MAP is in the same bacterial genus as tuberculosis and leprosy (though it may be more closely related to leprosy than TB.) (“Genus” is one step up from “species;” our species is Homo sapiens; our genus, Homo, we share with Homo neanderthalensis, Homo erectus, etc., but chimps and gorillas are not in the Homo genus.)
The intestines of cattle who have died of MAP look remarkably like the intestines of people suffering from advanced Crohn’s disease.
MAP can actually infect all sorts of mammals, not just cows, it’s just more common and problematic in cattle herds. (Sorry, we’re not getting through this post without photos of infected intestines.)
So here’s how it could work:
The MAP bacteria–possibly transmitted via milk or meat products–is fairly common and infects a variety of mammals. Most people who encounter it fight it off with no difficulty (or perhaps have a short bout of diarrhea and then recover.)
A few people, though, have genetic issues that make it harder for them to fight off the infection. For example, Crohn’s sufferers produce less intestinal mucus, which normally acts as a barrier between the intestines and all of the stuff in them.
Interestingly, parasite infections can increase intestinal mucus (some parasites feed on mucus), which in turn is protective against other forms of infection; decreasing parasite load can increase the chance of other intestinal infections.
Once MAP enters the intestinal walls, the immune system attempts to fight it off, but a genetic defect in autophagy results in the immune cells themselves getting infected. The body responds to the signs of infection by sending more immune cells to fight it, which subsequently also get infected with MAP, triggering the body to send even more immune cells. These lumps of infected cells become the characteristic ulcerations and lesions that mark Crohn’s disease and eventually leave the intestines riddled with inflamed tissue and holes.
The most effective treatments for Crohn’s, like Infliximab, don’t target infection but the immune system. They work by interrupting the immune system’s feedback cycle so that it stops sending more cells to the infected area, giving the already infected cells a chance to die. It doesn’t cure the disease, but it does give the intestines time to recover.
There were 70 reported cases of tuberculosis after treatment with infliximab for a median of 12 weeks. In 48 patients, tuberculosis developed after three or fewer infusions. … Of the 70 reports, 64 were from countries with a low incidence of tuberculosis. The reported frequency of tuberculosis in association with infliximab therapy was much higher than the reported frequency of other opportunistic infections associated with this drug. In addition, the rate of reported cases of tuberculosis among patients treated with infliximab was higher than the available background rates.
This is because infliximab actively suppresses the immune system’s ability to fight diseases in the TB family.
Luckily, if you live in the first world and aren’t in prison, you’re unlikely to catch TB–only about 5-10% of the US population tests positive for TB, compared to 80% in many African and Asian countries. (In other words, increased immigration from these countries will absolutely put Crohn’s sufferers at risk of dying.)
Crohn’s, TB, and leprosy share a fair number of similarities; one is that they are all very slow diseases that can take years to finally kill you. By contrast, other deadly diseases, like smallpox, cholera, and Yersinia pestis (plague), spread and kill extremely quickly. Within about two weeks, you’ll definitely know whether your plague infection is going to kill you, whereas you can have leprosy for 20 years before you even notice it.
Tuberculosis is classified as one of the granulomatous inflammatory diseases. Macrophages, T lymphocytes, B lymphocytes, and fibroblasts aggregate to form granulomas, with lymphocytes surrounding the infected macrophages. When other macrophages attack the infected macrophage, they fuse together to form a giant multinucleated cell in the alveolar lumen. The granuloma may prevent dissemination of the mycobacteria and provide a local environment for interaction of cells of the immune system. However, more recent evidence suggests that the bacteria use the granulomas to avoid destruction by the host’s immune system. … In many people, the infection waxes and wanes.
Crohn’s also waxes and wanes. Many sufferers experience flare-ups of the disease, during which they may have to be hospitalized, tube-fed, and put through another round of antibiotics or resection (surgical removal of part of the intestine) before they improve–until the disease flares up again.
Leprosy is also marked by lesions, though of course so are dozens of other diseases.
Note: Since Crohn’s is a complex, multi-factorial disease, there may be more than one bacterium or pathogen that could infect people and create similar results. Alternatively, Crohn’s sufferers may simply have intestines that are really bad at fighting off all sorts of diseases, as a side effect of Crohn’s, not a cause, resulting in a variety of unpleasant infections.
The MAP hypothesis suggests several possible treatment routes:
Improving the intestinal mucus, perhaps via parasites or medicines derived from parasites
Improving the intestinal microbe balance
Antibiotics that treat MAP
Anti-MAP vaccine similar to the one for Johne’s disease in cattle
To determine how the worms could be our frenemies, Cadwell and colleagues tested mice with the same genetic defect found in many people with Crohn’s disease. Mucus-secreting cells in the intestines malfunction in the animals, reducing the amount of mucus that protects the gut lining from harmful bacteria. Researchers have also detected a change in the rodents’ microbiome, the natural microbial community in their guts. The abundance of one microbe, an inflammation-inducing bacterium in the Bacteroides group, soars in the mice with the genetic defect.
The researchers found that feeding the rodents one type of intestinal worm restored their mucus-producing cells to normal. At the same time, levels of two inflammation indicators declined in the animals’ intestines. In addition, the bacterial lineup in the rodents’ guts shifted, the team reports online today in Science. Bacteroides’s numbers plunged, whereas the prevalence of species in a different microbial group, the Clostridiales, increased. A second species of worm also triggers similar changes in the mice’s intestines, the team confirmed.
To check whether helminths cause the same effects in people, the scientists compared two populations in Malaysia: urbanites living in Kuala Lumpur, who harbor few intestinal parasites, and members of an indigenous group, the Orang Asli, who live in a rural area where the worms are rife. A type of Bacteroides, the proinflammatory microbes, predominated in the residents of Kuala Lumpur. It was rarer among the Orang Asli, where a member of the Clostridiales group was plentiful. Treating the Orang Asli with drugs to kill their intestinal worms reversed this pattern, favoring Bacteroides species over Clostridiales species, the team documented.
This sounds unethical unless they were merely tagging along with another team of doctors who were de-worming the Orang Asli for normal health reasons and didn’t intend to potentially inflict Crohn’s on anyone. Nevertheless, it’s an interesting study.
At any rate, so far they haven’t managed to produce an effective medicine from parasites, possibly in part because people think parasites are icky.
But if parasites aren’t disgusting enough for you, there’s always the option of directly changing the gut bacteria: fecal microbiota transplants (FMT). A fecal transplant is exactly what it sounds like: you take the regular feces out of the patient and put in new, fresh feces from an uninfected donor. (When your other option is pooping into a bag for the rest of your life because your colon was removed, swallowing a few poop pills doesn’t sound so bad.) E.g., Fecal microbiota transplant for refractory Crohn’s:
Approximately one-third of patients with Crohn’s disease do not respond to conventional treatments, and some experience significant adverse effects, such as serious infections and lymphoma, and many patients require surgery due to complications. … Herein, we present a patient with Crohn’s colitis in whom biologic therapy failed previously, but clinical remission and endoscopic improvement was achieved after a single fecal microbiota transplantation infusion.
Antibiotics are another potential route. Redhill Biopharma is conducting a phase III clinical study of antibiotics designed to fight MAP in Crohn’s patients, and is expected to release some of its results in April.
Mechanism of action: The vaccine is what is called a ‘T-cell’ vaccine. T-cells are a type of white blood cell -an important player in the immune system- in particular, for fighting against organisms that hide INSIDE the body’s cells –like MAP does. Many people are exposed to MAP but most don’t get Crohn’s –Why? Because their T-cells can ‘see’ and destroy MAP. In those who do get Crohn’s, the immune system has a ‘blind spot’ –their T-cells cannot see MAP. The vaccine works by UN-BLINDING the immune system to MAP, reversing the immune dysregulation and programming the body’s own T-cells to seek out and destroy cells containing MAP. For general information, there are two informative videos about T Cells and the immune system below.
Efficacy: In extensive tests in animals (in mice and in cattle), 2 shots of the vaccine spaced 8 weeks apart proved to be a powerful, long-lasting stimulant of immunity against MAP. To read the published data from the trial in mice, click here. To read the published data from the trial in cattle, click here.
Dr. Borody (who was influential in the discovery that ulcers are caused by the bacterium H. pylori, not stress) has had amazing success treating Crohn’s patients with a combination of infliximab, anti-MAP antibiotics, and hyperbaric oxygen. Here are two of his before-and-after photos of the intestines of a 31-year-old Crohn’s sufferer:
Here are some more interesting articles on the subject:
Last week, Davis and colleagues in the U.S. and India published a case report in Frontiers of Medicine http://journal.frontiersin.org/article/10.3389/fmed.2016.00049/full . The report described a single patient, clearly infected with MAP, with the classic features of Johne’s disease in cattle, including the massive shedding of MAP in his feces. The patient was also ill with clinical features that were indistinguishable from the clinical features of Crohn’s. In this case though, a novel treatment approach cleared the patient’s infection.
The patient was treated with antibiotics known to be effective for tuberculosis, which then eliminated the clinical symptoms of Crohn’s disease, too.
Through luck, hard work, good fortune, perseverance, and wonderful doctors, I seem to be one of the few people in the world who can claim to be “cured” of Crohn’s Disease. … In brief, I was treated for 6 years with medications normally used for multidrug resistant TB and leprosy, under the theory that a particular germ causes Crohn’s Disease. I got well, and have been entirely well since 2004. I do not follow a particular diet, and my recent colonoscopies and blood work have shown that I have no inflammation. The rest of these 3 blogs will explain more of the story.
What about removing Johne’s disease from the food supply? Assuming Johne’s is the culprit, this may be hard to do (it’s pretty contagious in cattle, can lie dormant for years, and survives cooking), but drinking ultrapasteurized milk may be protective, especially for people who are susceptible to the disease.
However… there are also studies that contradict the MAP theory. For example, a recent study of the rate of Crohn’s disease in people exposed to Johne’s disease found no correlation. (Crohn’s is a pretty rare condition, and the survey found only 7 total cases–small enough that random chance could be a factor–but we are talking about people who probably got very up close and personal with feces infected with MAP.)
Logistic regression showed no significant association with measures of potential contamination of water sources with MAP, water intake, or water treatment. Multivariate analysis showed that consumption of pasteurized milk (per kg/month: odds ratio (OR) = 0.82, 95% confidence interval (CI): 0.69, 0.97) was associated with a reduced risk of Crohn’s disease. Meat intake (per kg/month: OR = 1.40, 95% CI: 1.17, 1.67) was associated with a significantly increased risk of Crohn’s disease, whereas fruit consumption (per kg/month: OR = 0.78, 95% CI: 0.67, 0.92) was associated with reduced risk.
So even if Crohn’s is caused by MAP or something similar, it appears that people aren’t catching it from milk.
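The odds ratios and confidence intervals quoted from the study follow a standard calculation on a 2×2 contingency table. As a minimal sketch, here is how an OR and its approximate 95% CI are derived; the counts below are invented for illustration, since the paper’s raw tables aren’t reproduced here.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and ~95% CI for a 2x2 table:
       a = exposed cases,   b = exposed controls,
       c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    # Woolf's method: standard error of log(OR)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts, purely for illustration
or_, lo, hi = odds_ratio_ci(20, 80, 15, 85)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

An OR below 1 with a CI that excludes 1 (like the milk and fruit results above) indicates a protective association; a CI that straddles 1 (as in this toy example) means no significant association either way.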
There are other theories about what causes Crohn’s–these folks, for example, think it’s related to consumption of GMO corn. Perhaps MAP has only been found in the intestines of Crohn’s patients because people with Crohn’s are really bad at fighting off infections. Perhaps the whole thing is caused by weird gut bacteria, or not enough parasites, insufficient Vitamin D, or industrial pollution.
New tests on two ancient teeth found in a cave in Indonesia more than 120 years ago have established that early modern humans arrived in Southeast Asia at least 20,000 years earlier than scientists previously thought, according to a new study. …
The findings push back the date of the earliest known modern human presence in tropical Southeast Asia to between 63,000 and 73,000 years ago. The new study also suggests that early modern humans could have made the crossing to Australia much earlier than the commonly accepted time frame of 60,000 to 65,000 years ago.
I would like to emphasize that nothing based on a couple of teeth is conclusive, “settled,” or “proven” science. Samples can get contaminated, machines make errors, people play tricks–in the end, we’re looking for the weight of the evidence.
I am personally of the opinion that there were (at least) two ancient human migrations into south east Asia, but only time will tell if I am correct.
We investigated the genetic architecture of family relationship satisfaction and friendship satisfaction in the UK Biobank. …
In the DSM-5, difficulties in social functioning is one of the criteria for diagnosing conditions such as autism, anorexia nervosa, schizophrenia, and bipolar disorder. However, little is known about the genetic architecture of social relationship satisfaction, and if social relationship dissatisfaction genetically contributes to risk for psychiatric conditions. …
We present the results of a large-scale genome-wide association study of social relationship satisfaction in the UK Biobank measured using family relationship satisfaction and friendship satisfaction. Despite the modest phenotypic correlations, there was a significant and high genetic correlation between the two phenotypes, suggesting a similar genetic architecture between the two phenotypes.
Note: the two “phenotypes” here are “family relationship satisfaction” and “friendship satisfaction.”
We first investigated if the two phenotypes were genetically correlated with psychiatric conditions. As predicted, most if not all psychiatric conditions had a significant negative correlation for the two phenotypes. … We observed significant negative genetic correlation between the two phenotypes and a large cross-condition psychiatric GWAS38. This underscores the importance of social relationship dissatisfaction in psychiatric conditions. …
In other words, people with mental illnesses generally don’t have a lot of friends nor get along with their families.
One notable exception is the negative genetic correlation between measures of cognition and the two phenotypes. Whilst subjective wellbeing is positively genetically correlated with measures of cognition, we identify a small but statistically significant negative correlation between measures of cognition and the two phenotypes.
Are they saying that smart people have fewer friends? Or that dumber people are happier with their friends and families? I think they are clouding this finding in intentionally obtuse language.
A recent study highlighted that people with very high IQ scores tend to report lower satisfaction with life with more frequent socialization.
Oh, I think I read that one. It’s not the socialization per se that’s the problem, but spending time away from the smart person’s intellectual activities. For example, I enjoy discussing the latest genetics findings with friends, but I don’t enjoy going on family vacations because they are a lot of work that does not involve genetics. (This is actually something my relatives complain about.)
…alleles that increase the risk for schizophrenia are in the same haplotype as alleles that decrease friendship satisfaction. The functional consequences of this locus must be formally tested. …
Loss of function mutations in these genes lead to severe biochemical consequences, and are implicated in several neuropsychiatric conditions. For example, de novo loss of function mutations in pLI intolerant genes confers significant risk for autism. Our results suggest that pLI > 0.9 genes contribute to psychiatric risk through both common and rare genetic variation.
It was only two years ago that researchers found the first ancient human genome in Africa: a skeleton in a cave in Ethiopia yielded DNA that turned out to be 4,500 years old.
On Thursday, an international team of scientists reported that they had recovered far older genes from bone fragments in Malawi dating back 8,100 years. The researchers also retrieved DNA from 15 other ancient people in eastern and southern Africa, and compared the genes to those of living Africans.
We assembled genome-wide data from 16 prehistoric Africans. We show that the anciently divergent lineage that comprises the primary ancestry of the southern African San had a wider distribution in the past, contributing approximately two-thirds of the ancestry of Malawi hunter-gatherers ∼8,100–2,500 years ago and approximately one-third of the ancestry of Tanzanian hunter-gatherers ∼1,400 years ago.
The San are also known as the Bushmen, a famous group of recent hunter-gatherers from southern Africa.
We document how the spread of farmers from western Africa involved complete replacement of local hunter-gatherers in some regions…
…and we track the spread of herders by showing that the population of a ∼3,100-year-old pastoralist from Tanzania contributed ancestry to people from northeastern to southern Africa, including a ∼1,200-year-old southern African pastoralist…
Whereas the two individuals buried in ∼2,000 BP hunter-gatherer contexts in South Africa share ancestry with southern African Khoe-San populations in the PCA, 11 of the 12 ancient individuals who lived in eastern and south-central Africa between ∼8,100 and ∼400 BP form a gradient of relatedness to the eastern African Hadza on the one hand and southern African Khoe-San on the other (Figure 1A).
The Hadza are a hunter-gatherer group from Tanzania who are not obviously related to any other people. Their language has traditionally been classed alongside the languages of the KhoiSan/Bushmen people because they all contain clicks, but the languages otherwise have very little in common and Hadza appears to be a language isolate, like Basque.
The genetic cline correlates to geography, running along a north-south axis with ancient individuals from Ethiopia (∼4,500 BP), Kenya (∼400 BP), Tanzania (both ∼1,400 BP), and Malawi (∼8,100–2,500 BP), showing increasing affinity to southern Africans (both ancient individuals and present-day Khoe-San). The seven individuals from Malawi show no clear heterogeneity, indicating a long-standing and distinctive population in ancient Malawi that persisted for at least ∼5,000 years (the minimum span of our radiocarbon dates) but which no longer exists today. …
We find that ancestry closely related to the ancient southern Africans was present much farther north and east in the past than is apparent today. This ancient southern African ancestry comprises up to 91% of the ancestry of Khoe-San groups today (Table S5), and also 31% ± 3% of the ancestry of Tanzania_Zanzibar_1400BP, 60% ± 6% of the ancestry of Malawi_Fingira_6100BP, and 65% ± 3% of the ancestry of Malawi_Fingira_2500BP (Figure 2A). …
Both unsupervised clustering (Figure 1B) and formal ancestry estimation (Figure 2B) suggest that individuals from the Hadza group in Tanzania can be modeled as deriving all their ancestry from a lineage related deeply to ancient eastern Africans such as the Ethiopia_4500BP individual …
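Formal ancestry estimation of the kind quoted above (e.g. “60% ± 6% of the ancestry of Malawi_Fingira_6100BP”) amounts to modeling a target population’s allele frequencies as a weighted mix of source populations. The paper’s actual methods (qpAdm and related f-statistics) are far more sophisticated, but a toy least-squares version with invented allele frequencies shows the core idea:

```python
import numpy as np

# Invented allele frequencies at 8 SNPs for two "source" populations
source_a = np.array([0.10, 0.80, 0.35, 0.60, 0.20, 0.90, 0.50, 0.05])
source_b = np.array([0.70, 0.20, 0.75, 0.10, 0.60, 0.30, 0.90, 0.55])

# Simulate a "target" population as a 60/40 mix of A and B, plus noise
rng = np.random.default_rng(0)
target = 0.6 * source_a + 0.4 * source_b + rng.normal(0, 0.01, 8)

# Brute-force the mixing weight w that best explains the target
ws = np.linspace(0, 1, 1001)
errs = [np.sum((w * source_a + (1 - w) * source_b - target) ** 2) for w in ws]
best_w = float(ws[int(np.argmin(errs))])
print(f"estimated ancestry from source A: {best_w:.2f}")
```

With real data the same logic runs over hundreds of thousands of SNPs and more than two candidate sources, which is how estimates like “one-third ancient-San ancestry in Tanzanian hunter-gatherers” are obtained.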
So what’s up with the Tanzanian expansion mentioned in the summary?
Western-Eurasian-related ancestry is pervasive in eastern Africa today … and the timing of this admixture has been estimated to be ∼3,000 BP on average… We found that the ∼3,100 BP individual… associated with a Savanna Pastoral Neolithic archeological tradition, could be modeled as having 38% ± 1% of her ancestry related to the nearly 10,000-year-old pre-pottery farmers of the Levant … These results could be explained by migration into Africa from descendants of pre-pottery Levantine farmers or alternatively by a scenario in which both pre-pottery Levantine farmers and Tanzania_Luxmanda_3100BP descend from a common ancestral population that lived thousands of years earlier in Africa or the Near East. We fit the remaining approximately two-thirds of Tanzania_Luxmanda_3100BP as most closely related to the Ethiopia_4500BP…
…present-day Cushitic speakers such as the Somali cannot be fit simply as having Tanzania_Luxmanda_3100BP ancestry. The best fitting model for the Somali includes Tanzania_Luxmanda_3100BP ancestry, Dinka-related ancestry, and 16% ± 3% Iranian-Neolithic-related ancestry (p = 0.015). This suggests that ancestry related to the Iranian Neolithic appeared in eastern Africa after earlier gene flow related to Levant Neolithic populations, a scenario that is made more plausible by the genetic evidence of admixture of Iranian-Neolithic-related ancestry throughout the Levant by the time of the Bronze Age …and in ancient Egypt by the Iron Age …
There is then a discussion of possible models of ancient African population splits (were the Bushmen the first? How long have they been isolated?). I suspect the more ancient African DNA we uncover, the more complicated the tree will become, just as in Europe and Asia we’ve discovered Neanderthal and Denisovan admixture.
They also compared genomes to look for genetic adaptations and found evidence for selection for taste receptors and “response to radiation” in the Bushmen, which the authors note “could be due to exposure to sunlight associated with the life of the ‡Khomani and Ju|’hoan North people in the Kalahari Basin, which has become a refuge for hunter-gatherer populations in the last millennia due to encroachment by pastoralist and agriculturalist groups.”
(The Bushmen are lighter than Bantus, with a more golden or tan skin tone.)
They also found evidence of selection for short stature among the Pygmies (which isn’t really surprising to anyone, unless you thought they had acquired their height by admixture with another very short group of people).
Overall, this is a great paper and I encourage you to RTWT, especially the pictures/graphs.
Examining ethnically diverse African genomes, we identify variants in or near SLC24A5, MFSD12, DDB1, TMEM138, OCA2 and HERC2 that are significantly associated with skin pigmentation. Genetic evidence indicates that the light pigmentation variant at SLC24A5 was introduced into East Africa by gene flow from non-Africans. At all other loci, variants associated with dark pigmentation in Africans are identical by descent in southern Asian and Australo-Melanesian populations. Functional analyses indicate that MFSD12 encodes a lysosomal protein that affects melanogenesis in zebrafish and mice, and that mutations in melanocyte-specific regulatory regions near DDB1/TMEM138 correlate with expression of UV response genes under selection in Eurasians.
I’ve had an essay on the evolution of African skin tones sitting in my draft folder for ages because this research hadn’t been done. There’s plenty of research on European and Asian skin tones (skin appears to have lightened significantly around 10,000 years ago in Europeans), but much less on Africans. Luckily for me, this paper fixes that.
Looks like SLC24A5 is related to that Levantine/Iranian back-migration into Africa documented in the first paper.
The Negritos are a fascinating group of short-statured, dark-skinned, frizzy-haired peoples from southeast Asia–chiefly the Andaman Islands, Malaysia, Philippines, and Thailand. (Spelling note: “Negritoes” is also an acceptable plural, and some sources use the Spanish Negrillos.)
Because of their appearance, they have long been associated with African peoples, especially the Pygmies. Pygmies are formally defined as any group whose adult men average 4’11” or less, and the term is almost always used specifically to refer to African Pygmies; the term pygmoid is sometimes used for groups whose men average 5’1″ or below, including the Negritos. (Some of the Bushmen tribes, Bolivians, Amazonians, the remote Taron, and a variety of others may also be pygmoid by this definition.)
However, genetic testing has long indicated that they, along with other Melanesians and Australian Aborigines, are more closely related to other east Asian peoples than any African groups. In other words, they’re part of the greater Asian race, albeit a distant branch of it.
But how distant? And are the various Negrito groups closely related to each other, or do there just happen to be a variety of short groups of people in the area, perhaps due to convergent evolution triggered by insular dwarfism?
They found that the Negrito groups they studied “are basal to other East and Southeast Asians,” (basal: forming the bottom layer or base. In this case, it means they split off first,) “and that they diverged from West Eurasians at least 38,000 years ago.” (West Eurasians: Caucasians, consisting of Europeans, Middle Easterners, North Africans, and people from India.) “We also found relatively high traces of Denisovan admixture in the Philippine Negritos, but not in the Malaysian and Andamanese groups.” (Denisovans are a group of extinct humans similar to Neanderthals, but we’ve yet to find many of their bones. Just as Neanderthal DNA shows up in non-Sub-Saharan-Africans, so Denisovan DNA shows up in Melanesians.)
Figure 1 (A) shows PC analysis of Andamanese, Malaysian, and Philippine Negritos, revealing three distinct clusters:
In the upper right-hand corner, the Aeta, Agta, Batak, and Mamanwa are Philippine Negritos. The Manobo are non-Negrito Filipinos.
In the lower right-hand corner are the Jehai, Kintak, and Batek, Malaysian Negritos.
And in the upper left, we have the extremely isolated Andamanese Onge and Jarawa Negritos.
(Phil-NN and Mly-NN I believe are Filipino and Malaysian Non-Negritos.)
You can find the same chart, but flipped upside down, with Papuan and Melanesian DNA in the supplemental materials. Of the three Negrito groups, the Papuans and Melanesians cluster closest to the Philippine Negritos, along the same line as the Malaysians.
By excluding the Andamanese (and Kintak) Negritos, Figure 1 (B) allows a closer look at the structure of the Philippine Negritos.
The Agta, Aeta, and Batak form a horizontal “comet-like pattern,” which likely indicates admixture with non-Negrito Philippine groups like the Manobo. The Mamanwa, who hail from a different part of the Philippines, also show this comet-like pattern, but along a different axis–likely because they intermixed with the different Filipinos who lived in their area. As you can see, there’s a fair amount of overlap–several of the Manobo individuals clustered with the Mamanwa Negritos, and the Batak cluster near several non-Negrito groups (see supplemental chart S4 B)–suggesting high amounts of mixing between these groups.
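The “comet-like” PCA pattern is exactly what admixture produces: admixed individuals project between the parental clusters. A small simulation with invented allele frequencies (not the study’s data) reproduces the effect:

```python
import numpy as np

rng = np.random.default_rng(42)
n_snps = 200

# Two invented parent populations with differing allele frequencies
freq_a = rng.uniform(0.1, 0.9, n_snps)
freq_b = np.clip(freq_a + rng.normal(0, 0.25, n_snps), 0.01, 0.99)

def sample(freqs, n):
    # Genotypes coded as 0/1/2 copies of the alternate allele
    return rng.binomial(2, freqs, size=(n, len(freqs))).astype(float)

pop_a = sample(freq_a, 30)
pop_b = sample(freq_b, 30)
# "Admixed" individuals drawn from a 50/50 blend of the two frequency sets
admixed = sample(0.5 * (freq_a + freq_b), 10)

geno = np.vstack([pop_a, pop_b, admixed])
geno -= geno.mean(axis=0)          # center each SNP
_, _, vt = np.linalg.svd(geno, full_matrices=False)
pc1 = geno @ vt[0]                 # projection on the first principal component

a_mean, b_mean, mix_mean = pc1[:30].mean(), pc1[30:60].mean(), pc1[60:].mean()
# The admixed group lands between the two parental clusters on PC1
print(f"pop A: {a_mean:.1f}  pop B: {b_mean:.1f}  admixed: {mix_mean:.1f}")
```

Individuals with varying degrees of admixture would smear out along the axis between the two clusters, producing the comet’s tail.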
ADMIXTURE analysis reveals a similar picture. The non-Negrito Filipino groups show up primarily as Orange. The Aeta, Agta, and Batak form a clear genetic cluster with each other and cline with the Orange Filipinos, with the Aeta the least admixed and Batak the most.
The white area on the chart isn’t a data error, but the unique signature of the geographically separated Mamanwa, who are highly mixed with the Manobo–and the Manobo, in turn, are mixed with them.
But this alone doesn’t tell us how ancient these populations are, nor whether they’re descended from one ancestral population. For this, the authors constructed several phylogenetic trees, based on all of the data at hand and assuming from 0 to 5 admixture events. The one on the left assumes 5 events, but for clarity only shows three of them. The Denisovan DNA is fascinating and well-documented elsewhere in Melanesian populations; that Malaysian and Philippine Negritos mixed with their neighbors is also known, supporting the choice of this tree as the most likely to be accurate.
Regardless of which you pick, all of the trees show very similar results, with the biggest difference being whether the Melanesians/Papuans split before or after the Andamanese/Malaysian Negritos.
In case you are unfamiliar with these trees, I’ll run down a quick explanation: this is a human family tree, with each split showing where one group of humans split off from the others and became an isolated group with its own unique genetic patterns. The orange and red lines mark places where formerly isolated groups met and interbred, producing children who are a mix of both. The first split in the tree, going back hundreds of thousands of years, is between all Homo sapiens (our species) and the Denisovans, a sister species related to the Neanderthals.
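The split order in trees like these is recovered from pairwise genetic distances. The paper uses more sophisticated methods (neighbor-joining on genome-wide data), but a toy UPGMA clustering on a hypothetical distance matrix–the population names and distances below are invented, not the paper’s data–shows how the branching order falls out:

```python
# Toy UPGMA clustering on a hypothetical genetic distance matrix.
dists = {
    ("French", "EastAsian"): 12.0,
    ("French", "Negrito"): 14.0,
    ("French", "Papuan"): 14.0,
    ("EastAsian", "Negrito"): 9.0,
    ("EastAsian", "Papuan"): 10.0,
    ("Negrito", "Papuan"): 8.0,
}

def d(x, y):
    # Distances are symmetric; look up either ordering
    return dists[(x, y)] if (x, y) in dists else dists[(y, x)]

def cluster_dist(c1, c2):
    # UPGMA: average distance over all cross-cluster pairs
    return sum(d(a, b) for a in c1 for b in c2) / (len(c1) * len(c2))

clusters = [("French",), ("EastAsian",), ("Negrito",), ("Papuan",)]
merges = []
while len(clusters) > 1:
    # Find and merge the closest pair of clusters
    i, j = min(
        ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
        key=lambda ij: cluster_dist(clusters[ij[0]], clusters[ij[1]]),
    )
    merges.append((clusters[i], clusters[j]))
    clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [clusters[i] + clusters[j]]

for a, b in merges:
    print("join", a, "+", b)
```

With these made-up distances, the two closest populations pair up first and the most distant one (here the French) joins last–mirroring how, in the real trees, the Out-of-Africa Caucasian/Asian split precedes the splits within the Asian clade.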
All humans outside of sub-Saharan Africans have some Neanderthal DNA because their ancestors met and interbred with Neanderthals on their way Out of Africa. Melanesians, Papuans, and some Negritos also have some Denisovan DNA, because their ancestors met and made children with members of this obscure human species, but Denisovan DNA is quite rare outside these groups.
Here is a map of the Denisovan DNA levels the authors found, with 4% of Papuan DNA hailing from Denisovan ancestors, and the Aeta nearly as high. By contrast, the Andamanese Negritos appear to have zero Denisovan DNA. Either the Andamanese split off before the ancestors of the Philippine Negritos and Papuans met the Denisovans, or all Denisovan DNA has been purged from their bloodlines, perhaps because it just wasn’t helpful for surviving on their islands.
Back to the tree: the second node is where the Biaka, a group of Pygmies from the Congo rainforest in central Africa, split off. Pygmy lineages are among the most ancient on earth, potentially going back over 200,000 years, well before any Homo sapiens had left Africa.
The next group that splits off from the rest of humanity are the Yoruba, a single ethnic group chosen to stand in for the entirety of the Bantus. Bantus are the group that you most likely think of when you think of black Africans, because over the past three millennia they have expanded greatly and conquered most of sub-Saharan Africa.
Next we have the Out of Africa event and the split between Caucasians (here represented by the French) and the greater Asian clade, which includes Australian Aborigines, Melanesians, Polynesians, Chinese, Japanese, Siberians, Inuit, and Native Americans.
The first groups to split off from the greater Asian clade (aka race) were the Andamanese and Malaysian Negritos, followed by the Papuans/Melanesians. Australian Aborigines are closely related to Papuans, as Australia and Papua New Guinea were connected in a single continent (called Sahul) back during the last Ice Age. Most of Indonesia and parts of the Philippines were also connected into a single landmass, called Sunda. Sensibly, people reached Sunda before Sahul. (Perhaps at that time the Andaman Islands, to the northwest of Sumatra, were also connected to the mainland, or at least closer to it.)
Irrespective of the exact order in which Melanesians and individual Negrito groups split off, they all split well before all of the other Asian groups in the area.
This is supported by legends told by the Filipinos themselves:
Legends, such as those involving the Ten Bornean Datus and the Binirayan Festival, tell tales about how, at the beginning of the 12th century when Indonesia and Philippines were under the rule of Indianized native kingdoms, the ancestors of the Bisaya escaped from Borneo from the persecution of Rajah Makatunaw. Led by Datu Puti and Datu Sumakwel and sailing with boats called balangays, they landed near a river called Suaragan, on the southwest coast of Panay, (the place then known as Aninipay), and bartered the land from an Ati [Negrito] headman named Polpolan and his son Marikudo for the price of a necklace and one golden salakot. The hills were left to the Atis while the plains and rivers to the Malays. This meeting is commemorated through the Ati-atihan festival.
The study’s authors estimate that the Negritos split from Europeans (Caucasians) around 30-38,000 years ago, and that the Malaysian and Philippine Negritos split around 13-15,000 years ago. (This all seems a bit tentative, IMO, especially since we have physical evidence of people in the area going back much further than that, and the authors themselves admit in the discussion that their time estimate may be too short.)
The authors also note:
Both our NJ (fig. 3A) and UPGMA (supplementary fig. S10) trees show that after divergence from Europeans, the ancestral Asians subsequently split into Papuans, Negritos and East Asians, implying a one-wave colonization of Asia. … This is in contrast to the study based on whole genome sequences that suggested Australian Aboriginal/Papuan first split from European/East Asians 60 kya, and later Europeans and East Asians diverged 40 kya (Malaspinas et al. 2016). This implies a two-wave migration into Asia…
The matter is still up for debate/more study.
In conclusion: All of the Negrito groups are likely descended from a common ancestor, (rather than having evolved from separate groups that happened to develop similar body types due to exposure to similar environments,) and were among the very first inhabitants of their regions. Despite their short stature, they are more closely related to other Asian groups (like the Chinese) than to African Pygmies. Significant mixing with their neighbors, however, is quickly obscuring their ancient lineages.
I wonder if all ancient human groups were originally short, with height being a recently evolved trait in some groups?
In closing, I’d like to thank Jinam et al for their hard work in writing this article and making it available to the public, their sponsors, and the unique Negrito peoples themselves for surviving so long.