The Evolution of Horses

The ancestors of horses–small, multi-toed quadrupeds–emerged around 50 million years ago, but horses as we know them (and their wild cousins) evolved from a common ancestor around 6 million years ago. Horses in those days were concentrated in North America, but spread via the Bering land bridge to Eurasia and Africa, where they differentiated into zebras, asses, and “wild” horses.

When humans first encountered horses, we ate them. American horses became extinct around 14,000-10,000 years ago, first in Beringia and then in the rest of the continent–coincidentally, about the time humans arrived here. The first known transition from hunting horses to herding and ranching them occurred around 6,000 years ago among the Botai of ancient Kazakhstan, not far from the proto-Indo-European homeland (though the Botai themselves do not appear to have been proto-Indo-Europeans). These herds were still managed for meat, of which the Botai ate tons, until some idiot teenager decided to impress his friends by riding one of the gol-dang things. Soon after, the proto-Indo-Europeans got the idea and went on a rampage, conquering Europe, Iran, and the Indian subcontinent (and then, a little later, North and South America, Africa, Australia, and India again). Those horses were useful.

Oddly, though, it appears that those Botai horses are not the ancestors of the modern horses people ride today–but instead are the ancestors of the Przewalski “wild” horse. The Przewalski was thought to be a truly wild, undomesticated species, but it appears to have been a kind of domesticated horse* that went feral, much like the mustangs of the wild west. Unlike the mustang, though, the Przewalski is genetically quite distinct, with 66 chromosomes. Domesticated horses have 64–yet, strangely, the two can still produce fertile 65-chromosome hybrids, so they are not fully reproductively isolated. When exactly the Przewalski obtained their extra chromosomes, I don’t know.

*This, of course, depends on the assumption that the Botai horses were “domesticated” in the first place.

Instead, modern, domesticated horses are believed to have descended from the wild Tarpan, though as far as I know, genetic studies proving this have not yet been done. The Tarpan is extinct, but survived up to the cusp of the twentieth century. (Personally, I’m not putting odds on any major tarpan herds in the past couple thousand years having had 100% wild DNA, but I wouldn’t classify them as “feral” just because of a few escaped domestics.)

Thus the horse was domesticated multiple times–especially if we include that other useful member of the genus Equus, the ass (or donkey, if you’d prefer). The hardworking little donkey does not enjoy its cousin’s glamorous reputation, and Wikipedia reports,

Throughout the world, working donkeys are associated with the very poor, with those living at or below subsistence level.[45] Few receive adequate food, and in general donkeys throughout the Third World are under-nourished and over-worked.[68]

The donkey is believed to have been domesticated from the wild African ass, probably in ancient Nubia (southern Egypt/northern Sudan). From there it spread down the Nile to the rest of Egypt, where it became an important work animal, and from there to Mesopotamia and the rest of the world.

Wild African asses still exist, but they are critically endangered.

Each species within the genus Equus has a different number of chromosomes–with the exception of the domesticated donkey and its wild African cousins, which share a count:

Przewalski’s horse: 66
Horse: 64
Donkey: 62
Onager (Persian wild ass): 56
Kulan: 54/55 (??)
Kiang (Asian wild ass): 51/52 (??)
Grevy’s zebra: 46
Common (plains) zebra: 44
Mountain zebra: 32

As far as I know, they can all cross and produce mule-like hybrids, but these hybrids are infertile–with the exceptions of domestic donkey/wild African donkey crosses and the horse/Przewalski crosses mentioned above. (Your only limit is African wild donkeys being endangered.)
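The arithmetic behind mule sterility is simple enough to sketch. Here’s a toy illustration (my own, not from any genetics library–the species and counts are just the ones listed above): a hybrid inherits half of each parent’s diploid chromosome count, and an odd total generally wrecks meiosis.

```python
# Toy illustration: a hybrid gets half of each parent's diploid
# chromosome count. An odd total means chromosomes can't pair evenly
# at meiosis--one big reason mules are sterile. (An even total doesn't
# guarantee fertility; mismatched chromosome structure can still
# break pairing, which is why zebra hybrids are sterile too.)

KARYOTYPES = {  # diploid counts from the list above
    "Przewalski's horse": 66,
    "horse": 64,
    "donkey": 62,
    "Grevy's zebra": 46,
}

def hybrid_count(a: str, b: str) -> int:
    """Each parent contributes half its diploid count."""
    return KARYOTYPES[a] // 2 + KARYOTYPES[b] // 2

for a, b in [("horse", "donkey"),
             ("horse", "Przewalski's horse"),
             ("donkey", "Grevy's zebra")]:
    n = hybrid_count(a, b)
    note = "odd: meiosis fails" if n % 2 else "even: pairing possible"
    print(f"{a} x {b}: {n} chromosomes ({note})")

# horse x donkey: 63 chromosomes (odd: meiosis fails)  <- the mule
# horse x Przewalski's horse: 65 (odd), yet these hybrids are
#   reportedly fertile--which shows how rough this heuristic is.
```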

I have no idea why equines have so much chromosomal diversity; dogs have been domesticated for much longer than horses, but are still interfertile with wolves and even coyotes (tbf, maybe horses could breed with tarpans.)

Interestingly, domestication causes a suite of changes to a species’ appearance that are not obviously useful. Recently-domesticated foxes exhibit pelt colors and patterns similar to those of domesticated dogs, not wild foxes. We humans have long hair, unlike our chimp-like ancestors. Horses also have long manes, unlike wild zebras, asses, and tarpans. Horses have evolved, then, to look rather like humans.

Also like humans, horses have different male and female histories. Male horses were quite difficult to tame, and so early domesticators obtained only a few of them. Females, by contrast, were relatively easy to gentle, so breeders often restocked their herds with wild females. As a result, domesticated horses show far more variation in their mitochondrial DNA than in their Y chromosomes. The stocking of herds from different groups of wild horses most likely gave rise to 17 major genetic clusters:

From these sequences, a phylogenetic network was constructed that showed that most of the 93 different mitochondrial (mt)DNA types grouped into 17 distinct phylogenetic clusters. Several of the clusters correspond to breeds and/or geographic areas, notably cluster A2, which is specific to Przewalski’s horses, cluster C1, which is distinctive for northern European ponies, and cluster D1, which is well represented in Iberian and northwest African breeds. A consideration of the horse mtDNA mutation rate together with the archaeological timeframe for domestication requires at least 77 successfully breeding mares recruited from the wild. The extensive genetic diversity of these 77 ancestral mares leads us to conclude that several distinct horse populations were involved in the domestication of the horse.

The wild mustangs of North America might have even more interesting DNA:

The researchers said four family groups (13.8%) with 31 animals fell into haplogroup B, with distinct differences to the two haplogroup L lineages identified.

The closest mitochondrial DNA sequence was found in a Thoroughbred racing horse from China, but its sequence was still distinct in several areas.

The testing also revealed links to the mitochondrial DNA of an Italian horse of unspecific breed, the Yunnan horse from China, and the Yakutia horse from central Siberia, Russia.

Haplogroup B seems to be most frequent in North America (23.1%), with lower frequencies in South America (12.68%) and the Middle East (10.94%) and Europe (9.38%).

“Although the frequency of this lineage is low (1.7%) in the Asian sample of 587 horses, this lineage was found in the Bronze Age horses from China and South Siberia.”

Westhunter suggests that this haplogroup could have originated from some surviving remnant of American wild horses that hadn’t actually been completely killed off before the Spanish mustangs arrived and bred with them. I’d offer a more prosaic possibility: the Russians may have brought these lineages while colonizing Alaska and the coast down to northern California. Either way, it’s an intriguing finding.

The horse has been man’s companion for thousands of years and helped him conquer most of the Earth, but the recent invention of internal and external combustion engines (eg, the Iron Horse) has put most horses out to pasture. In effect, they have become obsolete. Modern horses have much easier lives than their hard-working plow and wagon-pulling ancestors, but their populations have shrunk enormously. They’re not going to go extinct, because rich people still like them (and they are still useful in parts of the world where cars cannot easily go,) but they may suffer some of the problems of inbreeding found in genetically narrow dog breeds.

Maybe someday, significant herds of wild horses will roam free again.

 

Back Row America, Back Row Bronze Age Europe

Back Row America:

This was a really interesting article–a book excerpt–about an upper-class Wall Street guy who, through his daily walks, begins talking to and photographing the people he basically hadn’t noticed before.

Over the next half hour, she told me her life story. She told me how her mother’s pimp had put her on the streets at twelve. How she had had her first child at thirteen. How she was addicted to heroin. I ended by asking her the question I asked everyone I photographed: How do you want to be described? She replied without a pause, “As who I am. A prostitute, a mother of six, and a child of God.”

I spent the next three years following Takeesha and the street family she was a member of—roughly fifty men and women who lived under bridges, in abandoned buildings, in sheds, in pits, in broken-down trucks, on rooftops, or, if they scored enough money, in per-hour motels. What she showed me prompted me to travel to other neighborhoods in cities across America, from Buffalo to New Haven to Cleveland to Selma to El Paso to Amarillo. In each of these places, people have a sense of being left behind and forgotten—or, worse, mocked and stigmatized by the rest of the world as it moves on and up with the GDP.

In many cases, these neighborhoods have literally been left behind by people like me. …

We had compassion for those who got left behind, but thought that our job was to provide them an opportunity (no matter how small) to get where we were. It didn’t occur to us that what we valued wasn’t what everyone else wanted. They were the people who couldn’t or didn’t want to leave their town or their family to get an education at an elite college, the people who cared more about their faith than about science. If we were the front row, they were the back row.

Had I asked people in my hometown why they were still there, I would have received the answer I heard in neighborhoods from Cairo to Amarillo to rural Ohio. They would have looked at me like I was crazy and said, “Because it is my home.”

The book it’s from is Dignity: Seeking Respect in Back Row America.

This article–and the larger book, undoubtedly–touches on a lot of themes I’ve been pondering myself. Unfortunately, the article doesn’t have answers. I’d like answers.

Dignity, as I’ve said before, is one of those principles I am drawn to. I am not sure what can be done for people. Maybe nothing. But I can still treat others with respect, and maybe if we respected each other a little more, we could get our heads out of our collective rear ends and make something better of this country.

Related: Crossing Borders to Afford Insulin:

All told, I bought two cartons of Lantus (5 pens each carton) for $52 each, which is about a year supply for me. I also bought six single Kwikpens of Humalog for $13 dollars each, which is about a six month supply.

My total pharmacy bill that day was $182, and I left Mexico with a year’s supply of one insulin and a 6 month’s supply of another. That same amount of insulin – the exact same, in identical cartridges and boxes with the same graphics and colors and the same words written on them (in Spanish for the Mexican insulin) – would cost me over $3,000 with my American health coverage. Even after adding in a tank and a half of gas, I saved thousands of dollars by buying my life-saving medications in Mexico, instead of the US.

Also related: Mass grave of an extended family probably murdered by invading Corded Ware People–I mean, peacefully interred by migrating pots:

We sequenced the genomes of 15 skeletons from a 5,000-y-old mass grave in Poland associated with the Globular Amphora culture. All individuals had been brutally killed by blows to the head, but buried with great care. Genome-wide analyses demonstrate that this was a large extended family and that the people who buried them knew them well: mothers are buried with their children, and siblings next to each other. From a population genetic viewpoint, the individuals are clearly distinct from neighboring Corded Ware groups because of their lack of steppe-related ancestry. Although the reason for the massacre is unknown, it is possible that it was connected with the expansion of Corded Ware groups, which may have resulted in violent conflict.

Book Club: Legal Systems Very Different from Ours

Our next Book Club pick is David Friedman’s Legal Systems Very Different from Ours, a topic I’ve found intriguing for at least fifteen years.

From the Amazon blurb:

This book looks at thirteen different legal systems, ranging from Imperial China to modern Amish: how they worked, what problems they faced, how they dealt with them. Some chapters deal with a single legal system, others with topics relevant to several, such as problems with law based on divine revelation or how systems work in which law enforcement is private and decentralized. The book’s underlying assumption is that all human societies face the same problems, deal with them in an interesting variety of different ways, are all the work of grown-ups, hence should all be taken seriously. It ends with a chapter on features of past legal systems that a modern system might want to borrow.

Read up, enjoy, and let’s discuss it in about a month.

Be careful what you rationalize

The first few thousand years of “medicine” were pretty bad. We did figure out a few things–an herb that’ll make you defecate faster here, something to staunch bleeding there–but overall, we were idiots. Doctors used to stick leeches on people to make them bleed, because they were convinced that “too much blood” was a problem. A primitive form of CPR invented in the 1700s involved blowing tobacco smoke up a drowned person’s rectum (it didn’t work.) And, of course, people have periodically taken it into their heads that consuming mercury is a good idea.

Did pre-modern (ie, before 1900 or so) doctors even benefit their patients, on net? Consider this account of ancient Egyptian medicine:

The ancient Egyptians had a remarkably well-organized medical system, complete with doctors who specialized in healing specific ailments. Nevertheless, the cures they prescribed weren’t always up to snuff. Lizard blood, dead mice, mud and moldy bread were all used as topical ointments and dressings, and women were sometimes dosed with horse saliva as a cure for an impaired libido.

Most disgusting of all, Egyptian physicians used human and animal excrement as a cure-all remedy for diseases and injuries. According to 1500 B.C.’s Ebers Papyrus, donkey, dog, gazelle and fly dung were all celebrated for their healing properties and their ability to ward off bad spirits. While these repugnant remedies may have occasionally led to tetanus and other infections, they probably weren’t entirely ineffective—research shows the microflora found in some types of animal dung contain antibiotic substances.

Bed rest, nurturing care, a bowl of hot soup–these are obviously beneficial. Dog feces, not so much.

Very ancient medicine and primitive shamanism seem inherently linked–early medicine can probably be divided into “secret knowledge” (ie, useful herbs); magical rites like painting a patient suffering from yellow fever with yellow paint and then washing it off to “wash away” the disease; and outright charlatanry.

It’s amazing that medicine persisted as a profession for centuries despite its terrible track record; you’d think disgruntled patients–or their relatives–would have put a quick and violent end to physicians bleeding patients.

The Christian Scientists got their start when a sickly young woman observed that she felt better when she didn’t go to the doctor than when she did, because this was the 1800s and medicine in those days did more harm than good. Yet the Christian Scientists were (and remain) an exception. Society at large never (to my knowledge) revolted against the “expertise” of supposed doctors.

Our desire for answers in the face of the unknown, our desire to do something when the optimal course is actually doing nothing and just hoping you don’t die, has overwhelmed medicine’s terrible track record for centuries.

Modern medicine is remarkably good. We can set bones, cure bubonic plague, prevent smallpox, and transplant hearts. There are still lots of things we can’t do–we can’t cure the common cold, for example–but modern medicine is, on the whole, positive. So this post is not about modern medicine.

But our tendency to trust too much, to trust the guy who offers answers and solutions over the guy who says “We don’t know, we can’t know, you’re probably best off doing nothing and hoping for the best,” is still with us. It’s probably a cognitive bias, and very hard to combat without purposefully setting out to do so.

So be careful what you rationalize.

Bio-thermodynamics and aging

I suspect nature is constrained by basic physics/chemistry/thermodynamics in a variety of interesting ways.

For example, chemical reactions (and thus biological processes) proceed more quickly when warm than when cold–this is pretty much a tautology, since temperature just is molecular motion–and thus it seems reasonable to expect certain biological processes to proceed more slowly in colder places/seasons than in warmer ones.
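For what it’s worth, the textbook form of this temperature dependence is the Arrhenius equation–standard chemistry, not anything specific to the aging literature–which makes reaction rates exponentially sensitive to absolute temperature:

```latex
% Arrhenius equation: rate constant k as a function of absolute
% temperature T, with activation energy E_a, gas constant R, and
% pre-exponential factor A.
k = A \, e^{-E_a / (RT)}
```

Lower T, smaller k: the chemistry of a cold-water animal simply runs slower.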

The Greenland Shark, which lives in very cold waters, lives to be about 300-500 years old. It’s no coincidence:

Temperature is a basic and essential property of any physical system, including living systems. Even modest variations in temperature can have profound effects on organisms, and it has long been thought that as metabolism increases at higher temperatures so should rates of ageing. Here, we review the literature on how temperature affects longevity, ageing and life history traits. From poikilotherms to homeotherms, there is a clear trend for lower temperature being associated with longer lifespans both in wild populations and in laboratory conditions. Many life-extending manipulations in rodents, such as caloric restriction, also decrease core body temperature.

This implies, in turn, that people (or animals) who overeat will tend to die younger, not necessarily due to any particular effects of having extra lumps of fat around, but because they burn hotter and thus faster.

Weighing more may trigger certain physiological changes–like menarche–to begin earlier due to the beneficial presence of fat (you don’t want to menstruate if you don’t have at least a little weight to spare), which may in turn speed up certain other parts of aging. But there could also be an additional effect on aging just from the presence of more cells in the body, each requiring additional metabolic processes to maintain.

Increased human height (due to better nutrition) over the past century could have a similar effect–shorter men do seem to live longer than taller men, eg: 

Observational study of 8,003 American men of Japanese ancestry from the Honolulu Heart Program/Honolulu-Asia Aging Study (HHP/HAAS), a genetically and culturally homogeneous cohort followed for over 40 years. …

A positive association was found between baseline height and all-cause mortality (RR = 1.007; 95% CI 1.003–1.011; P = 0.002) over the follow-up period. Adjustments for possible confounding variables reduced this association only slightly (RR = 1.006; 95% CI 1.002–1.010; P = 0.007). In addition, height was positively associated with all cancer mortality and mortality from cancer unrelated to smoking. A Cox regression model with time-dependent covariates showed that relative risk for baseline height on mortality increased as the population aged. Comparison of genotypes of a longevity-associated single nucleotide polymorphism in FOXO3 showed that the longevity allele was inversely associated with height. This finding was consistent with prior findings in model organisms of aging. Height was also positively associated with fasting blood insulin level, a risk factor for mortality. Regression analysis of fasting insulin level (mIU/L) on height (cm) adjusting for the age both data were collected yielded a regression coefficient of 0.26 (95% CI 0.10–0.42; P = 0.001).

The more of you there is, the more of you there is to age.

Interesting: lots of data on human height.

But there’s another possibility involving internal temperature–since internal body temperature requires calories to maintain, people who “run hot” (that is, are naturally warmer) may burn more calories and tend to be thinner than people who tend to run cool, who may burn fewer calories and thus tend to weigh more. Eg, low body temperature linked to obesity in new study: 

A new study has found that obese people (BMI >30) have lower body temperature during the day than normal weight people. The obese people had an average body temperature that was .63 degrees F cooler than normal weight people. The researchers calculated that this lower body temperature—which reflects a lower metabolic rate—would result in a body fat accumulation of approximately 160 grams per month, or four to five pounds a year, enough for the creeping weight gain many people experience.
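Those numbers roughly check out. Here’s the back-of-the-envelope version (my own arithmetic, not the study’s–I’m assuming the common figure of roughly 7.7 kcal per gram of adipose tissue; the researchers’ exact assumptions may differ):

```python
# Sanity-check the quoted claim: how big a daily energy surplus does
# 160 g of fat per month imply, and what does it add up to per year?
KCAL_PER_G_ADIPOSE = 7.7   # assumed energy density of body fat tissue
GRAMS_PER_POUND = 453.6

fat_per_month_g = 160      # figure quoted above
daily_surplus_kcal = fat_per_month_g * KCAL_PER_G_ADIPOSE / 30
annual_gain_lb = fat_per_month_g * 12 / GRAMS_PER_POUND

print(f"implied surplus: ~{daily_surplus_kcal:.0f} kcal/day")  # ~41
print(f"annual gain:     ~{annual_gain_lb:.1f} lb/year")       # ~4.2
```

So the claimed effect works out to only about forty calories a day–invisible meal to meal, but it compounds into the quoted four to five pounds a year.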

There’s an interesting discussion in the link on thyroid issues that cause people to run cold and thus gain weight, and how some people lose weight with thyroid treatment.

On the other hand, this study found the opposite, and maybe the whole thing just washes out to women and men having different internal temperatures?

Obese people are–according to one study–more likely to suffer mood or mental disorders, which could also be triggered by an underlying health problem. They also suffer faster functional decline in old age:

Women had a higher prevalence of reported functional decline than men at the upper range of BMI categories (31.4% vs 14.3% for BMI > or =40). Women (odds ratio (OR) = 2.61, 95% confidence interval (CI) = 1.39-4.95) and men (OR = 3.32, 95% CI = 1.29-8.46) exhibited increased risk for any functional decline at BMI of 35 or greater. Weight loss of 10 pounds and weight gain of 20 pounds were also risk factors for any functional decline.

Note that gaining weight and losing weight were also related to decline, probably due to health problems that caused the weight fluctuations in the first place.

Of course, general physical decline and mental decline go hand-in-hand. Whether obesity causes declining health, declining health causes obesity, or some underlying third factor, like biological aging, underlies both, I don’t know.

Anyway, I know this thought is a bit disjointed; it’s mostly just food for thought.

Some interesting things

Here’s a post a friend linked me to detailing the writer’s experience of discovering that the true background of two famous photos of the Vietnam War was very different from the background they had been taught:

As I read the article about the photos, I felt a sense of disbelief. I wasn’t quite sure what I was reading was correct. Surely, if this information about both photos were true, I’d have heard about it before this. After all, thirty years had passed.

I spent the next few hours searching the subject online and found quite a bit more information, but no serious or credible refutation of the stories I’d just learned. …

Then the strangest feeling came over me. I don’t even have a word for it, although I usually can come up with words for emotions.

This was a new feeling. The best description I can come up with is that it was a regret so intense it morphed seamlessly into guilt, as though I were responsible for something terrible, though I didn’t know exactly what. Regret and guilt, and also a rage that I’d been so stupid, that I’d let myself be duped or misled or kept ignorant about something so important, and that I’d remained ignorant all these years.

I sat in front of my computer and put my face down on the keyboard. I stayed in that position for a few minutes, energyless and drained. When I lifted my head I was surprised to find a few tears on my cheeks.

This is the emotion more flippantly referred to as “red-pilling”: the moment you realize that many of the things you had been taught to believe are, in fact, a lie.

It’s a very interesting article and I encourage you to read it.

Denisovan Jawbone in Tibet?

But now, an international team of scientists has announced the identification of another Denisovan fossil, from a site 1,500 miles away. It’s the right half of a jawbone, found some 10,700 feet above sea level in a cave in China’s Xiahe County, on the eastern edge of the Tibetan plateau. The Xiahe mandible, as it is now known, is not only the first Denisovan fossil to be found outside Denisova Cave, but also the very first Denisovan fossil to be found at all. It just took four decades for anyone to realize that.

So there may be a lot of old bits of bone or pieces of skulls lying unidentified in various old collections, especially in Asia, that we’ll be able to identify and piece together into various Homo species as we fill in more of the information about our human family tree.

To be honest, I am a little annoyed at how every article about the Denisovans expresses supposed confusion at how a group whose only fossils (until now) were found in a cave in Siberia could have left DNA in Tibetans and Melanesians. Obviously we just haven’t figured out the full ancestral ranges of these groups, which used to overlap. If Tibetans have high-altitude adaptations that look like they came from Denisovans, then obviously Denisovans lived in Tibet, and old Tibetan bones are a great place to look for Denisovans.

Indeed, the Xiahe mandible, which is 160,000 years old, is by far the earliest hominin fossil from the Tibetan plateau. Researchers used to think that Homo sapiens was unique in adapting to the Himalayas, but the Denisovans were successfully living on the roof of the world at least 120,000 years earlier. They must also have adapted to extremely thin air—after all, the mandible was found in a cave that’s some 8,000 feet higher above sea level than Denisova itself. “Their presence that high up is truly astonishing,” Douka says.

Fascinating article about the genetics of circadian rhythms and their relationship to health matters:

Perhaps the most ubiquitous and persistent environmental factor present throughout the evolution of modern species is the revolution of the earth about its own axis, creating a 24 h solar day. The consequent recurrent pattern of light and darkness endows a sense of time to organisms that live on this planet. The importance of this sense of time is accentuated by an internal clock that functions on a 24 h scale, inherent in the genetic framework of living organisms ranging from cyanobacteria (Johnson et al., 1996) to human mammals (Herzog and Tosini, 2001). An internal, molecular program drives circadian oscillations within the organism that manifest at the molecular, biochemical, physiological and behavioral levels (Mazzoccoli et al., 2012). Importantly, these oscillations allow anticipatory responses to changes in the environment and promote survival.

The term “circadian” comes from the Latin “circa,” meaning “around” and “diem,” meaning “day.” Circadian events recur during the subjective day or the lighted portion of the 24 h period and the subjective night or the dark part of the 24 h period allowing physiological synchrony with the light/dark environment (Reddy and O’Neill, 2010). The circadian clock has been demonstrated in almost all living organisms (Johnson et al., 1996; Herzog and Tosini, 2001; Mazzoccoli et al., 2012). The two defining characteristics of the circadian timing system are perseverance of oscillation under constant environmental conditions, which define these rhythms as self-sustained and endogenously generated, and the ability to adapt to environmental change, particularly to changes in the environmental light/dark cycle (Tischkau and Gillette, 2005).

The fascinating thing about sleep is that it exists at all; you would think that, given how vulnerable we are during sleep, animals that sleep would have long ago been eaten by animals that don’t, and the entire animal kingdom would have evolved to be constantly awake. And yet that hasn’t happened, suggesting that whatever sleep does, it is vitally important.

Modern Shamans: Financial Managers, Political Pundits, and others who help tame life’s uncertainties:

Like all magical specialists, [shamans] rely on spells and occult gizmos, but what makes shamans special is that they use trance. …

But these advantages are offset by the ordeals involved. In many societies, a wannabe initiate lacks credibility until he (and it’s usually a he) undergoes a near-death experience or a long bout of asceticism.

One aboriginal Australian shaman told ethnographers that, as a novice, he was killed by an older shaman who then replaced his organs with a new, magical set. …

Manifesting as mediums, channelers, witch doctors and the prophets of religious movements, shamans have appeared in most human societies, including nearly all documented hunter-gatherers. They characterized the religious lives of ancestral humans and are often said to be the “first profession.” …

… Like people everywhere, contemporary Westerners look to experts to achieve the impossible – to heal incurable illnesses, to forecast unknowable futures – and the experts, in turn, compete among themselves, performing to convince people of their special abilities.

So who are these modern shamans?

According to the cognitive scientist Samuel Johnson, financial money managers are likely candidates. Money managers fail to outperform the market – in fact, they even fail to systematically outperform each other – yet customers continue to pay them to divine future stock prices. …

Very interesting insight. It might explain why we stuck with doctors for so many centuries even when they were totally useless (or even negatively useful,) and why we trusted psychiatry throughout most of the 20th century, despite it being obvious bullshit.

There are a lot of unknowns out there, and we feel more comfortable trusting someone than just leaving it unknown–which introduces a lot of room for people to take advantage of us.

Finally, on a similar note, Is Dogma Eugenic? 

As he explains, belief in the supernatural can be attributed to the above heuristics. If belief in the supernatural became a problem, we would have to evolve to lose those heuristics.

Heuristics can be good. But, insofar as heuristics have us create harmful dogmas that can perpetuate themselves socially, we will have to replace them with pure logic, or at least lessen their impact. 

So, insofar as humans have the capacity to believe harmful dogmas, we will lose heuristics and become more logical. Heuristics can be “gamed;” logic cannot. In this manner, humans evolve to act less on instinct. The logical part of our brain becomes more pronounced.

You might have to RTWT to really get the argument, but it’s fun.

The Idiocy of Categoric Purity

I realized yesterday that the Left has an odd idea of “purity” that underlies many of their otherwise inexplicable, reality-rejecting claims.

The Left has, perhaps unconsciously, adopted the idea that if groups of things within a particular category exist, the groups must be totally independent and not overlap at all.

In the case of genetics, they think that for a genetic group to “exist” and be “real”, it must hail from a single, pure founding population with no subsequent mixing with other groups. We see this in a recent headline from the BBC: Is this the last of the Aryans?

Deep in India’s Ladakh region live the Aryans, perhaps the last generation of pure-blooded people and holders of possibly the only untampered gene pool left in the world.

These actually-called-Aryans might be fabulous, interesting people, but there is no way they are more pure and “untampered” than the rest of us. The entire sub-headline is nonsense, because all non-Africans (and some Africans) have Neanderthal DNA. They aren’t even pure Homo sapiens! Africans btw have their own archaic DNA from interbreeding with another, non-Neanderthal, human species. None of us, so far as I know, is a “pure” Homo sapiens.

Besides that, the proto-Indo-European people from whom these Aryans are descended were themselves a fusion of at least two peoples: European hunter-gatherers and a (so far as I know) untraced steppe-people from somewhere around Ukraine.

Further, even if the Aryans settled in their little villages 4,000 years ago and have had very little contact with the outside world over that time, it is highly unlikely that they have had none.

Meanwhile, out in the rest of the world, there are plenty of other highly isolated peoples: The Sentinelese of North Sentinel Island, for example, who will kill you if you try to set foot on their island. There was a pretty famous case just last year of someone earning himself a Darwin award by trying to convert the Sentinelese.

Now let’s look at that word “untampered.” What on earth does that mean? How do you tamper with a genome? Were the rest of us victims of evil alien experiments with CRISPR, tampering with our genomes?

The Chinese might figure out how to produce “tampered” genomes soon, but the rest of us, all of us in the entire world, have “untampered” genomes.

To be honest, I am slightly flabbergasted at this author’s notion that the rest of the people in the world are walking around with “tampered” genomes because our ancestors married some Anatolian farming people 4,000 years ago.

This strange idea pops up in liberal conversations about “race”, too. Take the recent AAPA Statement on Race and Racism:

Race does not provide an accurate representation of human biological variation. It was never accurate in the past, and it remains inaccurate when referencing contemporary human populations. Humans are not divided biologically into distinct continental types or racial genetic clusters.

But… no one said they did. At least, not since we stopped using Noah’s sons Shem, Ham, and Japheth going their separate ways after the Flood as our explanation for why races exist.

“See, human races aren’t descended from Shem, Ham, and Japheth, therefore races don’t exist!”

Two groups of things need not be completely separate and non-overlapping in order to exist. “Pillows” and “cloth” share many traits, for example; indeed, there are no traits in “cloth” that do not also exist in “pillows.”

Colin Wight on Twitter articulates this beautifully as the “Univariate Fallacy”:

Click the cube. Watch it turn.

This fallacy, when deployed, is commonly done using a single sentence buried within an article or essay couched around a broader narrative on the history of a particular type of oppression, such as sexism. Let me give you some recent examples of this fallacy in action.

You’ll remember this @nature piece arguing that sex is a spectrum and that perhaps there are more then 2 sexes, even though over 99.98% of humans can be classified at birth as being unambiguously male or female. … [Link to piece]

In this piece, they hold off deploying the Univariate Fallacy until the second-to-last sentence of a nearly 3500 word essay.

So if the law requires that a person is male or female, should that sex be assigned by anatomy, hormones, cells or chromosomes, and what should be done if they clash? “My feeling is that since there is not one biological parameter that takes over every other parameter, at the end of the day, gender identity seems to be the most reasonable parameter.”

Please read the whole thread. It is very insightful.

For example, if you look at the so-called “big five” personality traits, you find only 10% overlap between men and women. This is why it is usually pretty easy to tell if you are talking to a man or a woman. But if you look at only one trait at a time, there’s a lot more overlap. So the trick is to take a thing with multiple facets–as most things in the real world are–and claim that because it overlaps in any one of its facets with any other thing, it does not exist. It is not pure.
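The underlying statistics are easy to demonstrate with simulated data. Here’s a minimal sketch (my own toy simulation, not from Wight’s thread or the Big Five literature–the trait counts and effect sizes are illustrative): two groups that overlap heavily on every single trait, yet are mostly separable once all traits are considered together.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dims = 10_000, 40   # 40 traits per individual
shift = 0.5            # group difference on each trait, in SD units

group_a = rng.normal(0.0, 1.0, size=(n, dims))
group_b = rng.normal(shift, 1.0, size=(n, dims))

# Univariate view: classify on a single trait, threshold at the midpoint.
correct_a = group_a[:, 0] < shift / 2
correct_b = group_b[:, 0] >= shift / 2
one_trait_acc = np.concatenate([correct_a, correct_b]).mean()

# Multivariate view: sum all traits (the optimal linear rule here,
# since the groups differ by the same amount on every axis).
threshold = dims * shift / 2
correct_a = group_a.sum(axis=1) < threshold
correct_b = group_b.sum(axis=1) >= threshold
all_traits_acc = np.concatenate([correct_a, correct_b]).mean()

print(f"one trait:  {one_trait_acc:.1%}")   # ~60%: heavy overlap
print(f"all traits: {all_traits_acc:.1%}")  # ~94%: mostly separable
```

No single trait distinguishes the groups well, but jointly they do–which is exactly the sleight of hand the Univariate Fallacy exploits in reverse.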

Are our categories, in fact, random and arbitrary? Is there some reality beneath the categories we use to describe groups of people, like “male” and “female,” “young” and “old,” “black” and “white”? Could we just as easily have decided to use different categories, lumping humans by different criteria, like height or eye color or interest in Transformers, and found these equally valid? Should we refer to all short people as “the short race” and everyone who owns a fedora as “untouchables”?

Liberals believe that the categories came first, were decided for arbitrary or outright evil reasons, bear no relation to reality, and our belief in these categories then created them in the world because we enforced them. This is clearly articulated in the AAPA Statement on Race and Racism:

Instead, the Western concept of race must be understood as a classification system that emerged from, and in support of, European colonialism, oppression, and discrimination. It thus does not have its roots in biological reality, but in policies of discrimination. Because of that, over the last five centuries, race has become a social reality that structures societies and how we experience the world.

Race exists because evil Europeans made it, for their own evil benefit, out of the completely undifferentiated mass of humanity that existed before 1492.

This statement depends on the Univariate Fallacy discussed above–the claim that biological races don’t actually exist is 100% dependent on the UF–and a misunderstanding of the term “social construct,” a term which gets thrown around a lot despite no one understanding what it means.

I propose a different sequence of events (with thanks to Steven Pinker’s The Blank Slate for pointing it out): Reality exists, and in many cases, comes in lumps. Plants, for instance, have a lot in common with other plants. Animals have a lot in common with other animals. Humans create categories in order to talk about these lumps of things, and will keep using their categories so long as they are useful. If a category does not describe things well, it will be quickly replaced by a more effective category.

Meme theory suggests this directly–useful ideas spread faster than non-useful ideas. Useful categories get used. Useless categories get discarded. If I can’t talk about reality, then I need new words.

Sometimes, new information causes us to update our categories. For example, back before people figured out much about biology, fungi were a bit of a mystery. They clearly act like plants, but they aren’t green and they seem to grow parasitically out of dead things. Fungi were basically classed as “weird, creepy plants,” until we found out that they’re something else entirely. It turns out that fungi are actually more closely related to humans than to plants, but no one outside of molecular biology has any need for a category that is “humans and fungi, but not plants,” so no one uses such a category. There are, additionally, some weird plants, like venus flytraps, that show animal-like traits like predation and rapid movement, and some animals, like sponges, that look more like plants. You would not think a man crazy if he mistook a sponge for a plant, but no one looks at these examples, throws up their hands, and says, “Well, I guess plants and animals are arbitrary, socially-constructed categories and don’t exist.” No, we are all quite convinced that, despite a few cases that were confusing until modern science cleared them up, plants, animals, and fungi all actually exist–moving sponges from the “plant” category to the “animal” category didn’t discredit the entire notion of “plants” and “animals,” but instead improved our classification scheme.

Updating ideas and classification schemes slightly to make them work more efficiently as we get more information about obscure or edge cases in no way impacts the validity of the classification scheme. It just means that we’re human beings who aren’t always 100% right about everything the first time we behold it.

To summarize: reality exists, and it comes in lumps. We create words to describe it. If a word does not describe reality, it gets replaced by a superior word that does a better job of describing reality. Occasionally, we get lucky and find out more information about reality, and update our categories and words accordingly. Where a category exists and is commonly used, therefore, it most likely reflects an actual, underlying reality that existed before the word and caused the word to come into existence–not the other way around.

The belief that words create reality is magical thinking and belongs over in Harry Potter and animist religion, where you can cure Yellow Fever by painting someone yellow and then washing off the paint. It’s the same childish thinking as believing that monsters can’t see you if you have a blanket over your head (because you can’t see them) or that Bloody Mary will appear in the bathroom mirror if you turn out the lights and say her name three times while spinning around.

Of course, “white privilege” is basically the “evil eye” updated for the modern age, so it’s not too surprising to find people engaged in other forms of mystical thinking, like the belief that if you just don’t believe in race, it will cease to exist and no one will ever slaughter their neighbors again, just as no war ever happened before 1492 and Genghis Khan never went on a rampage that left 50 million people dead.

“Purity” as conceived of in these examples isn’t real. It doesn’t exist; it never existed, and outside of the simplistic explanations people thought up a few thousand years ago when they had much less information about the world, no one actually uses such definitions. The existence of different races doesn’t depend on Ham and Shem; rain doesn’t stop existing just because Zeus isn’t peeing through a sieve. In reality, men and women are different in a number of different ways that render categories like “man” and “woman” functional enough for 99.99% of your daily interactions. Racial categories like “black” and “white” reflect real-life differences between actual humans accurately enough that we find them useful terms, and the fact that humans have migrated back and forth across the planet, resulting in very interesting historical stories encoded in DNA, does not change this at all.

I’d like to wrap this up by returning to the BBC’s strange article on the Aryans:

I asked Dolma if she was excited over her daughter participating in the festival. She replied that not many outsiders came to Biama, and that it was fun to meet foreigners. But even more importantly, she couldn’t wait to see friends from neighbouring villages, brought together each year by the festival, as well as the chance to dress up, dance and celebrate. If the future generations continue to hold traditional ceremonies and celebrations and keep their vibrant culture alive, perhaps then, they won’t be the last of the Aryans.


One wonders what the author–or the BBC in general–thinks of efforts to keep the British pure or preserve British culture, untouched and unchanged through the millennia. Or is preserving one’s culture only for quaint foreigners whose entertaining exoticism would be ruined if they started acting and dressing just like us? What about those of us in America who think the British have a quaint and amusing culture, and would like it to stick around so we can still be entertained by it? And do the British themselves deserve any say in this, or are they eternally tainted with “impure,” “tampered” bloodlines due to the mixing of Bronze Age peoples with Anglo-Saxon invaders over a millennium and a half ago, and thus have no right to claim a culture or history of their own?

Goodness, what an idiotic way of looking at the world.

Pygmy-pocalypse?

I just want to highlight this graph I came across yesterday while trying to research archaic introgression in the Igbo:

[Graph: effective population size over time for African populations]

From Whole-genome sequence analysis of a Pan African set of samples reveals archaic gene flow from an extinct basal population of modern humans into sub-Saharan populations, by Lorente-Galdos et al.

There are three versions of this graph in the paper (check the supplemental materials for two of them), all showing about the same thing. It is supposed to be a graph of population size at different times in the past, and the most incredible thing is that for the past 100,000 years or so, the most numerically dominant populations in Africa were the Baka Pygmies, followed by various Bushmen (San) groups. The authors write:

To unravel the ancient demographic history of the African populations that are present in our data set, we used the Pairwise Sequentially Markovian Coalescent (PSMC) model that analyzes the dynamics of the effective population size over time [60]. We included at least one representative of each of the 15 African populations and two Eurasian samples in the analysis (Additional file 1: Figure S7.1) and considered both the classical mutation rate of 2.5 × 10⁻⁸ [61] and the 1.2 × 10⁻⁸ mutations per bp per generation reported in other analyses [62, 63]. The demographic trajectories of the sub-Saharan agriculturalist populations are very similar to each other; and only South African Bantu and Toubou individuals differ partly from the rest of sub-Saharan farmer samples; however, their considerable levels of admixture with other North African or hunter-gatherer populations (Fig. 2b) might explain this trend. Therefore, in order to ease visualization, we plotted a Yoruba individual (Yoruba_HGDP00936) and two Ju|‘hoansi individuals as representatives of the sub-Saharan agriculturalist and Khoisan populations, respectively (Fig. 3 and Additional file 1: Figure S7.2 considering a mutation rate of 1.2 × 10⁻⁸).

The authors note that the apparent large size of the pygmy groups could have been due to groups splitting and merging, and thus accumulating more DNA variety than they otherwise would. It’s all very speculative. But still, the Baka Pygmies could have been the absolutely dominant group over central Africa for tens of thousands of years.

What happened?

 

Book Club: Chinua Achebe’s Things Fall Apart, pt 3/3

 

Chinua Achebe, author of Things Fall Apart

Welcome back to our discussion of Chinua Achebe’s Things Fall Apart. Today I wanted to take a closer look at some of the aspects of traditional Igbo society mentioned in the book.

If you are a regular reader of this blog, you know by now that just as early modern humans (Homo sapiens) mated with Neanderthals and Denisovans somewhere over in Eurasia, some sapiens mated with archaic humans in Africa.

Unfortunately, the state of knowledge about African genomes, and especially archaic African genomes, is very primitive. Not only does ancient DNA not preserve terribly well in many parts of Africa, but the continent is also rather poor, so people there don’t send their spit off to 23andMe very often to get DNA tested. Thus, sadly, I do not have archaic DNA percentages for the Igbo.

“Approximate Bayesian computation estimates for the introgressing population across four African populations (Yoruba from Ibadan (YRI), Esan in Nigeria [ESN], Gambian in Western Divisions in the Gambia [GWD], and Mende in Sierra Leone [MSL]),” from Recovering Signals of Ghost Archaic Introgression in African Populations
However, we do have data for their neighbors, the Yoruba, Esan, Mende, and Gambians.

Keep in mind that so far, Eurasians measure about 1-4% Neanderthal and Melanesians about 6% Denisovan, so 10% Ghost in west Africans is a pretty big deal (if you’re into archaic DNA.) The authors of the study estimate that the admixture occurred about 50,000 years ago, which is coincidentally about the same time as the admixture in non-Africans–suggesting that whatever triggered the Out of Africa migration may have also simultaneously triggered an Into Africa migration. 

If you’re not familiar with some of these groups (I only know a little about the Yoruba,) the Esan, Mende, Gambians, and Yoruba are all speakers of languages from the Niger-Congo family (of which the Bantu languages are a subset.) The Niger-Congo family is one of the world’s largest, with 1,540 languages and 700 million speakers. It spread within the past 3,000 years from a homeland somewhere in west Africa (possibly Nigeria) to dominate sub-Saharan Africa. As far as I can tell, the Igbo are quite similar genetically to the Yoruba, and the admixture event happened tens of thousands of years before these groups spread and split, so there’s a good chance that the Igbo have similarly high levels of ghost-population admixture.

Interestingly, a population related to the Bushmen and Pygmies used to dominate central and southern Africa, before the Bantu expansion. While the Bantu expansion and the admixture event are separated by a good 40 or 50 thousand years, this still suggests the possibility of human hybrid vigor.

Edit: A new paper just came out! Whole-genome sequence analysis of a Pan African set of samples reveals archaic gene flow from an extinct basal population of modern humans into sub-Saharan populations:

Here, we examine 15 African populations covering all major continental linguistic groups, ecosystems, and lifestyles within Africa through analysis of whole-genome sequence data of 21 individuals sequenced at deep coverage. We observe a remarkable correlation among genetic diversity and geographic distance, with the hunter-gatherer groups being more genetically differentiated and having larger effective population sizes throughout most modern-human history. Admixture signals are found between neighbor populations from both hunter-gatherer and agriculturalists groups, whereas North African individuals are closely related to Eurasian populations. Regarding archaic gene flow, we test six complex demographic models that consider recent admixture as well as archaic introgression. We identify the fingerprint of an archaic introgression event in the sub-Saharan populations included in the models (~ 4.0% in Khoisan, ~ 4.3% in Mbuti Pygmies, and ~ 5.8% in Mandenka) from an early divergent and currently extinct ghost modern human lineage.

So is the ghost population that shows up in the Pygmies the same ghost population that shows up in the Mende? Looks like it.

There’s a lot of interesting stuff in this paper, but I’d just like to highlight this one graph:

[Graph: effective population size over time, from Lorente-Galdos et al.]

I don’t really understand how they compute these things, much less whether this is accurate (though their present estimate for the size of the Han looks pretty good,) but assuming it is, we can say a few things: One, before 100,000 years ago, all of the groups–except the Laal of Chad–tracked closely together in size because they were one group. Most of the groups then got smaller simply because they split up. But there seems to have been some kind of really big population bottleneck a bit over a million years ago.

The other really interesting thing is the absolute Pygmy dominance of the 10,000–100,000-year range. The authors note:

It is noteworthy that we observed by PSMC a sudden Ne increase in Baka Pygmy around 30 kya. A similar increase was observed in another study that analyzed several Baka and Biaka samples [25]. In addition, this individual presents the highest average genome-wide heterozygosity compared to the rest of samples (Fig. 1b). Nevertheless, such abrupt Ne increase can be attributed to either a population expansion or episodes of separation and admixture [60]. Further analyses at population level are needed to distinguish between these two scenarios.

Until we get more information on the possible Pygmy-cide, let’s get back to Things Fall Apart, for the Igbo of 1890 aren’t their ancestors of 50,000 BC nor the conquerors of central Africa. Here’s an interesting page with information about some of the rituals Achebe wrote about, like the Feast of the New Yams and the Egwugwu ceremony:

 The egwugwu ceremony takes place in order to dispute the guilty side of a crime taken place, similar to our court trials… Nine egwugwu represented a village of the clan, their leader known as Evil Forest; exit the huts with their masks on.

Short page; fast read.

The egwugwu ceremony I found particularly interesting. Of course everyone knows the guys in masks are just guys in masks (well, I assume everyone knows that. It seems obvious,) yet in taking on the masks, they adopt a kind of veil of anonymity. In real life, they are people, with all of the biases of ordinary people; under the mask, they take on the identity of a spirit, free from the biases of ordinary people. It is similar to the official garb worn by judges in other countries, which often look quite silly (wigs on English barristers, for example,) but effectively demarcate a line between normal life and official pronouncements. By putting on the costume of the office, the judge becomes more than an individual.

I have long been fascinated by masks, masquerades, and the power of anonymity. Many famous writers, from Benjamin Franklin to Samuel Clemens, published under pseudonyms. The mask implies falseness–on Halloween, we dress up as things that we are not–but it also allows honesty by freeing us from the threat of retribution.

It is interesting that a small, tightly-knit society where everyone knows everyone and social relations are of paramount importance, like the Igbo, developed a norm of anonymizing judges in order to remove judicial decisions from normal social relations and obligations (as much as possible, anyway). Since most Igbo villages did not have kings or other aristocrats to dictate laws, rule was conducted by notable community members who had effectively purchased or earned noble titles. These nobles got to wear the masks and costumes of the egwugwu.

Ok, so it’s getting late and I need to wrap this up. This moment comes in every post.

I know I haven’t said much about the book itself–the plot, narrative, pacing, structure, writing style, etc. To be honest, that’s because I didn’t enjoy it very much. It was interesting for its content, along with a sense of “I’ve been trying to tell people this, and I could have saved myself a lot of time by just pointing them to the book.” And if this is a book taught in schools (we didn’t read it in my high school, but I have heard that many people did,) then why aren’t people more aware of its contents?

What was tribal life like before the Europeans got there? Well, women got beaten a lot. Children were murdered to avenge tribal conflicts. Infant mortality was high. In other words, many things were pretty unpleasant.

And yet, interestingly, much of what we think was unpleasant about them was, in its own way, keeping the peace. As Will (Evolving Moloch) quotes from The Social Structure of Right and Wrong on Twitter:

“Much of the conduct described by anthropologists as conflict management, social control, or even law in tribal and other traditional societies is regarded as crime in modern [nation state] societies.” This is especially clear in the case of violent modes of redress such as assassination, feuding, fighting, maiming, and beating, but it also applies to the confiscation and destruction of property and to other forms of deprivation and humiliation. Such actions typically express a grievance by one person or group against another.

See, for example, when the village burned down Okonkwo’s house for accidentally killing a villager, when they burned down the church for “killing” a deity, or when they took a little girl and killed a little boy in revenge for someone in another village killing one of their women. To the villagers, these were all legal punishments, and the logic of burning down a person’s house if they have killed someone is rather similar to the logic of charging someone a fine for committing manslaughter. Even though Okonkwo didn’t mean to kill anyone, he should have been more careful with his gun, which he knew was dangerous and could kill someone.

Unlike penalties imposed by the state, however, private executions of this kind often result in revenge or even a feud. Moreover, the person killed in retaliation may not be himself or herself a killer, for in these societies violent conflicts between nonkin are virtually always handled in a framework of collective responsibility–or more precisely, collective liability–whereby all members of a social category (such as a family or lineage) are held accountable for the conduct of their fellows.

And, of course, penalties so meted out can be incredibly violent, arbitrary, and selfish, but ignoring that, there’s clearly a conflict when traditional, tribal ways of dealing with problems clash with state-based ways of dealing with problems. Even if everyone eventually agrees that the state-based system is more effective (and I don’t expect everyone to agree) the transition is liable to be difficult for some people, especially if, as in the book, they are punished by the state for enforcing punishments prescribed by their own traditional laws. The state is effectively punishing them for punishing law-breakers, creating what must seem to them a state of anarcho-tyranny.

As for polygamy, Achebe seems to gloss over some of its downsides. From Christakis’s Blueprint: The Origins of a Good Society (h/t Rob Henderson), we have:

Co-wife conflict is ubiquitous in polygynous households… Because the Turkana often choose wives from different families in order to broaden their safety net, they typically do not practice sororal [sister-wives] polygyny… When co-wives are relatives, they can more easily share a household and cooperate… But while sororal polygyny is especially common in cultures in the Americas, general polygyny tends to be the usual pattern in Africa. An examination of ethnographic data from 69 nonsororal polygynous cultures fails to turn up a single society where co-wife relations could be described as harmonious. Detailed ethnographic studies highlight the stresses and fears present in polygynous families, including, for example, wives’ concern that other wives might try to poison their children so that their own children might inherit land or property.

Anyway, let’s wrap this up with a little article on human pacification:

There is a well-entrenched schism on the frequency (how often), intensity (deaths per 100,000/year), and evolutionary significance of warfare among hunter-gatherers compared with large-scale societies. To simplify, Rousseauians argue that warfare among prehistoric and contemporary hunter-gatherers was nearly absent and, if present, was a late cultural invention. In contrast, so-called Hobbesians argue that violence was relatively common but variable among hunter-gatherers. … Furthermore, Hobbesians with empirical data have already established that the frequency and intensity of hunter-gatherer warfare is greater compared with large-scale societies even though horticultural societies engage in warfare more intensively than hunter-gatherers. In the end I argue that although war is a primitive trait we may share with chimpanzees and/or our last common ancestor, the ability of hunter-gatherer bands to live peaceably with their neighbors, even though war may occur, is a derived trait that fundamentally distinguishes us socially and politically from chimpanzee societies. It is a point often lost in these debates.

I think we should read Legal Systems Very Different from Ours for our next book. Any other ideas?

Can Autism Be Cured via a Gluten-Free Diet?

I’d like to share a story from a friend and her son–let’s call them Heidi and Sven.

Sven was always a sickly child, delicate and underweight. (Heidi did not seem neglectful.) Once Sven started school, Heidi started receiving concerned notes from his teachers. He wasn’t paying attention in class. He wasn’t doing his work. They reported repetitious behavior like walking slowly around the room and tapping all of the books. Conversation didn’t quite work with Sven. He was friendly, but rarely responded when spoken to and often completely ignored people. He moved slowly.

Sven’s teachers suggested autism. Several doctors later, he’d been diagnosed.

Heidi began researching everything she could about autism. Thankfully she didn’t fall down any of the weirder rabbit holes, but when Sven started complaining that his stomach hurt, she decided to try a gluten-free diet.

And it worked. Not only did Sven’s stomach stop hurting, but his school performance improved. He stopped laying his head down on his desk every afternoon. He started doing his work and responding to classmates.

Had a gluten free diet cured his autism?

Wait.

A gluten free diet cured his celiac disease (aka coeliac disease). Sven’s troublesome behavior was most likely caused by anemia, caused by long-term inflammation, caused by gluten intolerance.

When we are sick, our bodies sequester iron to prevent whatever pathogen is infecting us from using it. This is a sensible response to a short-term infection that we can quickly defeat, but in a long-term illness it produces what doctors call “anemia of chronic disease.” Since Sven was sick with undiagnosed celiac disease for years, his intestines were inflamed for years–and his body responded by sequestering iron for years, leaving him continually tired, spacey, and unable to concentrate in school.

The removal of gluten from his diet allowed his intestines to heal and his body to finally start releasing iron.

Whether or not Sven had (or has) autism is a matter of debate. What is autism? It’s generally defined by a list of symptoms/behaviors, not a list of causes. So very different causes could nonetheless trigger similar symptoms in different people.

Saying that Sven’s autism was “cured” by this diet is somewhat misleading, since gluten-free diets clearly won’t work for the majority of people with autism–those folks don’t have celiac disease. But by the same token, Sven was diagnosed with autism and his diet certainly did work for him, just as it might for other people with similar symptoms. We just don’t have the ability right now to easily distinguish between the many potential causes for the symptoms lumped together under “autism,” so parents are left trying to figure out what might work for their kid.

Interestingly, the overlap between “autism” and feeding problems/gastrointestinal disorders is huge. Now, when I say things like this, I often notice that people are confused about the scale of the problem. Nearly every parent swears, at some point, that their child is terribly picky. That is normal pickiness, which goes away with time and isn’t a real problem. The problems autistic children face are not normal.

Parent of normal child: “My kid is so picky! She won’t eat peas!”

Parent of autistic child: “My kid only eats peas.”

See the difference?

Let’s cut to Wikipedia, which has a nice summary:

Gastrointestinal problems are one of the most commonly associated medical disorders in people with autism.[80] These are linked to greater social impairment, irritability, behavior and sleep problems, language impairments and mood changes, so the theory that they are an overlap syndrome has been postulated.[80][81] Studies indicate that gastrointestinal inflammation, immunoglobulin E-mediated or cell-mediated food allergies, gluten-related disorders (celiac disease, wheat allergy, non-celiac gluten sensitivity), visceral hypersensitivity, dysautonomia and gastroesophageal reflux are the mechanisms that possibly link both.[81]

A 2016 review concludes that enteric nervous system abnormalities might play a role in several neurological disorders, including autism. Neural connections and the immune system are a pathway that may allow diseases originated in the intestine to spread to the brain.[82] A 2018 review suggests that the frequent association of gastrointestinal disorders and autism is due to abnormalities of the gut–brain axis.[80]

The “leaky gut” hypothesis is popular among parents of children with autism. It is based on the idea that defects in the intestinal barrier produce an excessive increase of the intestinal permeability, allowing substances present in the intestine, including bacteria, environmental toxins and food antigens, to pass into the blood. The data supporting this theory are limited and contradictory, since both increased intestinal permeability and normal permeability have been documented in people with autism. Studies with mice provide some support to this theory and suggest the importance of intestinal flora, demonstrating that the normalization of the intestinal barrier was associated with an improvement in some of the ASD-like behaviours.[82] Studies on subgroups of people with ASD showed the presence of high plasma levels of zonulin, a protein that regulates permeability opening the “pores” of the intestinal wall, as well as intestinal dysbiosis (reduced levels of Bifidobacteria and increased abundance of Akkermansia muciniphila, Escherichia coli, Clostridia and Candida fungi) that promotes the production of proinflammatory cytokines, all of which produces excessive intestinal permeability.[83] This allows passage of bacterial endotoxins from the gut into the bloodstream, stimulating liver cells to secrete tumor necrosis factor alpha (TNFα), which modulates blood–brain barrier permeability. Studies on ASD people showed that TNFα cascades produce proinflammatory cytokines, leading to peripheral inflammation and activation of microglia in the brain, which indicates neuroinflammation.[83] In addition, neuroactive opioid peptides from digested foods have been shown to leak into the bloodstream and permeate the blood–brain barrier, influencing neural cells and causing autistic symptoms.[83] (See Endogenous opiate precursor theory)

Here is an interesting case report of psychosis caused by gluten sensitivity:

 In May 2012, after a febrile episode, she became increasingly irritable and reported daily headache and concentration difficulties. One month after, her symptoms worsened presenting with severe headache, sleep problems, and behavior alterations, with several unmotivated crying spells and apathy. Her school performance deteriorated… The patient was referred to a local neuropsychiatric outpatient clinic, where a conversion somatic disorder was diagnosed and a benzodiazepine treatment (i.e., bromazepam) was started. In June 2012, during the final school examinations, psychiatric symptoms, occurring sporadically in the previous two months, worsened. Indeed, she began to have complex hallucinations. The types of these hallucinations varied and were reported as indistinguishable from reality. The hallucinations involved vivid scenes either with family members (she heard her sister and her boyfriend having bad discussions) or without (she saw people coming off the television to follow and scare her)… She also presented weight loss (about 5% of her weight) and gastrointestinal symptoms such as abdominal distension and severe constipation.

So she’s hospitalized and they do a bunch of tests. Eventually she’s put on steroids, which helps a little.

Her mother recalled that she did not return a “normal girl”. In September 2012, shortly after eating pasta, she presented crying spells, relevant confusion, ataxia, severe anxiety and paranoid delirium. Then she was again referred to the psychiatric unit. A relapse of autoimmune encephalitis was suspected and treatment with endovenous steroid and immunoglobulins was started. During the following months, several hospitalizations were done, for recurrence of psychotic symptoms.

Again, more testing.

In September 2013, she presented with severe abdominal pain, associated with asthenia, slowed speech, depression, distorted and paranoid thinking and suicidal ideation up to a state of pre-coma. The clinical suspicion was moving towards a fluctuating psychotic disorder. Treatment with a second-generation anti-psychotic (i.e., olanzapine) was started, but psychotic symptoms persisted. In November 2013, due to gastro-intestinal symptoms and further weight loss (about 15% of her weight in the last year), a nutritionist was consulted, and a gluten-free diet (GFD) was recommended for symptomatic treatment of the intestinal complaints; unexpectedly, within a week of gluten-free diet, the symptoms (both gastro-intestinal and psychiatric) dramatically improved. Despite her efforts, she occasionally experienced inadvertent gluten exposures, which triggered the recurrence of her psychotic symptoms within about four hours. Symptoms took two to three days to subside again.

Note: she has non-celiac gluten sensitivity.

One month after [beginning the gluten free diet] AGA IgG and calprotectin resulted negative, as well as the EEG, and ferritin levels improved.

Note: AGA IgG (anti-gliadin antibodies) and calprotectin are markers of immune reaction to gluten and of intestinal inflammation, and ferritin reflects iron stores–in other words, she no longer has inflammation and her iron levels are returning to normal.

She returned to the same neuro-psychiatric specialists that now reported a “normal behavior” and progressively stopped the olanzapine therapy without any problem. Her mother finally recalled that she was returned a “normal girl”. Nine months after definitely starting the GFD, she is still symptoms-free.

This case is absolutely crazy. That poor girl. Here she was in constant pain, constantly constipated, losing weight (at an age when children should be growing), and the idiot adults thought she had a psychiatric problem.

This is not the only case I have heard of in which a gastrointestinal disorder presented as psychosis.

Speaking of stomach pain, did you know that Kurt Cobain suffered frequent stomach pain so severe that it made him vomit and want to commit suicide, and that he started self-medicating with heroin just to stop the pain? And then he died.

Back to autism and gastrointestinal issues other than gluten, here is a fascinating new study on fecal transplants (h/t WrathofGnon):

Many studies have reported abnormal gut microbiota in individuals with Autism Spectrum Disorders (ASD), suggesting a link between gut microbiome and autism-like behaviors. Modifying the gut microbiome is a potential route to improve gastrointestinal (GI) and behavioral symptoms in children with ASD, and fecal microbiota transplant could transform the dysbiotic gut microbiome toward a healthy one by delivering a large number of commensal microbes from a healthy donor. We previously performed an open-label trial of Microbiota Transfer Therapy (MTT) that combined antibiotics, a bowel cleanse, a stomach-acid suppressant, and fecal microbiota transplant, and observed significant improvements in GI symptoms, autism-related symptoms, and gut microbiota. Here, we report on a follow-up with the same 18 participants two years after treatment was completed. Notably, most improvements in GI symptoms were maintained, and autism-related symptoms improved even more after the end of treatment.

Fecal transplant is exactly what it sounds like. The doctors clear out a person’s intestines as best they can, then put in new feces, from a donor, via a tube (up the butt or through the stomach; either direction works).

Unfortunately, it wasn’t a double-blind study, but the authors are hopeful that they can get funding for a double-blind, placebo-controlled study soon.

I’d like to quote a little more from this study:

Two years after the MTT was completed, we invited the 18 original subjects in our treatment group to participate in a follow-up study … Two years after treatment, most participants reported GI symptoms remaining improved compared to baseline … The improvement was on average 58% reduction in Gastrointestinal Symptom Rating Scale (GSRS) and 26% reduction in % days of abnormal stools… The improvement in GI symptoms was observed for all sub-categories of GSRS (abdominal pain, indigestion, diarrhea, and constipation, Supplementary Fig. S2a) as well as for all sub-categories of DSR (no stool, hard stool, and soft/liquid stool, Supplementary Fig. S2b), although the degree of improvement on indigestion symptom (a sub-category of GSRS) was reduced after 2 years compared with weeks 10 and 18. This achievement is notable, because all 18 participants reported that they had had chronic GI problems (chronic constipation and/or diarrhea) since infancy, without any period of normal GI health.

Note that these children were chosen because they had both autism and lifelong gastrointestinal problems. This treatment may do nothing at all for people who don’t have gastrointestinal problems.

The families generally reported that ASD-related symptoms had slowly, steadily improved since week 18 of the Phase 1 trial… Based on the Childhood Autism Rating Scale (CARS) rated by a professional evaluator, the severity of ASD at the two-year follow-up was 47% lower than baseline (Fig. 1b), compared to 23% lower at the end of week 10. At the beginning of the open-label trial, 83% of participants rated in the severe ASD diagnosis per the CARS (Fig. 2a). At the two-year follow-up, only 17% were rated as severe, 39% were in the mild to moderate range, and 44% of participants were below the ASD diagnostic cut-off scores (Fig. 2a). … The Vineland Adaptive Behavior Scale (VABS) equivalent age continued to improve (Fig. 1f), although not as quickly as during the treatment, resulting in an increase of 2.5 years over 2 years, which is much faster than typical for the ASD population, whose developmental age was only 49% of their physical age at the start of this study.

Important point: their behavior matured faster than it normally does in autistic children.
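To put numbers on that, here is a quick back-of-the-envelope check on the quoted figures (my own arithmetic, not the study’s, and it assumes the 49% figure can be read as a typical developmental-age gain rate):

\[
\text{typical rate} \approx 0.49\ \tfrac{\text{developmental years}}{\text{calendar year}}, \qquad
\text{observed rate} = \frac{2.5\ \text{years}}{2\ \text{years}} = 1.25, \qquad
\frac{1.25}{0.49} \approx 2.6
\]

That is, these children gained developmental age at roughly two and a half times the rate typical of the ASD population at baseline. Likewise, with 18 participants the CARS percentages map onto whole children: 83% works out to 15 of 18 rated severe at baseline, versus 3 severe, 7 mild-to-moderate, and 8 below the cutoff (3 + 7 + 8 = 18) at follow-up.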

This is a really interesting study, and I hope the authors can follow it up with a solid double-blind.

Of course, not all autists suffer from gastrointestinal complaints. Many eat and digest without difficulty. But the connection between physical complaints and mental disruption across a variety of conditions is fascinating. How many conditions that we currently believe are psychological might actually be caused by an untreated biological illness?