Review: Why Warriors Lie Down and Die

I read an interview once in which Napoleon Chagnon was asked what the Yanomamo thought of him–why did they think he had come to live with them?

“To learn how to be human,” he replied.

I didn’t read Trudgen’s Why Warriors Lie Down and Die because I have any hope of helping the Yolngu people (I don’t live in Australia, for starters), but in hopes of learning something universal. People like to play the blame game–it’s all whites’ fault, it’s all Aborigines’ fault–but there are broken communities and dying people everywhere, and understanding one community may give us insight into the others.

For example, US life expectancy has been declining:

A baby born in 2017 is expected to live to be 78.6 years old, which is down from 78.7 the year before, according to data from the Centers for Disease Control and Prevention’s National Center for Health Statistics.

The last three years represent the longest consecutive decline in the American lifespan at birth since the period between 1915 and 1918, which included World War I and the Spanish Flu pandemic, events that killed many millions worldwide.

Declining? In the developed world?

While there’s no single cause for the decline in the U.S., a report by the CDC highlights three factors contributing to the decline:

Drug overdoses…

Liver disease…

Suicide…

Not to mention heart disease, stroke, and all of the usual suspects.

Most causes of death can be divided roughly into the diseases of poverty (infection, malnutrition, parasites, etc.) and the diseases of abundance (heart attacks, strokes, type 2 diabetes, etc.). In developing countries, people tend to die of the former; in developed countries, the latter. There are a few exceptions–Costa Ricans enjoy good health because they have beaten back the diseases of poverty without becoming rich enough to die of obesity; Japan enjoys high standards of living, but has retained enough of its traditional eating habits to avoid developing too many modern diseases (so far).

The poor of many developed countries, however, often don’t get to enjoy much of the wealth, but still get hammered with the diseases of abundance. This is true in Australia and the US, and is the cause of much consternation–the average Aborigine or poor white would probably be healthier if they moved to a poor country like Costa Rica and ate like the locals.

When Trudgen first moved to Arnhem Land (the traditional Yolngu area) in the 70s, the situation wasn’t great, but it wasn’t terrible. People were going to school, graduating, and getting jobs. Communities had elders and hope for the future.

He left for eight years, then returned in the 80s to find a community that had been destroyed, with skyrocketing unemployment, hopelessness, drug use, disease, and death:

So my return to work with the Yolngu after eight years away was marked by the stark reality of what had become “normal” life in Arnhem Land. The people were dying at a horrific rate, more than five times the national average. And they were dying of diseases that they had not seen before, diseases that were considered to be those of affluent society: heart attacks, strokes, diabetes, cancer.

What went wrong?

Trudgen points out that the usual explanations offered for the abysmal state of Aboriginal communities in the 80s don’t make sense in light of their relatively good condition a mere decade before. People didn’t suddenly get dumb, lazy, or violent. Rather:

… I discovered that the communities in Arnhem Land had changed. The people’s freedom to direct their own lives had been almost completely eroded.

How do people end up out of control of their own lives? The author discusses several things affecting the Yolngu in particular.

The biggest of these is language–English is not their first language, and for some not even their fourth or fifth. (According to Wikipedia, even today, most Yolngu do not speak English as their first language.) Trudgen explains that since Yolngu is a small, obscure language, no English-to-Yolngu dictionaries existed, at least as of when he was writing, to help speakers look up the meaning of unfamiliar words like “tumor” or “mortgage.” (And this was before the widespread adoption of the internet.)

Imagine trying to conduct your affairs when every interaction with someone more powerful than yourself, from the bureaucrats at the DMV to the doctors at the hospital, was conducted in a language you didn’t speak very well, without the benefit of a dictionary or a translator. Trudgen writes that the Aborigines would actually like to learn how to protect their health, avoid dying from cancer and heart disease, etc., but the information on how to do these things doesn’t exist in their language. (He reminds us that it took a couple hundred years for the knowledge of things like “germs” to travel from scientists to regular people in our culture, and we all speak the same language.)

Both in Arnhem Land and without, people often overestimate how much other people know. For example, in a case Trudgen facilitated as a translator, a doctor thought his patient understood his explanation that due to diabetes, only 2% of his kidneys were functioning, but the patient didn’t actually understand enough English to make sense of the diagnosis–not to mention, as the author points out, that Yolngu culture doesn’t have the concept of “percents.” After translation, the man (who’d been seeing doctors for his kidneys for years without understanding what they were saying) finally understood and started treating his problems.

Those of us outside of Yolngu Land don’t have quite this level of difficulty interacting with medical professionals, but language still influences our lives in many ways. We have high and low class accents and dialects, not to mention an absurd quantity of verbal signaling and flexing, like sharing one’s pronouns in a presidential debate.

People everywhere also suffer from the condition of knowing a lot less than others assume they know. Every survey of common knowledge shocks us, yet again, with how dumb the common man is–and then we forget that we have ever seen such a survey and are equally shocked all over again when the next one comes out. (I think about this a lot while teaching.)

I think most people tend to remember information if they either use it regularly (like the code I use for formatting these posts) or if it’s valued/used in their culture (I know about the Kardashians despite never having tried to learn about them simply because people talk about them all of the time). If people talked about quantum physics the way we talk about superheroes, a lot more people would have posters of Niels Bohr.

For the Yolngu, there’s a problem that a lot of information simply isn’t available in their language. They were literally stone-age hunter-gatherers less than a century ago and are trying to catch up on a couple thousand years of learning. For us, the difficulty is more of access–I have a couple of relatives who are doctors, so if someone in my family gets sick, I call a relative first for advice before heading to the more expensive options. But if you don’t have any doctors among your friends/family, then you don’t have this option.

There are probably a lot of cases where people are stymied because they don’t know how to even begin to solve their problems.

Trudgen wants to solve this problem by having much more extensive language training for everyone in the area, white and Yolngu, and also by extending educational programs to the adults, so that the entire culture can be infused with knowledge.

After language difficulties, the other biggest impediment to living the good life, in Trudgen’s view, is… the welfare state:

Welfare and the dependency it creates is the worst form of violence. It has created a living hell.

Before the arrival of the white people, he notes, Aborigines survived perfectly fine on their own. The locals fished, hunted, gathered, and probably did some yam-based horticulture. They farmed pearls and traded them with Macassans from modern-day Indonesia for rice, and traded with tribes in the interior of Australia for other products. They even had their own legal system, similar to many of the others we have read about. Their lives were simple, yes. Their huts were not very tall, and they certainly didn’t have cellphones or penicillin, but they ran their own lives and those who made it out of infancy survived just fine.

Today, their lives are dominated at every turn by government institutions, welfare included. Children were once educated by their parents and the tribe at large; now they are educated by white teachers at government-run schools. People used to hunt and gather their own food; now they buy food at the supermarket with their welfare cheques. A man once built his own house; now such a house would be demolished because it doesn’t meet the building code requirements. Even Aborigine men trained as skilled housebuilders have been replaced by white builders, because the state decided that it needed to build houses faster.

Every program designed to “help” the Yolngu risks taking away yet one more piece of their sovereignty and ability to run their own lives. Trudgen complains of plans to build preschools in the area–to quote roughly, “they say the schools will be staffed with local Yolngu, but Yolngu don’t have the right credentials to qualify for such jobs. In a few years, Yolngu mothers will have even been pushed out of the role of caring for their own little children. What purpose will they have left in life?”

I just checked, and 88% of indigenous Australian children are now enrolled in preschool.

Or as the author puts it:

In fact, every attempt to solve the [malnutrition] problem with outside ideas has sent the malnutrition rates higher. Welfare-type programs simply send the people into greater depths of dependency, which increases feelings of confusion and hopelessness. Old people as well as children are not being cared for.

During 1999 the children received a free breakfast at the school and some people were talking about giving them free lunches as well. So now the government feeds the people’s children, as well as builds their houses and provides all levels of welfare for them. What is there left for them to do but go off and drink kava or gamble?

And ultimately:

… where the people have lost control, the men are dead or dying.

Incidentally, here is an article on loneliness in American suburbia.

Everything here is compounded by the habit of modern governments of making everything illegal, complicated, or subject to three permits, two environmental impact studies, and 17 licenses before you can break ground. As Joel Salatin put it, “Everything I Want to Do Is Illegal.”

Aborigines used to build their own houses, and whether they were good or not, they lived in them. (In fact, all groups of people are competent at building their own shelters.)

Then government came and declared that these houses were no good, they weren’t up to code, and the Aborigines had to be trained to build houses the white way. So the Aborigines learned, and began building “modern” houses.

Whether they were good at it or not, they had jobs and people had houses.

Then the government decided that the Aborigine builders weren’t building houses fast enough, so they brought in the army and threw up a bunch of pre-fab houses.

Now the taxpayers pay for whites to go to Yolngu land and build houses for the Aborigines. The Aborigines who used to build the houses are out of a job and on welfare, while the money for the houses goes into the pockets of outsiders.

Yes, the houses get built faster, but it’s hard to say that this is “better” than just letting the locals build their own houses.

The same process has happened in other industries. Even trash collection in Yolngu areas is now done by newcomers. At every turn, it seems, the Yolngu have been pushed out of jobs because they weren’t as fast or efficient, because they lacked the right certificates and credentials, or because they just didn’t speak enough English.

What happens to a dream deferred?

Does it dry up
like a raisin in the sun?
Or fester like a sore—
And then run?
Does it stink like rotten meat?
Or crust and sugar over—
like a syrupy sweet?

Maybe it just sags
like a heavy load.

Or does it explode?

Langston Hughes, Harlem

The story of the fishing industry was also an adventure in bad decision-making.

Originally, simplifying a bit for the sake of time, each fisherman (or perhaps a small group of fishermen) had his own boat, kept as many fish as he wanted, and sold the rest to a fishing organization run by the local mission. This was clear and straightforward: men owned their own catches and could do what they wanted with them. The area was a net exporter of fish and the locals made a decent living.

Then the government decided the mission system was no good, and turned everything over to “communal councils.” This was a great big mess.

Trudgen points out that the councils aren’t consistent with existing Yolngu laws/governing norms. They already had elders and governing bodies which the government didn’t recognize, so the government effectively created an illegitimate government and set it in conflict with the existing one, in the name of democracy, with shades of every failed attempt to impose democracy on a foreign country.

The councils didn’t work because 1. they didn’t have real authority, and 2. communism always fails.

In this case, the council decided to get a loan to “develop” the fishing industry, but before they could get a loan, the bank sent out an efficiency expert who looked at all of the little boats and declared that it would be much more efficient if they just used one big boat.

So the council bought a big boat and burned the little boats in the middle of the night so no one could use them anymore.

Now “ownership” of the boat was all confused. Men were not clearly working to catch their own fish on their own boat, they were part of a big crew on a big boat with a boss. The boss had to be someone with the correct licenses and whatnot to be allowed to run a big boat, and of course he had to pay his employees, which probably gets you into Australian tax law, liability law, insurance law, etc. In short, the boss wasn’t a local Yolngu because the Yolngu didn’t have the right credentials to run the boat, so the fishermen now had to work for an outsider, and it was no longer clear which part of their catch was “theirs” and which part was the boss’s.

The fishing industry quickly fell apart and the area became a net importer of fish.

These councils set up by the government to run local affairs failed repeatedly, much to the distress of the locals–but Trudgen notes that collectivism didn’t work for the USSR, either.

One constant impression I got from the book is that multiculturalism is hard. Even without language issues, people from different cultures have different ideas about what it means to be respectful, polite, honest, or timely. Different ideas about what causes disease, or whether Coca Cola ads are a trustworthy source of nutrition advice. (If they aren’t, then why does the government allow them to be on the air?) 

Which gets me to one of my recurrent themes, which Trudgen touches on: society lies. All the time. Those of us who know that society lies, and know all of the rules and meta-rules surrounding the lying, are reasonably well equipped to deal with it; those who don’t know the rules usually get screwed by them.

As Wesley Yang puts it in The Souls of Yellow Folk:

“Someone told me not long after I moved to New York that in order to succeed, you have to understand which rules you’re supposed to break. If you break the wrong rules, you’re finished. And so the easiest thing to do is follow all the rules. But then you consign yourself to a lower status. The real trick is understanding what rules are not meant for you.”

The idea of a kind of rule-governed rule-breaking–where the rule book was unwritten but passed along in an innate cultural sense–is perhaps the best explanation I have heard of how the Bamboo Ceiling functions in practice.

It’s not just Asians. Poor people, rural people, nerds, outsiders in general know only the explicitly taught rules, not the rules about breaking rules–and suffer for it.

And I think society lies in part because it serves the powerful. People lie about their age, their looks, their intelligence, how they got ahead and how they think you should apply for a job. Coca Cola lies about the healthiness of its product because it wants to sell more Coke, and the Aborigines believe it because they have very little experience with foods that taste good but aren’t good for you. Out in nature, in the traditional Aboriginal diet, sweet foods like fruits and berries were always good for you.

And these little lies are usually portrayed as “in your best interest,” but I’m far from convinced that they are.

People have been talking about UBI lately, at least the Yang Gang types. And I like Yang, at least as presidential candidates go. But we should be careful about whether more welfare is really the panacea we think it is.

The Yolngu have welfare already, and it doesn’t seem to be helping. At least, it doesn’t seem to make them happy. My conclusion from reading the book obviously isn’t that the Yolngu need more welfare or more programs. It’s that they need control over their own lives and communities. For that, they need something like what the Amish have–a system of internal organization sufficient to feed themselves, deal with the outside world, and get it to back off.

Of course, I don’t know if that would actually work for the Yolngu in particular, but the Amish seem a reasonable model for solving many of modernity’s current problems.

Short argument for vending machines full of experimental drugs

So I was thinking the other day about medication and Marilyn Manson’s “I don’t like the drugs but the drugs like me,” and it occurred to me that illegal drugs, generally speaking, are really good at what they do.

By contrast, take anti-depressants. Even the “really good” ones have abominable track records. Maybe a good drug works for 10 or 20% of the population–but you don’t know which. Depressed people just have to keep trying different pills until they find one that works better than placebo.

Meanwhile, you’ll never hear someone say “Oh, yeah, crack just doesn’t do anything for me.” Crack works. Heroin works. Sure, they’ll fuck you up, but they work.

Illegal drugs are tried and tested in the almost-free black market of capitalism, where people do whatever they want with them–grind them up, snort them, inject them, put them up their butts–and stop taking them whenever they stop working. As a result, illegal drugs are optimized for being highly addictive, yes, but also for working really well. And through trial and error, people have figured out how much they need, how best to take it, and how often for the optimal effects.

In other words, simply letting lots of people mess around with drugs results in really effective drugs.

The downside to this black-market refinement of drugs is that lots of people die in the process.

Most people don’t want to be killed by an experimental anti-depressant (is that ironic? It seems kind of ironic), so it makes sense to have safeguards in place to make sure that the latest incarnations won’t send you into cardiac arrest. But many medications are intended for people whose lives are otherwise over. People with Alzheimer’s, pancreatic cancer, glioblastoma, ALS, fatal familial insomnia, etc., are going to die. (Especially the ones with fatal familial insomnia. I mean, it’s got “fatal” in the name.) They have been handed death sentences and they know it, so their only possible hope is to speed up drug/treatment development as much as possible.

I am quite certain that something similar to what I am proposing already exists in some form. I am just proposing that we ramp it up: all patients with essentially incurable death sentences get access to whatever experimental (or non-experimental) drugs they want, with a few obvious caveats about price–but really, price tends to come down with increased demand, so just stock everything in vending machines and charge 75c a dose.

Of course, the end result might just be that Alzheimer’s meds come to closely resemble heroin, but hey, at least sick people will feel better as they die.

Since this is a short post, let me append a quick description of fatal familial insomnia: 

Fatal insomnia is a rare disorder that results in trouble sleeping.[2] The problems sleeping typically start out gradually and worsen over time.[3] Other symptoms may include speech problems, coordination problems, and dementia.[4][5] It results in death within a few months to a few years.[2]

It is a prion disease of the brain.[2] It is usually caused by a mutation to the protein PrPC.[2] It has two forms: fatal familial insomnia (FFI), which is autosomal dominant, and sporadic fatal insomnia (sFI), which is due to a noninherited mutation. Diagnosis is based on a sleep study, a PET scan, and genetic testing.[1]

Fatal insomnia has no known cure and involves progressively worsening insomnia, which leads to hallucinations, delirium, confusional states like that of dementia, and eventually death.[6] The average survival time from onset of symptoms is 18 months.[6] The first recorded case was an Italian man, who died in Venice in 1765.[7]

Terrible.

 

Can Autism be Cured via a Gluten Free Diet?

I’d like to share a story from a friend and her son–let’s call them Heidi and Sven.

Sven was always a sickly child, delicate and underweight. (Heidi did not seem neglectful.) Once Sven started school, Heidi started receiving concerned notes from his teachers. He wasn’t paying attention in class. He wasn’t doing his work. They reported repetitious behavior like walking slowly around the room and tapping all of the books. Conversation didn’t quite work with Sven. He was friendly, but rarely responded when spoken to and often completely ignored people. He moved slowly.

Sven’s teachers suggested autism. Several doctors later, he’d been diagnosed.

Heidi began researching everything she could about autism. Thankfully she didn’t fall down any of the weirder rabbit holes, but when Sven started complaining that his stomach hurt, she decided to try a gluten-free diet.

And it worked. Not only did Sven’s stomach stop hurting, but his school performance improved. He stopped laying his head down on his desk every afternoon. He started doing his work and responding to classmates.

Had a gluten free diet cured his autism?

Wait.

A gluten free diet cured his celiac disease (aka coeliac disease). Sven’s troublesome behavior was most likely caused by anemia, caused by long-term inflammation, caused by gluten intolerance.

When we are sick, our bodies sequester iron to prevent whatever pathogen is infecting us from using it. This is a sensible response to short-term pathogens that we can easily defeat, but in long-term sicknesses, leads to anemia. Since Sven was sick with undiagnosed celiac disease for years, his intestines were inflamed for years–and his body responded by sequestering iron for years, leaving him continually tired, spacey, and unable to concentrate in school.

The removal of gluten from his diet allowed his intestines to heal and his body to finally start releasing iron.

Whether or not Sven had (or has) autism is a matter of debate. What is autism? It’s generally defined by a list of symptoms/behaviors, not a list of causes. So very different causes could nonetheless trigger similar symptoms in different people.

Saying that Sven’s autism was “cured” by this diet is somewhat misleading, since gluten-free diets clearly won’t work for the majority of people with autism–those folks don’t have celiac disease. But by the same token, Sven was diagnosed with autism and his diet certainly did work for him, just as it might for other people with similar symptoms. We just don’t have the ability right now to easily distinguish between the many potential causes for the symptoms lumped together under “autism,” so parents are left trying to figure out what might work for their kid.

Interestingly, the overlap between “autism” and feeding problems/gastrointestinal disorders is huge. Now, when I say things like this, I often notice that people are confused about the scale of the problems. Nearly every parent swears, at some point, that their child is terribly picky. This is normal pickiness that goes away with time and isn’t a real problem. The problems autistic children face are not normal.

Parent of normal child: “My kid is so picky! She won’t eat peas!”

Parent of autistic child: “My kid only eats peas.”

See the difference?

Let’s cut to Wikipedia, which has a nice summary:

Gastrointestinal problems are one of the most commonly associated medical disorders in people with autism.[80] These are linked to greater social impairment, irritability, behavior and sleep problems, language impairments and mood changes, so the theory that they are an overlap syndrome has been postulated.[80][81] Studies indicate that gastrointestinal inflammation, immunoglobulin E-mediated or cell-mediated food allergies, gluten-related disorders (celiac disease, wheat allergy, non-celiac gluten sensitivity), visceral hypersensitivity, dysautonomia and gastroesophageal reflux are the mechanisms that possibly link both.[81]

A 2016 review concludes that enteric nervous system abnormalities might play a role in several neurological disorders, including autism. Neural connections and the immune system are a pathway that may allow diseases originated in the intestine to spread to the brain.[82] A 2018 review suggests that the frequent association of gastrointestinal disorders and autism is due to abnormalities of the gut–brain axis.[80]

The “leaky gut” hypothesis is popular among parents of children with autism. It is based on the idea that defects in the intestinal barrier produce an excessive increase of the intestinal permeability, allowing substances present in the intestine, including bacteria, environmental toxins and food antigens, to pass into the blood. The data supporting this theory are limited and contradictory, since both increased intestinal permeability and normal permeability have been documented in people with autism. Studies with mice provide some support to this theory and suggest the importance of intestinal flora, demonstrating that the normalization of the intestinal barrier was associated with an improvement in some of the ASD-like behaviours.[82] Studies on subgroups of people with ASD showed the presence of high plasma levels of zonulin, a protein that regulates permeability opening the “pores” of the intestinal wall, as well as intestinal dysbiosis (reduced levels of Bifidobacteria and increased abundance of Akkermansia muciniphila, Escherichia coli, Clostridia and Candida fungi) that promotes the production of proinflammatory cytokines, all of which produces excessive intestinal permeability.[83] This allows passage of bacterial endotoxins from the gut into the bloodstream, stimulating liver cells to secrete tumor necrosis factor alpha (TNFα), which modulates blood–brain barrier permeability. Studies on ASD people showed that TNFα cascades produce proinflammatory cytokines, leading to peripheral inflammation and activation of microglia in the brain, which indicates neuroinflammation.[83] In addition, neuroactive opioid peptides from digested foods have been shown to leak into the bloodstream and permeate the blood–brain barrier, influencing neural cells and causing autistic symptoms.[83] (See Endogenous opiate precursor theory)

Here is an interesting case report of psychosis caused by gluten sensitivity:

 In May 2012, after a febrile episode, she became increasingly irritable and reported daily headache and concentration difficulties. One month after, her symptoms worsened presenting with severe headache, sleep problems, and behavior alterations, with several unmotivated crying spells and apathy. Her school performance deteriorated… The patient was referred to a local neuropsychiatric outpatient clinic, where a conversion somatic disorder was diagnosed and a benzodiazepine treatment (i.e., bromazepam) was started. In June 2012, during the final school examinations, psychiatric symptoms, occurring sporadically in the previous two months, worsened. Indeed, she began to have complex hallucinations. The types of these hallucinations varied and were reported as indistinguishable from reality. The hallucinations involved vivid scenes either with family members (she heard her sister and her boyfriend having bad discussions) or without (she saw people coming off the television to follow and scare her)… She also presented weight loss (about 5% of her weight) and gastrointestinal symptoms such as abdominal distension and severe constipation.

So she’s hospitalized and they do a bunch of tests. Eventually she’s put on steroids, which helps a little.

Her mother recalled that she did not return a “normal girl”. In September 2012, shortly after eating pasta, she presented crying spells, relevant confusion, ataxia, severe anxiety and paranoid delirium. Then she was again referred to the psychiatric unit. A relapse of autoimmune encephalitis was suspected and treatment with endovenous steroid and immunoglobulins was started. During the following months, several hospitalizations were done, for recurrence of psychotic symptoms.

Again, more testing.

In September 2013, she presented with severe abdominal pain, associated with asthenia, slowed speech, depression, distorted and paranoid thinking and suicidal ideation up to a state of pre-coma. The clinical suspicion was moving towards a fluctuating psychotic disorder. Treatment with a second-generation anti-psychotic (i.e., olanzapine) was started, but psychotic symptoms persisted. In November 2013, due to gastro-intestinal symptoms and further weight loss (about 15% of her weight in the last year), a nutritionist was consulted, and a gluten-free diet (GFD) was recommended for symptomatic treatment of the intestinal complaints; unexpectedly, within a week of gluten-free diet, the symptoms (both gastro-intestinal and psychiatric) dramatically improved. Despite her efforts, she occasionally experienced inadvertent gluten exposures, which triggered the recurrence of her psychotic symptoms within about four hours. Symptoms took two to three days to subside again.

Note: she has non-celiac gluten sensitivity.

One month after [beginning the gluten free diet] AGA IgG and calprotectin resulted negative, as well as the EEG, and ferritin levels improved.

Note: those are tests of inflammation and anemia–that means she no longer has inflammation and her iron levels are returning to normal.

She returned to the same neuro-psychiatric specialists that now reported a “normal behavior” and progressively stopped the olanzapine therapy without any problem. Her mother finally recalled that she was returned a “normal girl”. Nine months after definitely starting the GFD, she is still symptoms-free.

This case is absolutely crazy. That poor girl. Here she was in constant pain, had constant constipation, was losing weight (at an age when children should be growing,) and the idiot adults thought she had a psychiatric problem.

This is not the only case of gastro-intestinal disorder I have heard of that presented as psychosis.

Speaking of stomach pain, did you know Kurt Cobain suffered frequent stomach pain that was so severe it made him vomit and want to commit suicide, and he started self-medicating with heroin just to stop the pain? And then he died.

Back to autism and gastrointestinal issues other than gluten, here is a fascinating new study on fecal transplants (h/t WrathofGnon):

Many studies have reported abnormal gut microbiota in individuals with Autism Spectrum Disorders (ASD), suggesting a link between gut microbiome and autism-like behaviors. Modifying the gut microbiome is a potential route to improve gastrointestinal (GI) and behavioral symptoms in children with ASD, and fecal microbiota transplant could transform the dysbiotic gut microbiome toward a healthy one by delivering a large number of commensal microbes from a healthy donor. We previously performed an open-label trial of Microbiota Transfer Therapy (MTT) that combined antibiotics, a bowel cleanse, a stomach-acid suppressant, and fecal microbiota transplant, and observed significant improvements in GI symptoms, autism-related symptoms, and gut microbiota. Here, we report on a follow-up with the same 18 participants two years after treatment was completed. Notably, most improvements in GI symptoms were maintained, and autism-related symptoms improved even more after the end of treatment.

Fecal transplant is exactly what it sounds like. The doctors clear out a person’s intestines as best they can, then put in new feces, from a donor, via a tube (up the butt or through the stomach; either direction works.)

Unfortunately, it wasn’t a double-blind study, but the authors are hopeful that they can get funding for a double-blind placebo controlled study soon.

I’d like to quote a little more from this study:

Two years after the MTT was completed, we invited the 18 original subjects in our treatment group to participate in a follow-up study … Two years after treatment, most participants reported GI symptoms remaining improved compared to baseline … The improvement was on average 58% reduction in Gastrointestinal Symptom Rating Scale (GSRS) and 26% reduction in % days of abnormal stools… The improvement in GI symptoms was observed for all sub-categories of GSRS (abdominal pain, indigestion, diarrhea, and constipation, Supplementary Fig. S2a) as well as for all sub-categories of DSR (no stool, hard stool, and soft/liquid stool, Supplementary Fig. S2b), although the degree of improvement on indigestion symptom (a sub-category of GSRS) was reduced after 2 years compared with weeks 10 and 18. This achievement is notable, because all 18 participants reported that they had had chronic GI problems (chronic constipation and/or diarrhea) since infancy, without any period of normal GI health.

Note that these children were chosen because they had both autism and lifelong gastrointestinal problems. This treatment may do nothing at all for people who don’t have gastrointestinal problems.

The families generally reported that ASD-related symptoms had slowly, steadily improved since week 18 of the Phase 1 trial… Based on the Childhood Autism Rating Scale (CARS) rated by a professional evaluator, the severity of ASD at the two-year follow-up was 47% lower than baseline (Fig. 1b), compared to 23% lower at the end of week 10. At the beginning of the open-label trial, 83% of participants rated in the severe ASD diagnosis per the CARS (Fig. 2a). At the two-year follow-up, only 17% were rated as severe, 39% were in the mild to moderate range, and 44% of participants were below the ASD diagnostic cut-off scores (Fig. 2a). … The Vineland Adaptive Behavior Scale (VABS) equivalent age continued to improve (Fig. 1f), although not as quickly as during the treatment, resulting in an increase of 2.5 years over 2 years, which is much faster than typical for the ASD population, whose developmental age was only 49% of their physical age at the start of this study.

Important point: their behavior matured faster than it normally does in autistic children.

This is a really interesting study, and I hope the authors can follow it up with a solid double-blind.

Of course, not all autists suffer from gastrointestinal complaints. Many eat and digest without difficulty. But the connection between physical complaints and mental disruption across a variety of conditions is fascinating. How many conditions that we currently believe are psychological might actually be caused by an untreated biological illness?

Does the DSM need to be re-written?

I recently came across an interesting paper that looked at the likelihood that a person, once diagnosed with one mental disorder, would be diagnosed with another. (Exploring Comorbidity Within Mental Disorders Among a Danish National Population, by Oleguer Plana-Ripoll.)

This was a remarkable study in two ways. First, it had a sample size of 5,940,778, followed up for 83.9 million person-years–basically, the entire population of Denmark over 15 years. (Big Data indeed.)
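A quick back-of-the-envelope check of those numbers (a sketch using only the figures quoted from the paper's abstract):

```python
# Sanity-check the scale of the Danish comorbidity study:
# 5,940,778 people contributing 83.9 million person-years of follow-up.
population = 5_940_778
person_years = 83.9e6

avg_followup = person_years / population  # average years each person was observed
print(round(avg_followup, 1))  # 14.1
# Slightly under the 15-year study window, which makes sense: people enter
# and leave the registry through birth, death, and migration.
```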

Second, it found that for virtually every disorder, one diagnosis increased your chances of being diagnosed with a second disorder. (“Comorbid” is a fancy word for “two diseases or conditions occurring together,” not “dying at the same time.”) Some diseases were particularly likely to co-occur–in particular, people diagnosed with “mood disorders” had a 30% chance of also being diagnosed with “neurotic disorders” during the 15 years covered by the study.

Mood disorders include bipolar disorder, depression, and SAD;

Neurotic disorders include anxieties, phobias, and OCD.

Those chances were considerably higher for people diagnosed at younger ages, and decreased significantly for the elderly–those diagnosed with mood disorders before the age of 20 had a +40% chance of also being diagnosed with a neurotic disorder, while those diagnosed after 80 had only a 5% chance.

I don’t find this terribly surprising, since I know someone with at least five different psychological diagnoses, (nor is it surprising that many people with “intellectual disabilities” also have “developmental disorders”) but it’s interesting just how pervasive comorbidity is across conditions that are ostensibly separate diseases.

This suggests to me that either many people are being mis-diagnosed (perhaps diagnosis itself is very difficult,) or what look like separate disorders are often actually one, single disorder. While it is certainly possible, of course, for someone to have both a phobia of snakes and seasonal affective disorder, the person I know with five diagnoses most likely has only one “true” disorder that has just been diagnosed and treated differently by different clinicians. It seems likely that some people’s depression also manifests itself as deep-rooted anxiety or phobias, for example.

While this is a bit of a blow for many psychiatric diagnoses, (and I am quite certain that many diagnostic categories will need a fair amount of revision before all is said and done,) autism recently got a validity boost–How brain scans can diagnose Autism with 97% accuracy.

The title is overselling it, but it’s interesting anyway:

Lead study author Marcel Just, PhD, professor of psychology and director of the Center for Cognitive Brain Imaging at Carnegie Mellon University, and his team performed fMRI scans on 17 young adults with high-functioning autism and 17 people without autism while they thought about a range of different social interactions, like “hug,” “humiliate,” “kick” and “adore.” The researchers used machine-learning techniques to measure the activation in 135 tiny pieces of the brain, each the size of a peppercorn, and analyzed how the activation levels formed a pattern. …

So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.

N=34 is not quite as impressive as N=Denmark, but it’s a good start.
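To put a number on why a sample of 34 warrants caution, here is a sketch (my own illustration, not from the paper) computing a standard Wilson score confidence interval for a 33-out-of-34 result:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - margin, center + margin

lo, hi = wilson_interval(33, 34)
print(f"point estimate {33/34:.0%}, 95% CI roughly {lo:.0%} to {hi:.0%}")
# The true accuracy could plausibly be anywhere from ~85% to ~99% --
# impressive, but far from a settled 97%.
```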

Book Club: The 10,000 Year Explosion pt. 6: Expansion


Welcome back to the Book Club. Today we’re discussing chapter 6 of Cochran and Harpending’s The 10,000 Year Explosion: Expansions.

The general assumption is that the winning advantage is cultural–that is to say, learned. Weapons, tactics, political organization, methods of agriculture: all is learned. The expansion of modern humans is the exception to the rule–most observers suspect that biological differences were the root cause of their advantage. … 

The assumption that more recent expansions are all driven by cultural factors is based on the notion that modern humans everywhere have essentially the same abilities. That’s a logical consequence of human evolutionary stasis: If humans have not undergone a significant amount of biological change since the expansion out of Africa, then people everywhere would have essentially the same potentials, and no group would have a biological advantage over its neighbors. But as we never tire of pointing out, there has been significant biological change during that period.

I remember a paper I wrote years ago (long before this blog) on South Korea’s meteoric economic rise. In those days you had to actually go to the library to do research, not just futz around on Wikipedia. My memory says the stacks were dimly lit, though that is probably just some romanticizing. 

I pored through volumes on five-year economic plans, trying to figure out why South Korea’s were more successful than other nations’. Nothing stood out to me. Why this plan and not that one? Did five or ten years matter? 

I don’t remember what I eventually concluded, but it was probably something along the lines of “South Korea made good plans that worked.” 

People around these parts often criticize Jared Diamond for invoking environmental explanations while ignoring or directly counter-signaling their evolutionary implications, but Diamond was basically the first author I read who said anything that even remotely began to explain why some countries succeeded and others failed. 

Environment matters. Resources matter. Some peoples have long histories of civilization, others don’t. Korea has a decently long history. 

Diamond was one of many authors who broke me out of the habit of only looking at explicit things done by explicitly recognized governments, and at wider patterns of culture, history, and environment. It was while reading Peter Frost’s blog that I first encountered the phrase “gene-culture co-evolution,” which supplies the missing link. 

IQ by country (estimates from Lynn and Vanhanen, 2006)

South Korea does well because 1. It’s not communist and 2. South Koreans are some of the smartest people in the world. 

I knew #1, but I could have saved myself a lot of time in the stacks if someone had just told me #2 instead of acting like SK’s economic success was a big mystery. 

The fact that every country was relatively poor before industrialization, and South Korea was particularly poor after a couple decades of warfare back and forth across the peninsula, obscures the nation’s historically high development. 

For example, the Korean civil service examination system, the Gwageo, was instituted in 788 (though it apparently didn’t become important until 958). Korea has had agriculture and literacy for a long time, with accompanying political and social organization. This probably has more to do with South Korea’s relatively easy adoption of the modern industrial economy than anything in particular in the government’s plans. 

Cochran has an interesting post on his blog on Jared Diamond and Domestication: 

In fact, in my mind the real question is not why various peoples didn’t domesticate animals that we know were domesticable, but rather how anyone ever managed to domesticate the aurochs. At least twice. Imagine a longhorn on roids: they were big and aggressive, favorites in the Roman arena. … 

The idea is that at least some individual aurochs were not as hostile and fearful of humans as they ought to have been, because they were being manipulated by some parasite. … This would have made domestication a hell of a lot easier. …

The beef tapeworm may not have made it through Beringia. More generally, there were probably no parasites in the Americas that had some large mammal as intermediate host and Amerindians as the traditional definitive host. 

They never mentioned parasites in gov class. 

Back to the book–I thought this was pretty interesting:

One sign of this reduced disease pressure is the unusual distribution of HLA alleles among Amerindians. The HLA system … is a group of genes that encode proteins expressed on the outer surfaces of cells. The immune system uses them to distinguish the self from non-self… their most important role is in infectious disease. … 

HLA genes are among the most variable of all genes. … Because these genes are so variable, any two humans (other than identical twins) are almost certain to have a different set of them. … Natural selection therefore favors diversification of the HLA genes, and some alleles, though rare, have been preserved for a long time. In fact, some are 30 million years old, considerably older than Homo sapiens. …

But Amerindians didn’t have that diversity. Many tribes have a single HLA allele with a frequency of over 50 percent. … A careful analysis of global HLA diversity confirms continuing diversifying selection on HLA in most human populations but finds no evidence of any selection at all favoring diversity in HLA among Amerindians.

The results, of course, went very badly for the Indians–and allowed minuscule groups of Spaniards to conquer entire empires. 

The threat of European (and Asian and African) diseases wiping out native peoples continues, especially for “uncontacted” tribes. As the authors note, the Surui of Brazil numbered 800 when contacted in 1980, but only 200 in 1986, after tuberculosis had killed most of them. 

…in 1827, smallpox spared only 125 out of 1,600 Mandan Indians in what later became North Dakota.

The past is horrific. 

I find the history of ancient exploration rather fascinating. Here is the frieze in Persepolis with the okapi and three Pygmies, from about 500 BC.

The authors quote Joao de Barros, a 16th century Portuguese historian: 

But it seems that for our sins, or for some inscrutable judgment of God, in all the entrances of this great Ethiopia we navigate along… He has placed a striking angel with a flaming sword of deadly fevers, who prevents us from penetrating into the interior to the springs of this garden, whence proceed these rivers of gold that flow to the sea in so many parts of our conquest.

Barros had a way with words. 

It wasn’t until quinine became widely available that Europeans had any meaningful success at conquering Africa–and even still, despite massive technological advantages, Europeans haven’t held the continent, nor have they made any significant, long-term demographic impact. 

Map: lactose intolerance worldwide (source: National Geographic)

The book then segues into a discussion of the Indo-European expansion, which the authors suggest might have been due to the evolution of a lactase persistence gene. 

(Even though we usually refer to people as “lactose intolerant” and don’t regularly refer to people as “lactose tolerant,” it’s really tolerance that’s the oddity–most of the world’s population can’t digest lactose after childhood.

Lactase is the enzyme that breaks down lactose.)

Since the book was published, the Indo-European expansion has been traced genetically to the Yamnaya (not to be confused with the Yanomamo) people, located originally in the steppes north of the Caucasus mountains. (The Yamnaya and Kurgan cultures were, I believe, the same.) 

An interesting linguistic note: 

Uralic languages (the language family containing Finnish and Hungarian) appear to have had extensive contact with early Indo-European, and they may share a common ancestry. 

I hope these linguistic mysteries continue to be decoded. 

The authors claim that the Indo-Europeans didn’t make a huge genetic impact on Europe, practicing primarily elite dominance–but on the other hand, A Handful of Bronze-Age Men Could Have Fathered 2/3s of Europeans:

In a new study, we have added a piece to the puzzle: the Y chromosomes of the majority of European men can be traced back to just three individuals living between 3,500 and 7,300 years ago. How their lineages came to dominate Europe makes for interesting speculation. One possibility could be that their DNA rode across Europe on a wave of new culture brought by nomadic people from the Steppe known as the Yamnaya.

That’s all for now; see you next week.

Sugar

I have some hopefully good, deep stuff I am working on, but in the meanwhile, here is a quick, VERY SPECULATIVE thread on my theory for why refined sugars are probably bad for you:

First, refined sugars are evolutionarily novel. Unless you’re a Hadza, your ancient ancestors never had this much sugar.

Pick up a piece of raw sugar cane and gnaw on it. Raw sugar cane has such a high fiber to sugar content that you can use it as a toothbrush after chewing it for a bit.

According to the internet, a stick of raw sugar cane has 10 grams of sugar in it. A can of Coke has 39. Even milk (whole, skim, or fat-free) contains 12 grams of natural milk sugars (lactose) per glass. Your body has no problem handling the normal amounts of unrefined sugars in regular foods, but to get the amount of sugar found in a single soda, you’d have to eat almost four whole stalks of sugarcane, which you certainly aren’t going to do in a few minutes.
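The arithmetic above, as a quick sketch (the gram figures are the ones quoted, not measured values):

```python
# How many stalks of raw sugar cane carry the sugar in one can of Coke?
sugar_per_cane_stalk = 10  # grams, per the figure quoted above
sugar_per_coke_can = 39    # grams

stalks_equivalent = sugar_per_coke_can / sugar_per_cane_stalk
print(stalks_equivalent)  # 3.9 -- "almost four whole stalks"
```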

It’s when we extract all of the sugar and throw away the rest of the fiber, fat, and protein in the food that we run into trouble.

(The same is probably also true of fat, though I am rather fond of butter.)

In my opinion, all forms of heavily refined sugar are suspect, including fruit juice, which is essentially refined fructose. People think that fruit juice is “healthy” because it comes from fruit, which is a plant and therefore “natural” and “good for you,” unlike, say, sugar, which comes from sugar cane, which is… also a plant. Or HFCS, which is totally unnatural because it comes from… corn. Which is a plant.

“They actually did studies on the sugar plantations back in the early 1900s. All of the workers were healthy and lived longer than the sugar executives who got the refined, processed product.”

I don’t know if I agree with everything he has to say, but refined fructose is no more natural than any other refined sugar. Again, the amount of sugar you get from eating an apple is very different from the amount you get from a cup of apple juice.

Now people are talking about reducing childhood obesity by eliminating the scourge of 100% fruit juice:

Excessive fruit juice consumption is associated with increased risk for obesity… sucrose consumption without the corresponding fiber, as is commonly present in fruit juice, is associated with the metabolic syndrome, liver injury, and obesity.

Regular fruit is probably good for you. Refined is not.

Here’s another study on the problems with fructose:

If calcium levels in the blood are low, our bodies produce more parathyroid hormone, stimulating the absorption of calcium by the kidneys, as well as the production of vitamin D (calcitriol), also in the kidneys. Calcitriol stimulates the absorption of calcium in the intestine, decreases the production of PTH and stimulates the release of calcium from the bone. …

… Ferraris fed rats diets with high levels of glucose, fructose or starch. He and his team studied three groups of lactating rats and three groups of non-pregnant rats (the control group).

“Since the amounts of calcium channels and of binding proteins depend on the levels of the hormone calcitriol, we confirmed that calcitriol levels were much greater in lactating rats,” said Ferraris.  … “However, when the rat mothers were consuming fructose, there were no increases in calcitriol levels,” Ferraris added. “The levels remained the same as those in non-pregnant rats, and as a consequence, there were no increases in intestinal and renal calcium transport.”

If fructose suppresses calcitriol, and thus calcium absorption, your body then has two options: keep up the food cravings until you eat enough to balance the nutrients, or strip calcium from your bones. This, I suspect, is part of what triggers tooth decay.

Sugar not only feeds the bacteria on your teeth (I think), it also weakens your teeth to pay the piper for sugar digestion. (Also, there may be something about sugar-fed bacteria lowering the pH in your mouth.)

The second thing that happens is your taste buds acclimate to excessive sugar. Soon “Sweet” tastes “normal.”

Now when you try to stop eating sugar, normal food tastes “boring,” “sour,” “bitter,” etc.
This is where you just have to bite the bullet and cut sugar anyway. If you keep eating normal food, eventually it will start tasting good again.

It just takes time for your brain to change its assumptions about what food tastes like.
But if you keep sweetening your food with “artificial” sweeteners, then you never give yourself a chance to recalibrate what food should taste like. You will keep craving sugar.
And it is really hard to stop eating sugar and let your body return to normal when you crave sugar.

If artificial sweeteners help you reduce sugar consumption and eventually stop using it altogether, then they’re probably a good idea, but don’t fall into the trap of thinking you’re going to get just as much cake and ice cream as always, only now without consequences. No. Nature doesn’t work like that. Nature has consequences.

So I feel like I’ve been picking on fructose a lot in this post. I didn’t mean to. I am suspicious of all refined sugars; these are just the sources I happened across while researching today.

I am not sure about honey. I don’t eat a lot of honey, but maybe it’s okay. The Hadza of Tanzania eat a great deal of honey and they seem fine, but maybe they’re adapted to their diet in ways that we aren’t.

So what happens when you eat too much sugar? Aside from, obviously, food cravings, weight gain, mineral depletion, and tooth decay…

So here’s a theory:

Our bodies naturally cycle between winter and summer states. At least they do if you hail from a place that historically had winter; I can’t speak for people in radically different climates.

In the summer, plant matter (carbohydrates, fiber,) are widely available and any animal that can takes as much advantage of this as possible. As omnivores, we gorge on berries, leaves, fruits, tubers, really whatever we can. When we are satiated–when we have enough fat stores to last for the winter–our bodies start shutting down insulin production. That’s enough. We don’t need it anymore.

In the winter, there’s very little plant food naturally available, unless you’re a farmer (farming is relatively recent in areas with long winters.)

In the winter, you hunt animals for meat and fat. This is what the Inuit and Eskimo did almost all year round.

The digestion of meat and fat does not require insulin, but works on the ketogenic pathways which, long story short, also turn food into energy and keep people alive.

The real beauty of ketosis is that, apparently, it ramps up your energy production–that is, you feel physically warmer when running entirely off of meat and fat than when running off carbs. Given that ketosis is the winter digestive cycle, this is amazingly appropriate.

By spring, chances are you’ve lost a lot of the weight from last summer. Winters are harsh. With the fat gone, the body starts producing insulin again.

At this point, you go from hyperglycemia (too much sugar in your bloodstream if you eat anything sweet, due to no insulin,) to hypoglycemia–your body produces a lot of insulin to transform any plants you eat into energy FAST. (Remember the discussion above about how your body transforms fructose into fat? Back in our ancestral environment, that was a feature, not a bug!)

This lets you put on pounds quickly in the spring and summer, using now-available plants as your primary food source.

The difficulty with our society is we’ve figured out how to take the energy part out of the plants, refine it, and store up huge quantities of it so we can eat it any time we want, which is all the time.

Evolution makes us want to eat, obviously. Ancestors who didn’t have a good “eat now” drive didn’t eat whatever good food was available and didn’t become ancestors.

But now we’ve hacked that, and as a result we never go into the sugar-free periods we were built to occasionally endure.

I don’t think you need to go full keto or anti-bread or something to make up for this. Just cutting down on refined sugars (and most refined oils, btw) is probably enough for most people.

Note: Humans have been eating grains for longer than the domestication of plants–there’s a reason we thought it was a good idea to domesticate grains in the first place, and it wasn’t because they were a random, un-eaten weed. If your ancestors ate bread, then there’s a good chance that you can digest bread just fine.

But if bread causes you issues, then by all means, avoid it. Different people thrive on different foods.

Please remember that this thread is speculative.

AND FOR GOODNESS SAKES DON’T PUT SUGAR IN FRUIT THINGS. JAM DOES NOT NEED SUGAR. NEITHER DOES PIE.

IF YOU ARE USING DECENT FRUIT THEN YOU DON’T NEED SUGAR. THE ONLY REASON YOU NEED SUGAR IS IF YOUR FRUIT IS CRAP. THEN JUST GO EAT SOMETHING ELSE.

 

Tapeworm-cancer-AIDS is a real thing

Tapeworm Spreads Deadly Cancer to Human:

A Colombian man’s lung tumors turned out to have an extremely unusual cause: The rapidly growing masses weren’t actually made of human cells, but were from a tapeworm living inside him, according to a report of the case.

This is the first known report of a person becoming sick from cancer cells that developed in a parasite, the researchers said.

“We were amazed when we found this new type of disease—tapeworms growing inside a person, essentially getting cancer, that spreads to the person, causing tumors,” said study researcher Dr. Atis Muehlenbachs, a staff pathologist at the Centers for Disease Control and Prevention’s Infectious Diseases Pathology Branch (IDPB).

The man had HIV, which weakens the immune system and likely played a role in allowing the development of the parasite cancer, the researchers said.

There’s not a lot I can add to this.

But there are probably more cases like this, if only because gay men seem to contract a lot of parasites:

Fast forward to the spring of 2017. PreP had recently ushered in the second sexual revolution and everyone was now fucking each other like it was 1979. My wonderful boyfriend and I enjoyed a healthy sex life inside and outside our open relationship. Then he started experiencing stomach problems: diarrhea, bloating, stomach aches, nausea. All too familiar with those symptoms, I recommended he go to the doctor and ask for a stool test. …

His results came back positive for giardia. …

Well, just a few months later, summer of 2017, my boyfriend started experiencing another bout of diarrhea and stomach cramps. … This time the results came back positive for entamoeba histolytica. What the fuck is entamoeba histolytica?! I knew giardia. Giardia and I were on a first name basis. But entamoeba, what now?

Entamoeba histolytica, as it turns out, is another parasite common in developing countries spread through contaminated drinking water, poor hygiene when handling food, and…rimming. The PA treating him wasn’t familiar with entamoeba histolytica or how to treat it, so she had to research (Google?) how to handle the infection. The medical literature (Google search results?) led us back to metronidazole, the same antibiotic used to treat giardia.

When your urge to lick butts is so strong that this keeps happening, you’ve got to consider an underlying condition like toxoplasmosis or kamikaze horsehair worm.

Some Migration-Related Studies

I have too many tabs open on my computer, so here are some studies/writings which all touch on migration/population movements in some way:

Biographical Memoirs of Henry Harpending [pdf]:

The late Henry Harpending, of the West Hunter blog, wrote The 10,000 Year Explosion with Greg Cochran, did anthropological fieldwork among the Ju/’hoansi, and did pioneering work in population genetics. The biography has many interesting parts:

Henry’s early research on population genetics also helped establish the close relationship between genetics and geography. Genetic differences between groups tend to mirror the geographic distance between them, so that a map of genetic distances looks like a geographic map (Harpending and Jenkins, 1973). Henry developed methods for studying this relationship that are still in use. …

Meanwhile, Henry’s Kalahari field experience also motivated an interest in population ecology. Humans cope with variation in resource supply either by storage (averaging over time) or by mobility and sharing (averaging over space). These strategies are mutually exclusive. Those who store must defend their stored resources against others who would like to share them. Conversely, an ethic of sharing makes storage impossible. The contrast between the mobile and the sedentary Ju/’hoansi in Henry’s sample therefore represented a fundamental shift in strategy. …

Diseases need time to cause lesions on bone. If the infected individual dies quickly, no lesion will form, and the skeleton will look healthy. Lesions form only if the infected individual is healthy enough to survive for an extended period. Lesions on ancient bone may therefore imply that the population was healthy! …

In the 1970s, as Henry’s interest in genetic data waned, he began developing population genetic models of social evolution. He overturned 40 years of conventional wisdom by showing that group selection works best not when groups are isolated but when they are strongly connected by gene flow (1980, pp. 58-59; Harpending and Rogers, 1987). When gene flow is restricted, successful mutants cannot spread beyond the initial group, and group selection stalls.

Genetic Consequences of Social Stratification in Great Britain:

Human DNA varies across geographic regions, with most variation observed so far reflecting distant ancestry differences. Here, we investigate the geographic clustering of genetic variants that influence complex traits and disease risk in a sample of ~450,000 individuals from Great Britain. Out of 30 traits analyzed, 16 show significant geographic clustering at the genetic level after controlling for ancestry, likely reflecting recent migration driven by socio-economic status (SES). Alleles associated with educational attainment (EA) show most clustering, with EA-decreasing alleles clustering in lower SES areas such as coal mining areas. Individuals that leave coal mining areas carry more EA-increasing alleles on average than the rest of Great Britain. In addition, we leveraged the geographic clustering of complex trait variation to further disentangle regional differences in socio-economic and cultural outcomes through genome-wide association studies on publicly available regional measures, namely coal mining, religiousness, 1970/2015 general election outcomes, and Brexit referendum results.

Let’s hope no one reports on this as “They found the Brexit gene!”

Can you Move to Opportunity? Evidence from the Great Migration [PDF]:

The northern United States long served as a land of opportunity for black Americans, but today the region’s racial gap in intergenerational mobility rivals that of the South. I show that racial composition changes during the peak of the Great Migration (1940-1970) reduced upward mobility in northern cities in the long run, with the largest effects on black men. I identify urban black population increases during the Migration at the commuting zone level using a shift-share instrument, interacting pre-1940 black southern migrant location choices with predicted outmigration from southern counties. The Migration’s negative effects on children’s adult outcomes appear driven by neighborhood factors, not changes in the characteristics of the average child. As early as the 1960s, the Migration led to greater white enrollment in private schools, increased spending on policing, and higher crime and incarceration rates. I estimate that the overall change in childhood environment induced by the Great Migration explains 43% of the upward mobility gap between black and white men in the region today.

43% is huge and, IMO, too big. However, the author may be on to something.

Lineage Specific Histories of Mycobacterium Tuberculosis Dispersal in Africa and Eurasia:

Mycobacterium tuberculosis (M.tb) is a globally distributed, obligate pathogen of humans that can be divided into seven clearly defined lineages. … We reconstructed M.tb migration in Africa and Eurasia, and investigated lineage specific patterns of spread. Applying evolutionary rates inferred with ancient M.tb genome calibration, we link M.tb dispersal to historical phenomena that altered patterns of connectivity throughout Africa and Eurasia: trans-Indian Ocean trade in spices and other goods, the Silk Road and its predecessors, the expansion of the Roman Empire and the European Age of Exploration. We find that Eastern Africa and Southeast Asia have been critical in the dispersal of M.tb.

I spend a surprising amount of time reading about mycobacteria.

Invasive Memes

 

Smallpox virus

Do people eventually grow ideologically resistant to dangerous local memes, but remain susceptible to foreign memes, allowing them to spread like invasive species?

And if so, can we find some way to memetically vaccinate ourselves against deadly ideas?

***

Memetics is the study of how ideas (“memes”) spread and evolve, using evolutionary theory and epidemiology as models. A “viral meme” is one that spreads swiftly through society, “infecting” minds as it goes.
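The epidemiological analogy can be made concrete with a toy model. Below is a minimal SIR-style sketch of a meme spreading through a population; the transmission rate `beta` and “recovery” (loss-of-interest) rate `gamma` are made-up illustrative numbers, not estimates from any real data:

```python
# Toy SIR model of meme spread: susceptible -> "infected" (believes the
# meme) -> recovered (immune to it). Illustrative parameters only.
def simulate_meme(beta=0.3, gamma=0.1, days=100, pop=1000, infected=1):
    s, i, r = pop - infected, infected, 0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / pop  # susceptibles "hearing" the meme
        new_recoveries = gamma * i           # believers losing interest
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# With beta/gamma = 3, the meme sweeps the population, peaks, then fades
# as the pool of susceptible minds is exhausted.
peak_believers = max(i for _, i, _ in simulate_meme())
```

The key property the analogy rests on is visible in the model: the meme dies out not because it gets refuted, but because the supply of never-exposed minds runs dry.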

Of course, most memes are fairly innocent (e.g. fashion trends) or even beneficial (“wash your hands before eating to prevent disease transmission”), but some ideas, like communism, kill people.

Ideologies consist of a big set of related ideas rather than a single one, so let’s call them memeplexes.

Almost all ideological memeplexes (and religions) sound great on paper–they have to, because that’s how they spread–but they are much more variable in actual practice.

Any idea that causes its believers to suffer is unlikely to persist–at the very least, because its believers die off.

Over time, in places where people have been exposed to ideological memeplexes, their worst aspects become known and people may learn to avoid them; the memeplexes themselves can evolve to be less harmful.

Over in epidemiology, diseases humans have been exposed to for a long time become less virulent as humans become adapted to them. Chickenpox, for example, is a fairly mild disease that kills few people because the virus has been infecting people for as long as people have been around (the ancestral Varicella-Zoster virus evolved approximately 65 million years ago and has been infecting animals ever since). Rather than kill you, chickenpox prefers to enter your nerves and go dormant for decades, reemerging later as shingles, ready to infect new people.

By contrast, smallpox (Variola major and Variola minor) probably evolved from a rodent-infecting virus about 16,000 to 68,000 years ago. That’s a big range, but either way, it’s much more recent than chickenpox. Smallpox made its first major impact on the historical record around the third century BC in Egypt, and thereafter became a recurring plague in Africa and Eurasia. Note that unlike chickenpox, which is old enough to have spread throughout the world with humanity, smallpox emerged long after major population splits occurred–like part of the Asian clade splitting off and heading into the Americas.

By 1400, Europeans had developed some immunity to smallpox (due to those who didn’t have any immunity dying), but when Columbus landed in the New World, folks here had never seen the disease before–and thus had no immunity. Diseases like smallpox and measles ripped through native communities, killing approximately 90% of the New World population.

If we extend this metaphor back to ideas–if people have been exposed to an ideology for a long time, they are more likely to have developed immunity to it or the ideology to have adapted to be relatively less harmful than it initially was. For example, the Protestant Reformation and subsequent Catholic counter-reformation triggered a series of European wars that killed 10 million people, but today Catholics and Protestants manage to live in the same countries without killing each other. New religions are much more likely to lead all of their followers in a mass suicide than old, established religions; countries that have just undergone a political revolution are much more likely to kill off large numbers of their citizens than ones that haven’t.

This is not to say that old ideas are perfect and never harmful–chickenpox still kills people and is not a fun disease–but any bad aspects are likely to grow milder over time as people wise up to bad ideas (certain caveats applying).

But this process only works for ideas that have been around for a long time. What about new ideas?

You can’t stop new ideas. Technology is always changing. The world is changing, and it requires new ideas to operate. When these new ideas arrive, even terrible ones can spread like wildfire because people have no memetic antibodies to resist them. New memes, in short, are like invasive memetic species.

In the late 1960s, 15 million people still caught smallpox every year. In 1980, it was declared officially eradicated–not one case had been seen since 1977, due to a massive, world-wide vaccination campaign.

Humans can acquire immunity to disease in two main ways. The slow way is everyone who isn’t immune dying; everyone left alive happens to have adaptations that let them not die, which they can pass on to their children. As with chickenpox, over generations, the disease becomes less severe because humans become successively more adapted to it.

The fast way is to catch a disease, produce antibodies that recognize and can fight it off, and thereafter enjoy immunity. This, of course, assumes that you survive the disease.

Vaccination works by teaching the body’s immune system to recognize a disease without exposing it to a full-strength germ, using a weakened or harmless version instead. Early on, weakened germs taken from actual smallpox scabs or lesions were used to inoculate people–a risky method, since the germs often weren’t that weak. Later, people discovered that cowpox was similar enough to smallpox that its antibodies could also fight smallpox, but cowpox itself was too adapted to cattle hosts to seriously harm humans. (Today I believe the vaccine uses a different weakened virus, but the principle is the same.)

The good part about memes is that you do not actually have to inject a physical substance into your body in order to learn about them.

Ideologies are very difficult to evaluate in the abstract, because, as mentioned, they are all optimized to sound good on paper. It’s their actual effects we are interested in.

So if we want to learn whether an idea is good or not, it’s probably best not to learn about it by merely reading books written by its advocates. Talk to people in places where the ideas have already been tried and learn from their experiences. If those people tell you this ideology causes mass suffering and they hate it, drop it like a hot potato. If those people are practicing an “impure” version of the ideology, it’s probably an improvement over the original.

For example, “communism” as practiced in China today is quite different from “communism” as practiced there 50 years ago–so much so that the modern system really isn’t communism at all. There was never, to my knowledge, an official changeover from one system to another, just a gradual accretion of improvements. This speaks strongly against communism as an ideology, since no country has managed to be successful by moving toward ideological communist purity, only by moving away from it–though they may still find it useful to retain some of communism’s original ideas.

I think there is a similar dynamic occurring in many Islamic countries. Islam is a relatively old religion that has had time to adapt to local conditions in many different parts of the world. For example, in Morocco, where the climate is more favorable to raising pigs than in other parts of the Islamic world, the taboo against pigs isn’t as strongly observed. The burka is not an Islamic universal, but characteristic of central Asia (the similar niqab is from Yemen). Islamic head coverings vary by culture–such as the kurhars traditionally worn by unmarried women in Ingushetia, north of the Caucasus, or the caps popular in Xinjiang. Turkey has laws officially restricting burkas in some areas, and Syria discourages even hijabs. Women in Iran did not go heavily veiled prior to the Iranian Revolution. So the insistence on extensive veiling in many Islamic communities (like the territory conquered by ISIS) is not a continuation of old traditions, but the imposition of a new, idealized version of Islam.

Purity is counter to practicality.

Of course, this approach is hampered by the fact that what works in one place, time, and community may not work in a different one. Tilling your fields one way works in Europe, and tilling them a different way works in Papua New Guinea. But extrapolating from what works is at least a good start.

Did tobacco become popular because it kills parasites?

While reading about the conditions in a Burmese prison around the turn of the previous century (The History and Romance of Crime: Oriental Prisons, by Arthur Griffiths–not a good book), it occurred to me that there might have been some beneficial effect of the large amounts of tobacco smoke inside the prison. Sure, in the long run, tobacco is highly likely to give you cancer, but in the short run, is it noxious to fleas and other disease-bearing pests?

Meanwhile in Melanesia (Pygmies and Papuans), a group of ornithologists struggled up a river to reach an almost completely isolated tribe of Melanesians who barely practiced horticulture; even further up the mountain they met a band of pygmies (negritoes) whose existence had only been rumored. The pygmies cultivated tobacco, which they traded with neighbors who were otherwise not terribly interested in trading for worldly goods.

The homeless smoke at rates 3x higher than the rest of the population, though this might have something to do with the high correlation between schizophrenia and smoking–80% of schizophrenics smoke, compared to 20% of the general population. Obviously this correlation is best explained by tobacco’s well-noted psychological effects (including addiction), but why is tobacco so ubiquitous in prisons that cigarettes are used as currency? Could cigarettes have, in unsanitary conditions, some healthful purpose?

From NPR: Pot For Parasites? Pygmy Men Smoke out Worms:

On average, the more THC byproduct that Hagen’s team found in an Aka man’s urine, the fewer worm eggs were present in his gut.

“The heaviest smokers, with everything else being equal, had about half the number of parasitic eggs in their stool, compared to everyone else,” Hagen says. …

THC — and nicotine — are known to kill intestinal worms in a Petri dish. And many worms make their way to the gut via the lungs. “The worms’ larval stage is in the lung,” Hagan says. “When you smoke you just blast them with THC or nicotine directly.”

Smithsonian reports that Birds Harness the Deadly Power of Nicotine to Poison Parasites:

Smoking kills. But if you’re a bird and if you want to kill parasites, that can be a good thing. City birds have taken to stuffing their nests with cigarette butts to poison potential parasites. Nature reports:

“In a study published today in Biology Letters, the researchers examined the nests of two bird species common on the North American continent. They measured the amount of cellulose acetate (a component of cigarette butts) in the nests, and found that the more there was, the fewer parasitic mites the nest contained.”

Out in the State of Nature, parasites are extremely common and difficult to get rid of (e.g., hookworm-elimination campaigns in the early 1900s found that 40% of school-aged children were infected); farmers can apparently use tobacco as a natural de-wormer (but be careful, as tobacco can be poisonous).

In the pre-modern environment, when many people had neither shoes, toilets, nor purified water, parasites were very hard to avoid.

Befoundalive recommends eating the tobacco from a cigarette if you have intestinal parasites and no access to modern medicine.

Here’s a study comparing parasite rates in tobacco workers vs. prisoners in Ethiopia:

Overall, 8 intestinal parasite species have been recovered singly or in combinations from 146 (61.8 %) samples. The prevalence in prison population (88/121 = 72.7%) was significantly higher than that in tobacco farm (58/115 = 50.4%).
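As a sanity check on “significantly higher,” we can run a quick pooled two-proportion z-test on the quoted counts (88/121 prisoners vs. 58/115 tobacco-farm workers). This is my own back-of-the-envelope calculation, not necessarily the test the study’s authors used:

```python
from math import sqrt, erf

def two_prop_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Prisoners: 88/121 infected; tobacco-farm workers: 58/115 infected
z, p = two_prop_z(88, 121, 58, 115)
```

The gap works out to z ≈ 3.5, p well under 0.01, so the difference is indeed far too large to be sampling noise–though of course prisons and tobacco farms differ in many ways besides tobacco exposure.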

In vitro anthelmintic effect of Tobacco (Nicotiana tabacum) extract on parasitic nematode, Marshallagia marshalli reports:

Because of developing resistance to the existing anthelmintic drugs, there is a need for new anthelmintic agents. Tobacco plant has alkaloid materials that have antiparasitic effect. We investigated the in vitro anthelminthic effect of aqueous and alcoholic extract of Tobacco (Nicotiana tabacum) against M. marshalli. … Overall, extracts of Tobacco possess considerable anthelminthic activity and more potent effects were observed with the highest concentrations. Therefore, the in vivo study on Tobocco in animal models is recommended.

(Helminths are parasites; anthelmintic=anti-parasites.)

So it looks like, at least in the environment before sewers, toilets, and clean water, when people struggled to stay parasite-free, tobacco (and certain other drugs) may have offered people an edge over the pests. (I’ve noticed that many bitter or noxious plants seem to have been useful for occasionally flushing out parasites, but you certainly don’t want to be in a state of “flush” all the time.)

Perhaps it was only once sanitation got good enough that parasites stopped being a pressing worry that people started getting really concerned with tobacco’s long-term negative effects on humans.