Short argument for vending machines full of experimental drugs

So I was thinking the other day about medication and Marilyn Manson’s “I don’t like the drugs but the drugs like me,” and it occurred to me that illegal drugs, generally speaking, are really good at what they do.

By contrast, take anti-depressants. Even the “really good” ones have abominable track records. Maybe a good drug works for 10 or 20% of the population–but you don’t know which 10 or 20%. Depressed people just have to keep trying different pills until they find one that works better than placebo.

Meanwhile, you’ll never hear someone say “Oh, yeah, crack just doesn’t do anything for me.” Crack works. Heroin works. Sure, they’ll fuck you up, but they work.

Illegal drugs are tried and tested in the almost-free black market of capitalism, where people do whatever they want with them–grind them up, snort them, inject them, put them up their butts–and stop taking them whenever they stop working. As a result, illegal drugs are optimized for being highly addictive, yes, but also for working really well. And through trial and error, people have figured out how much they need, how best to take them, and how often, for the optimal effects.

In other words, simply letting lots of people mess around with drugs results in really effective drugs.

The downside to this black-market refinement of drugs is that lots of people die in the process.

Most people don’t want to be killed by an experimental anti-depressant (is that ironic? It seems kind of ironic), so it makes sense to have safeguards in place to make sure that their latest incarnations won’t send you into cardiac arrest. But many medications are intended for people whose lives are otherwise over. People with Alzheimer’s, pancreatic cancer, glioblastoma, ALS, fatal familial insomnia, etc., are going to die. (Especially the ones with fatal familial insomnia. I mean, it’s got “fatal” in the name.) They have been handed death sentences and they know it, so their only possible hope is to speed up drug/treatment development as much as possible.

I am quite certain that something similar to what I am proposing already exists in some form. I am just proposing that we ramp it up: all patients with essentially incurable death sentences should have access to whatever experimental drugs (or non-experimental drugs) they want, with a few obvious caveats about price–but really, price tends to come down with increased demand, so just stock everything in vending machines and charge 75 cents a dose.

Of course, the end result might just be that Alzheimer’s meds come to closely resemble heroin, but hey, at least sick people will feel better as they die.

Since this is a short post, let me append a quick description of fatal familial insomnia: 

Fatal insomnia is a rare disorder that results in trouble sleeping.[2] The problems sleeping typically start out gradually and worsen over time.[3] Other symptoms may include speech problems, coordination problems, and dementia.[4][5] It results in death within a few months to a few years.[2]

It is a prion disease of the brain.[2] It is usually caused by a mutation to the protein PrPC.[2] It has two forms: fatal familial insomnia (FFI), which is autosomal dominant, and sporadic fatal insomnia (sFI), which is due to a noninherited mutation. Diagnosis is based on a sleep study, PET scan, and genetic testing.[1]

Fatal insomnia has no known cure and involves progressively worsening insomnia, which leads to hallucinations, delirium, confusional states like that of dementia, and eventually death.[6] The average survival time from onset of symptoms is 18 months.[6] The first recorded case was an Italian man, who died in Venice in 1765.[7]

Terrible.

 


Can Autism be Cured via a Gluten Free Diet?

I’d like to share a story from a friend and her son–let’s call them Heidi and Sven.

Sven was always a sickly child, delicate and underweight. (Heidi did not seem neglectful.) Once Sven started school, Heidi started receiving concerned notes from his teachers. He wasn’t paying attention in class. He wasn’t doing his work. They reported repetitious behavior like walking slowly around the room and tapping all of the books. Conversation didn’t quite work with Sven. He was friendly, but rarely responded when spoken to and often completely ignored people. He moved slowly.

Sven’s teachers suggested autism. Several doctors later, he’d been diagnosed.

Heidi began researching everything she could about autism. Thankfully she didn’t fall down any of the weirder rabbit holes, but when Sven started complaining that his stomach hurt, she decided to try a gluten-free diet.

And it worked. Not only did Sven’s stomach stop hurting, but his school performance improved. He stopped laying his head down on his desk every afternoon. He started doing his work and responding to classmates.

Had a gluten free diet cured his autism?

Wait.

A gluten free diet cured his celiac disease (aka coeliac disease). Sven’s troublesome behavior was most likely caused by anemia, caused by long-term inflammation, caused by gluten intolerance.

When we are sick, our bodies sequester iron to prevent whatever pathogen is infecting us from using it. This is a sensible response to short-term pathogens that we can easily defeat, but in long-term sicknesses, it leads to anemia. Since Sven was sick with undiagnosed celiac disease for years, his intestines were inflamed for years–and his body responded by sequestering iron for years, leaving him continually tired, spacey, and unable to concentrate in school.

The removal of gluten from his diet allowed his intestines to heal and his body to finally start releasing iron.

Whether or not Sven had (or has) autism is a matter of debate. What is autism? It’s generally defined by a list of symptoms/behaviors, not a list of causes. So very different causes could nonetheless trigger similar symptoms in different people.

Saying that Sven’s autism was “cured” by this diet is somewhat misleading, since gluten-free diets clearly won’t work for the majority of people with autism–those folks don’t have celiac disease. But by the same token, Sven was diagnosed with autism and his diet certainly did work for him, just as it might for other people with similar symptoms. We just don’t have the ability right now to easily distinguish between the many potential causes for the symptoms lumped together under “autism,” so parents are left trying to figure out what might work for their kid.

Interestingly, the overlap between “autism” and feeding problems/gastrointestinal disorders is huge. Now, when I say things like this, I often notice that people are confused about the scale of problems. Nearly every parent swears, at some point, that their child is terribly picky. This is normal pickiness that goes away with time and isn’t a real problem. The problems autistic children face are not normal.

Parent of normal child: “My kid is so picky! She won’t eat peas!”

Parent of autistic child: “My kid only eats peas.”

See the difference?

Let’s cut to Wikipedia, which has a nice summary:

Gastrointestinal problems are one of the most commonly associated medical disorders in people with autism.[80] These are linked to greater social impairment, irritability, behavior and sleep problems, language impairments and mood changes, so the theory that they are an overlap syndrome has been postulated.[80][81] Studies indicate that gastrointestinal inflammation, immunoglobulin E-mediated or cell-mediated food allergies, gluten-related disorders (celiac disease, wheat allergy, non-celiac gluten sensitivity), visceral hypersensitivity, dysautonomia and gastroesophageal reflux are the mechanisms that possibly link both.[81]

A 2016 review concludes that enteric nervous system abnormalities might play a role in several neurological disorders, including autism. Neural connections and the immune system are a pathway that may allow diseases originated in the intestine to spread to the brain.[82] A 2018 review suggests that the frequent association of gastrointestinal disorders and autism is due to abnormalities of the gut–brain axis.[80]

The “leaky gut” hypothesis is popular among parents of children with autism. It is based on the idea that defects in the intestinal barrier produce an excessive increase of the intestinal permeability, allowing substances present in the intestine, including bacteria, environmental toxins and food antigens, to pass into the blood. The data supporting this theory are limited and contradictory, since both increased intestinal permeability and normal permeability have been documented in people with autism. Studies with mice provide some support to this theory and suggest the importance of intestinal flora, demonstrating that the normalization of the intestinal barrier was associated with an improvement in some of the ASD-like behaviours.[82] Studies on subgroups of people with ASD showed the presence of high plasma levels of zonulin, a protein that regulates permeability opening the “pores” of the intestinal wall, as well as intestinal dysbiosis (reduced levels of Bifidobacteria and increased abundance of Akkermansia muciniphila, Escherichia coli, Clostridia and Candida fungi) that promotes the production of proinflammatory cytokines, all of which produces excessive intestinal permeability.[83] This allows passage of bacterial endotoxins from the gut into the bloodstream, stimulating liver cells to secrete tumor necrosis factor alpha (TNFα), which modulates blood–brain barrier permeability. Studies on ASD people showed that TNFα cascades produce proinflammatory cytokines, leading to peripheral inflammation and activation of microglia in the brain, which indicates neuroinflammation.[83] In addition, neuroactive opioid peptides from digested foods have been shown to leak into the bloodstream and permeate the blood–brain barrier, influencing neural cells and causing autistic symptoms.[83] (See Endogenous opiate precursor theory)

Here is an interesting case report of psychosis caused by gluten sensitivity:

 In May 2012, after a febrile episode, she became increasingly irritable and reported daily headache and concentration difficulties. One month after, her symptoms worsened presenting with severe headache, sleep problems, and behavior alterations, with several unmotivated crying spells and apathy. Her school performance deteriorated… The patient was referred to a local neuropsychiatric outpatient clinic, where a conversion somatic disorder was diagnosed and a benzodiazepine treatment (i.e., bromazepam) was started. In June 2012, during the final school examinations, psychiatric symptoms, occurring sporadically in the previous two months, worsened. Indeed, she began to have complex hallucinations. The types of these hallucinations varied and were reported as indistinguishable from reality. The hallucinations involved vivid scenes either with family members (she heard her sister and her boyfriend having bad discussions) or without (she saw people coming off the television to follow and scare her)… She also presented weight loss (about 5% of her weight) and gastrointestinal symptoms such as abdominal distension and severe constipation.

So she’s hospitalized and they do a bunch of tests. Eventually she’s put on steroids, which helps a little.

Her mother recalled that she did not return a “normal girl”. In September 2012, shortly after eating pasta, she presented crying spells, relevant confusion, ataxia, severe anxiety and paranoid delirium. Then she was again referred to the psychiatric unit. A relapse of autoimmune encephalitis was suspected and treatment with endovenous steroid and immunoglobulins was started. During the following months, several hospitalizations were done, for recurrence of psychotic symptoms.

Again, more testing.

In September 2013, she presented with severe abdominal pain, associated with asthenia, slowed speech, depression, distorted and paranoid thinking and suicidal ideation up to a state of pre-coma. The clinical suspicion was moving towards a fluctuating psychotic disorder. Treatment with a second-generation anti-psychotic (i.e., olanzapine) was started, but psychotic symptoms persisted. In November 2013, due to gastro-intestinal symptoms and further weight loss (about 15% of her weight in the last year), a nutritionist was consulted, and a gluten-free diet (GFD) was recommended for symptomatic treatment of the intestinal complaints; unexpectedly, within a week of gluten-free diet, the symptoms (both gastro-intestinal and psychiatric) dramatically improved. Despite her efforts, she occasionally experienced inadvertent gluten exposures, which triggered the recurrence of her psychotic symptoms within about four hours. Symptoms took two to three days to subside again.

Note: she has non-celiac gluten sensitivity.

One month after [beginning the gluten free diet] AGA IgG and calprotectin resulted negative, as well as the EEG, and ferritin levels improved.

Note: AGA IgG measures the antibody response to gluten, calprotectin is a marker of intestinal inflammation, and ferritin reflects iron stores–in other words, the inflammation is gone and her iron levels are returning to normal.

She returned to the same neuro-psychiatric specialists that now reported a “normal behavior” and progressively stopped the olanzapine therapy without any problem. Her mother finally recalled that she was returned a “normal girl”. Nine months after definitely starting the GFD, she is still symptoms-free.

This case is absolutely crazy. That poor girl. Here she was in constant pain, had constant constipation, was losing weight (at an age when children should be growing,) and the idiot adults thought she had a psychiatric problem.

This is not the only case of gastro-intestinal disorder I have heard of that presented as psychosis.

Speaking of stomach pain, did you know Kurt Cobain suffered frequent stomach pain that was so severe it made him vomit and want to commit suicide, and he started self-medicating with heroin just to stop the pain? And then he died.

Back to autism and gastrointestinal issues other than gluten, here is a fascinating new study on fecal transplants (h/t WrathofGnon):

Many studies have reported abnormal gut microbiota in individuals with Autism Spectrum Disorders (ASD), suggesting a link between gut microbiome and autism-like behaviors. Modifying the gut microbiome is a potential route to improve gastrointestinal (GI) and behavioral symptoms in children with ASD, and fecal microbiota transplant could transform the dysbiotic gut microbiome toward a healthy one by delivering a large number of commensal microbes from a healthy donor. We previously performed an open-label trial of Microbiota Transfer Therapy (MTT) that combined antibiotics, a bowel cleanse, a stomach-acid suppressant, and fecal microbiota transplant, and observed significant improvements in GI symptoms, autism-related symptoms, and gut microbiota. Here, we report on a follow-up with the same 18 participants two years after treatment was completed. Notably, most improvements in GI symptoms were maintained, and autism-related symptoms improved even more after the end of treatment.

Fecal transplant is exactly what it sounds like. The doctors clear out a person’s intestines as best they can, then put in new feces, from a donor, via a tube (up the butt or through the stomach; either direction works.)

Unfortunately, it wasn’t a double-blind study, but the authors are hopeful that they can get funding for a double-blind placebo controlled study soon.

I’d like to quote a little more from this study:

Two years after the MTT was completed, we invited the 18 original subjects in our treatment group to participate in a follow-up study … Two years after treatment, most participants reported GI symptoms remaining improved compared to baseline … The improvement was on average 58% reduction in Gastrointestinal Symptom Rating Scale (GSRS) and 26% reduction in % days of abnormal stools… The improvement in GI symptoms was observed for all sub-categories of GSRS (abdominal pain, indigestion, diarrhea, and constipation, Supplementary Fig. S2a) as well as for all sub-categories of DSR (no stool, hard stool, and soft/liquid stool, Supplementary Fig. S2b), although the degree of improvement on indigestion symptom (a sub-category of GSRS) was reduced after 2 years compared with weeks 10 and 18. This achievement is notable, because all 18 participants reported that they had had chronic GI problems (chronic constipation and/or diarrhea) since infancy, without any period of normal GI health.

Note that these children were chosen because they had both autism and lifelong gastrointestinal problems. This treatment may do nothing at all for people who don’t have gastrointestinal problems.

The families generally reported that ASD-related symptoms had slowly, steadily improved since week 18 of the Phase 1 trial… Based on the Childhood Autism Rating Scale (CARS) rated by a professional evaluator, the severity of ASD at the two-year follow-up was 47% lower than baseline (Fig. 1b), compared to 23% lower at the end of week 10. At the beginning of the open-label trial, 83% of participants rated in the severe ASD diagnosis per the CARS (Fig. 2a). At the two-year follow-up, only 17% were rated as severe, 39% were in the mild to moderate range, and 44% of participants were below the ASD diagnostic cut-off scores (Fig. 2a). … The Vineland Adaptive Behavior Scale (VABS) equivalent age continued to improve (Fig. 1f), although not as quickly as during the treatment, resulting in an increase of 2.5 years over 2 years, which is much faster than typical for the ASD population, whose developmental age was only 49% of their physical age at the start of this study.

Important point: their behavior matured faster than it normally does in autistic children.

This is a really interesting study, and I hope the authors can follow it up with a solid double-blind.

Of course, not all autists suffer from gastrointestinal complaints. Many eat and digest without difficulty. But the connection between physical complaints and mental disruption across a variety of conditions is fascinating. How many conditions that we currently believe are psychological might actually be caused by an untreated biological illness?

Does the DSM need to be re-written?

I recently came across an interesting paper that looked at the likelihood that a person, once diagnosed with one mental disorder, would be diagnosed with another. (Exploring Comorbidity Within Mental Disorders Among a Danish National Population, by Oleguer Plana-Ripoll.)

This was a remarkable study in two ways. First, it had a sample size of 5,940,778, followed up for 83.9 million person-years–basically, the entire population of Denmark over 15 years. (Big Data indeed.)

Second, it found that for virtually every disorder, one diagnosis increased your chances of being diagnosed with a second disorder. (“Comorbid” is a fancy word for “two diseases or conditions occurring together,” not “dying at the same time.”) Some diseases were particularly likely to co-occur–in particular, people diagnosed with “mood disorders” had a 30% chance of also being diagnosed with “neurotic disorders” during the 15 years covered by the study.

Mood disorders include bipolar, depression, and SAD;

Neurotic disorders include anxieties, phobias, and OCD.

Those chances were considerably higher for people diagnosed at younger ages, and decreased significantly for the elderly–those diagnosed with mood disorders before the age of 20 had more than a 40% chance of also being diagnosed with a neurotic disorder, while those diagnosed after 80 had only a 5% chance.
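
To make this kind of number concrete, here is a minimal sketch in Python of how a comorbidity rate like “X% of people with a mood disorder were also diagnosed with a neurotic disorder” could be tallied. The records below are invented toy data, not the Danish registry, and the function is purely illustrative:

```python
# Toy illustration of a comorbidity rate: the share of people with one
# diagnosis who also received a second one. The records are invented;
# the real study used the Danish national registry.

toy_records = [
    {"id": 1, "diagnoses": {"mood disorder", "neurotic disorder"}},
    {"id": 2, "diagnoses": {"mood disorder"}},
    {"id": 3, "diagnoses": {"neurotic disorder"}},
    {"id": 4, "diagnoses": {"mood disorder", "neurotic disorder"}},
    {"id": 5, "diagnoses": set()},
]

def comorbidity_rate(records, first, second):
    """Fraction of people with `first` who were also diagnosed with `second`."""
    with_first = [r for r in records if first in r["diagnoses"]]
    if not with_first:
        return 0.0
    with_both = [r for r in with_first if second in r["diagnoses"]]
    return len(with_both) / len(with_first)

rate = comorbidity_rate(toy_records, "mood disorder", "neurotic disorder")
print(f"P(neurotic disorder | mood disorder) = {rate:.0%}")  # 67% in this toy data
```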

I don’t find this terribly surprising, since I know someone with at least five different psychological diagnoses (nor is it surprising that many people with “intellectual disabilities” also have “developmental disorders”), but it’s interesting just how pervasive comorbidity is across conditions that are ostensibly separate diseases.

This suggests to me that either many people are being misdiagnosed (perhaps diagnosis itself is very difficult), or what look like separate disorders are often actually one single disorder. While it is certainly possible, of course, for someone to have both a phobia of snakes and seasonal affective disorder, the person I know with five diagnoses most likely has only one “true” disorder that has just been diagnosed and treated differently by different clinicians. It seems likely that some people’s depression also manifests itself as deep-rooted anxiety or phobias, for example.

While this is a bit of a blow for many psychiatric diagnoses (and I am quite certain that many diagnostic categories will need a fair amount of revision before all is said and done), autism recently got a validity boost–How brain scans can diagnose Autism with 97% accuracy.

The title is overselling it, but it’s interesting anyway:

Lead study author Marcel Just, PhD, professor of psychology and director of the Center for Cognitive Brain Imaging at Carnegie Mellon University, and his team performed fMRI scans on 17 young adults with high-functioning autism and 17 people without autism while they thought about a range of different social interactions, like “hug,” “humiliate,” “kick” and “adore.” The researchers used machine-learning techniques to measure the activation in 135 tiny pieces of the brain, each the size of a peppercorn, and analyzed how the activation levels formed a pattern. …

So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.

N=34 is not quite as impressive as N=Denmark, but it’s a good start.
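
For intuition about where a figure like “33 out of 34” comes from, here is a toy leave-one-out sketch in Python: hold out one participant, fit a simple classifier (here, nearest centroid) on the other 33, predict the held-out participant, and count the hits. The random numbers merely stand in for activation patterns; this is not the study’s actual fMRI pipeline or classifier, just an illustration of how leave-one-out accuracy is tallied.

```python
import random

# Toy stand-in for "activation patterns": two groups of 17 "participants,"
# each described by five made-up features. Purely illustrative -- not real
# fMRI data, and not the classifier the study actually used.
random.seed(0)

def fake_pattern(shift):
    return [random.gauss(shift, 1.0) for _ in range(5)]

data = [(fake_pattern(0.0), "autistic") for _ in range(17)] + \
       [(fake_pattern(1.5), "neurotypical") for _ in range(17)]

def centroid(patterns):
    return [sum(col) / len(col) for col in zip(*patterns)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Leave-one-out: hold out one participant, compute each group's centroid from
# the remaining 33, and predict the held-out participant from the nearer centroid.
correct = 0
for held_out in range(len(data)):
    pattern, label = data[held_out]
    rest = data[:held_out] + data[held_out + 1:]
    centroids = {group: centroid([p for p, g in rest if g == group])
                 for group in ("autistic", "neurotypical")}
    predicted = min(centroids, key=lambda group: distance(pattern, centroids[group]))
    correct += (predicted == label)

print(f"{correct} / {len(data)} correct = {correct / len(data):.0%}")
```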

Book Club: The 10,000 Year Explosion pt. 6: Expansion


Welcome back to the Book Club. Today we’re discussing chapter 6 of Cochran and Harpending’s The 10,000 Year Explosion: Expansions.

The general assumption is that the winning advantage is cultural–that is to say, learned. Weapons, tactics, political organization, methods of agriculture: all is learned. The expansion of modern humans is the exception to the rule–most observers suspect that biological differences were the root cause of their advantage. …

The assumption that more recent expansions are all driven by cultural factors is based on the notion that modern humans everywhere have essentially the same abilities. That’s a logical consequence of human evolutionary stasis: if humans have not undergone a significant amount of biological change since the expansion out of Africa, then people everywhere would have essentially the same potentials, and no group would have a biological advantage over its neighbors. But as we never tire of pointing out, there has been significant biological change during that period.

I remember a paper I wrote years ago (long before this blog) on South Korea’s meteoric economic rise. In those days you had to actually go to the library to do research, not just futz around on Wikipedia. My memory says the stacks were dimly lit, though that is probably just some romanticizing. 

I pored over volumes on 5-year economic plans, trying to figure out why South Korea’s were more successful than other nations’. Nothing stood out to me. Why this plan and not that plan? Did 5 or 10 years matter?

I don’t remember what I eventually concluded, but it was probably something along the lines of “South Korea made good plans that worked.” 

People around these parts often criticize Jared Diamond for invoking environmental explanations while ignoring or directly counter-signaling their evolutionary implications, but Diamond was basically the first author I read who said anything that even remotely began to explain why some countries succeeded and others failed. 

Environment matters. Resources matter. Some peoples have long histories of civilization, others don’t. Korea has a decently long history. 

Diamond was one of many authors who broke me out of the habit of only looking at explicit things done by explicitly recognized governments, and got me looking at wider patterns of culture, history, and environment. It was while reading Peter Frost’s blog that I first encountered the phrase “gene-culture co-evolution,” which supplies the missing link.

IQ by country (estimates by Lynn and Vanhanen, 2006)

South Korea does well because 1. It’s not communist and 2. South Koreans are some of the smartest people in the world. 

I knew #1, but I could have saved myself a lot of time in the stacks if someone had just told me #2 instead of acting like SK’s economic success was a big mystery. 

The fact that every country was relatively poor before industrialization, and South Korea was particularly poor after a couple decades of warfare back and forth across the peninsula, obscures the nation’s historically high development. 

For example, the Korean examination system, the Gwageo, was instituted in 788 (though it apparently didn’t become important until 958). Korea has had agriculture and literacy for a long time, with accompanying political and social organization. This long history probably has more to do with South Korea having a relatively easy time adopting a modern industrial economy than anything in particular in the government’s plans.

Cochran has an interesting post on his blog on Jared Diamond and Domestication: 

In fact, in my mind the real question is not why various peoples didn’t domesticate animals that we know were domesticable, but rather how anyone ever managed to domesticate the aurochs. At least twice. Imagine a longhorn on roids: they were big and aggressive, favorites in the Roman arena. … 

The idea is that at least some individual aurochs were not as hostile and fearful of humans as they ought to have been, because they were being manipulated by some parasite. … This would have made domestication a hell of a lot easier. …

The beef tape worm may not have made it through Beringia.  More generally, there were probably no parasites in the Americas that had some large mammal as intermediate host and Amerindians as the traditional definite host. 

They never mentioned parasites in gov class. 

Back to the book–I thought this was pretty interesting:

One sign of this reduced disease pressure is the unusual distribution of HLA alleles among Amerindians. The HLA system … is a group of genes that encode proteins expressed on the outer surfaces of cells. The immune system uses them to distinguish the self from non-self… their most important role is in infectious disease. …

HLA genes are among the most variable of all genes. … Because these genes are so variable, any two humans (other than identical twins) are almost certain to have a different set of them. … Natural selection therefore favors diversification of the HLA genes, and some alleles, though rare, have been preserved for a long time. In fact, some are 30 million years old, considerably older than Homo sapiens. …

But Amerindians didn’t have that diversity. Many tribes have a single HLA allele with a frequency of over 50 percent. … A careful analysis of global HLA diversity confirms continuing diversifying selection on HLA in most human populations but finds no evidence of any selection at all favoring diversity in HLA among Amerindians.

The results, of course, went very badly for the Indians–and allowed minuscule groups of Spaniards to conquer entire empires. 

The threat of European (and Asian and African) diseases wiping out native peoples continues, especially for “uncontacted” tribes. As the authors note, the Surui of Brazil numbered 800 when contacted in 1980, but only 200 in 1986, after tuberculosis had killed most of them. 

…in 1827, smallpox spared only 125 out of 1,600 Mandan Indians in what later became North Dakota.

The past is horrific. 

I find the history of ancient exploration rather fascinating. Here is the frieze in Persepolis with the okapi and three Pygmies, from about 500 BC.

The authors quote Joao de Barros, a 16th century Portuguese historian: 

But it seems that for our sins, or for some inscrutable judgment of God, in all the entrances of this great Ethiopia we navigate along… He has placed a striking angel with a flaming sword of deadly fevers, who prevents us from penetrating into the interior to the springs of this garden, whence proceed these rivers of gold that flow to the sea in so many parts of our conquest.

Barros had a way with words. 

It wasn’t until quinine became widely available that Europeans had any meaningful success at conquering Africa–and even still, despite massive technological advantages, Europeans haven’t held the continent, nor have they made any significant, long-term demographic impact. 

Lactose intolerance map. Source: National Geographic

The book then segues into a discussion of the Indo-European expansion, which the authors suggest might have been due to the evolution of a lactase persistence gene. 

(Even though we usually refer to people as “lactose intolerant” and don’t regularly refer to people as “lactose tolerant,” it’s really tolerance that’s the oddity–most of the world’s population can’t digest lactose after childhood.

Lactase is the enzyme that breaks down lactose.)

Since the book was published, the Indo-European expansion has been traced genetically to the Yamnaya (not to be confused with the Yanomamo) people, located originally in the steppes north of the Caucasus mountains. (The Yamnaya and Kurgan cultures were, I believe, the same.) 

An interesting linguistic note: 

Uralic languages (the language family containing Finnish and Hungarian) appear to have had extensive contact with early Indo-European, and they may share a common ancestry. 

I hope these linguistic mysteries continue to be decoded. 

The authors claim that the Indo-Europeans didn’t make a huge genetic impact on Europe, practicing primarily elite dominance–but on the other hand, A Handful of Bronze-Age Men Could Have Fathered Two-Thirds of Europeans:

In a new study, we have added a piece to the puzzle: the Y chromosomes of the majority of European men can be traced back to just three individuals living between 3,500 and 7,300 years ago. How their lineages came to dominate Europe makes for interesting speculation. One possibility could be that their DNA rode across Europe on a wave of new culture brought by nomadic people from the Steppe known as the Yamnaya.

That’s all for now; see you next week.

Sugar

I have some hopefully good, deep stuff I am working on, but in the meanwhile, here is a quick, VERY SPECULATIVE thread on my theory for why refined sugars are probably bad for you:

First, refined sugars are evolutionarily novel. Unless you’re a Hadza, your ancient ancestors never had this much sugar.

Pick up a piece of raw sugar cane and gnaw on it. Raw sugar cane has such a high fiber to sugar content that you can use it as a toothbrush after chewing it for a bit.

According to the internet, a stick of raw sugar cane has 10 grams of sugar in it. A can of Coke has 39. Even milk (whole or skim) contains 12 grams of natural milk sugars (lactose) per glass. Your body has no problem handling the normal amounts of unrefined sugars in regular foods, but to get the amount of sugar found in a single soda, you’d have to eat almost four whole sticks of sugar cane, which you certainly aren’t going to do in a few minutes.
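
The arithmetic, for what it’s worth (these are the same rough figures cited above, so treat the results as illustrative):

```python
# Rough, illustrative arithmetic using the approximate figures cited above.
sugar_per_cane_stick_g = 10   # a stick of raw sugar cane
sugar_per_coke_can_g   = 39   # a can of Coke
sugar_per_glass_milk_g = 12   # lactose in a glass of milk

print(sugar_per_coke_can_g / sugar_per_cane_stick_g)   # ~3.9 sticks of cane per can
print(sugar_per_coke_can_g / sugar_per_glass_milk_g)   # ~3.25 glasses of milk per can
```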

It’s when we extract all of the sugar and throw away the rest of the fiber, fat, and protein in the food that we run into trouble.

(The same is probably also true of fat, though I am rather fond of butter.)

In my opinion, all forms of heavily refined sugar are suspect, including fruit juice, which is essentially refined fructose. People think that fruit juice is “healthy” because it comes from fruit, which is a plant and therefore “natural” and “good for you,” unlike, say, sugar, which comes from sugar cane, which is… also a plant. Or HFCS, which is totally unnatural because it comes from… corn. Which is a plant.

“They actually did studies on the sugar plantations back in the early 1900s. All of the workers were healthy and lived longer than the sugar executives who got the refined, processed product.”

I don’t know if I agree with everything he has to say, but refined fructose is no more natural than any other refined sugar. Again, the amount of sugar you get from eating an apple is very different from the amount you get from a cup of apple juice.

Now people are talking about reducing childhood obesity by eliminating the scourge of 100% fruit juice:

Excessive fruit juice consumption is associated with increased risk for obesity… sucrose consumption without the corresponding fiber, as is commonly present in fruit juice, is associated with the metabolic syndrome, liver injury, and obesity.

Regular fruit is probably good for you. Refined is not.

Here’s another study on the problems with fructose:

If calcium levels in the blood are low, our bodies produce more parathyroid hormone, stimulating the absorption of calcium by the kidneys, as well as the production of vitamin D (calcitriol), also in the kidneys. Calcitriol stimulates the absorption of calcium in the intestine, decreases the production of PTH and stimulates the release of calcium from the bone. …

… Ferraris fed rats diets with high levels of glucose, fructose or starch. He and his team studied three groups of lactating rats and three groups of non-pregnant rats (the control group).

“Since the amounts of calcium channels and of binding proteins depend on the levels of the hormone calcitriol, we confirmed that calcitriol levels were much greater in lactating rats,” said Ferraris.  … “However, when the rat mothers were consuming fructose, there were no increases in calcitriol levels,” Ferraris added. “The levels remained the same as those in non-pregnant rats, and as a consequence, there were no increases in intestinal and renal calcium transport.”

Your body then has two options: trigger food cravings until you eat enough to balance the nutrients, or strip calcium from your bones. This is what triggers tooth decay.

Sugar not only feeds the bacteria on your teeth (I think), it also weakens your teeth to pay the piper for sugar digestion. (Also, there may be something about sugar-fed bacteria lowering the pH in your mouth.)

The second thing that happens is your taste buds acclimate to excessive sugar. Soon “Sweet” tastes “normal.”

Now when you try to stop eating sugar, normal food tastes “boring,” “sour,” “bitter,” etc. This is where you just have to bite the bullet and cut sugar anyway. If you keep eating normal food, eventually it will start tasting good again.

It just takes time for your brain to change its assumptions about what food tastes like. But if you keep sweetening your food with “artificial” sweeteners, then you never give yourself a chance to recalibrate what food should taste like. You will keep craving sugar. And it is really hard to stop eating sugar and let your body return to normal when you crave sugar.

If artificial sweeteners help you reduce sugar consumption and eventually stop using it altogether, then they’re probably a good idea, but don’t fall into the trap of thinking you’re going to get just as much cake and ice cream as always, only now without any consequences. No. Nature doesn’t work like that. Nature has consequences.

So I feel like I’ve been picking on fructose a lot in this post. I didn’t mean to. I am suspicious of all refined sugars; these are just the sources I happened across while researching today.

I am not sure about honey. I don’t eat a lot of honey, but maybe it’s okay. The Hadza of Tanzania eat a great deal of honey and they seem fine, but maybe they’re adapted to their diet in ways that we aren’t.

So what happens when you eat too much sugar? Aside from, obviously, food cravings, weight gain, mineral depletion, and tooth decay…

So here’s a theory:

Our bodies naturally cycle between winter and summer states. At least they do if you hail from a place that historically had winter; I can’t speak for people in radically different climates.

In the summer, plant foods (carbohydrates, fiber) are widely available, and any animal that can do so takes as much advantage of this as possible. As omnivores, we gorge on berries, leaves, fruits, tubers–really, whatever we can. When we are satiated–when we have enough fat stores to last for the winter–our bodies start shutting down insulin production. That’s enough. We don’t need it anymore.

In the winter, there’s very little plant food naturally available, unless you’re a farmer (farming is relatively recent in areas with long winters.)

In the winter, you hunt animals for meat and fat. This is what the Inuit and Eskimo did almost all year round.

The digestion of meat and fat does not require insulin, but works on the ketogenic pathways which, long story short, also turn food into energy and keep people alive.

The real beauty of ketosis is that, apparently, it ramps up your energy production–that is, you feel physically warmer when running entirely off of meat and fat than when running off carbs. Given that ketosis is the winter digestive cycle, this is amazingly appropriate.

By spring, chances are you’ve lost a lot of the weight from last summer. Winters are harsh. With the fat gone, the body starts producing insulin again.

At this point, you go from hyperglycemia (too much sugar in your bloodstream if you eat anything sweet, due to no insulin) to hypoglycemia–your body produces a lot of insulin to transform any plants you eat into energy FAST. (Remember the discussion above about how your body transforms fructose into fat? Back in our ancestral environment, that was a feature, not a bug!)

This lets you put on pounds quickly in the spring and summer, using now-available plants as your primary food source.

The difficulty with our society is we’ve figured out how to take the energy part out of the plants, refine it, and store up huge quantities of it so we can eat it any time we want, which is all the time.

Evolution makes us want to eat, obviously. Ancestors who didn’t have a good “eat now” drive didn’t eat whatever good food was available and didn’t become ancestors.

But now we’ve hacked that, and as a result we never go into the sugar-free periods we were built to occasionally endure.

I don’t think you need to go full keto or anti-bread or something to make up for this. Just cutting down on refined sugars (and most refined oils, btw) is probably enough for most people.

Note: Humans have been eating grains since long before plants were domesticated–there’s a reason we thought it was a good idea to domesticate grains in the first place, and it wasn’t because they were a random, un-eaten weed. If your ancestors ate bread, then there’s a good chance that you can digest bread just fine.

But if bread causes you issues, then by all means, avoid it. Different people thrive on different foods.

Please remember that this thread is speculative.

AND FOR GOODNESS SAKES DON’T PUT SUGAR IN FRUIT THINGS. JAM DOES NOT NEED SUGAR. NEITHER DOES PIE.

IF YOU ARE USING DECENT FRUIT THEN YOU DON’T NEED SUGAR. THE ONLY REASON YOU NEED SUGAR IS IF YOUR FRUIT IS CRAP. THEN JUST GO EAT SOMETHING ELSE.

 

Tapeworm-cancer-AIDS is a real thing

Tapeworm Spreads Deadly Cancer to Human:

A Colombian man’s lung tumors turned out to have an extremely unusual cause: The rapidly growing masses weren’t actually made of human cells, but were from a tapeworm living inside him, according to a report of the case.

This is the first known report of a person becoming sick from cancer cells that developed in a parasite, the researchers said.

“We were amazed when we found this new type of disease—tapeworms growing inside a person, essentially getting cancer, that spreads to the person, causing tumors,” said study researcher Dr. Atis Muehlenbachs, a staff pathologist at the Centers for Disease Control and Prevention’s Infectious Diseases Pathology Branch (IDPB).

The man had HIV, which weakens the immune system and likely played a role in allowing the development of the parasite cancer, the researchers said.

There’s not a lot I can add to this.

But there are probably more cases like this, if only because gay men seem to contract a lot of parasites:

Fast forward to the spring of 2017. PreP had recently ushered in the second sexual revolution and everyone was now fucking each other like it was 1979. My wonderful boyfriend and I enjoyed a healthy sex life inside and outside our open relationship. Then he started experiencing stomach problems: diarrhea, bloating, stomach aches, nausea. All too familiar with those symptoms, I recommended he go to the doctor and ask for a stool test. …

His results came back positive for giardia. …

Well, just a few months later, summer of 2017, my boyfriend started experiencing another bout of diarrhea and stomach cramps. … This time the results came back positive for entamoeba histolytica. What the fuck is entamoeba histolytica?! I knew giardia. Giardia and I were on a first name basis. But entamoeba, what now?

Entamoeba histolytica, as it turns out, is another parasite common in developing countries spread through contaminated drinking water, poor hygiene when handling food, and…rimming. The PA treating him wasn’t familiar with entamoeba histolytica or how to treat it, so she had to research (Google?) how to handle the infection. The medical literature (Google search results?) led us back to metronidazole, the same antibiotic used to treat giardia.

When your urge to lick butts is so strong that this keeps happening, you’ve got to consider an underlying condition like toxoplasmosis or kamikaze horsehair worm.

Some Migration-Related Studies

I have too many tabs open on my computer, so here are some studies/writings which all touch on migration/population movements in some way:

Biographical Memoirs of Henry Harpending [pdf]:

The late Henry Harpending of the West Hunter blog, along with Greg Cochran, wrote The 10,000 Year Explosion, did anthropological fieldwork among the Ju/’hoansi, and did pioneering work in population genetics. The biography has many interesting parts:

Henry’s early research on population genetics also helped establish the close relationship between genetics and geography. Genetic differences between groups tend to mirror the geographic distance between them, so that a map of genetic distances looks like a geographic map (Harpending and Jenkins, 1973). Henry developed methods for studying this relationship that are still in use. …

Meanwhile, Henry’s Kalahari field experience also motivated an interest in population ecology. Humans cope with variation in resource supply either by storage (averaging over time) or by mobility and sharing (averaging over space). These strategies are mutually exclusive. Those who store must defend their stored resources against others who would like to share them. Conversely, an ethic of sharing makes storage impossible. The contrast between the mobile and the sedentary Ju/’hoansi in Henry’s sample therefore represented a fundamental shift in strategy. …

Diseases need time to cause lesions on bone. If the infected individual dies quickly, no lesion will form, and the skeleton will look healthy. Lesions form only if the infected individual is healthy enough to survive for an extended period. Lesions on ancient bone may therefore imply that the population was healthy! …

In the 1970s, as Henry’s interest in genetic data waned, he began developing population genetic models of social evolution. He overturned 40 years of conventional wisdom by showing that group selection works best not when groups are isolated but when they are strongly connected by gene flow (1980, pp. 58-59; Harpending and Rogers, 1987). When gene flow is restricted, successful mutants cannot spread beyond the initial group, and group selection stalls.

Genetic Consequences of Social Stratification in Great Britain:

Human DNA varies across geographic regions, with most variation observed so far reflecting distant ancestry differences. Here, we investigate the geographic clustering of genetic variants that influence complex traits and disease risk in a sample of ~450,000 individuals from Great Britain. Out of 30 traits analyzed, 16 show significant geographic clustering at the genetic level after controlling for ancestry, likely reflecting recent migration driven by socio-economic status (SES). Alleles associated with educational attainment (EA) show most clustering, with EA-decreasing alleles clustering in lower SES areas such as coal mining areas. Individuals that leave coal mining areas carry more EA-increasing alleles on average than the rest of Great Britain. In addition, we leveraged the geographic clustering of complex trait variation to further disentangle regional differences in socio-economic and cultural outcomes through genome-wide association studies on publicly available regional measures, namely coal mining, religiousness, 1970/2015 general election outcomes, and Brexit referendum results.

Let’s hope no one reports on this as “They found the Brexit gene!”

Can you Move to Opportunity? Evidence from the Great Migration [PDF]:

The northern United States long served as a land of opportunity for black Americans, but today the region’s racial gap in intergenerational mobility rivals that of the South. I show that racial composition changes during the peak of the Great Migration (1940-1970) reduced upward mobility in northern cities in the long run, with the largest effects on black men. I identify urban black population increases during the Migration at the commuting zone level using a shift-share instrument, interacting pre-1940 black southern migrant location choices with predicted outmigration from southern counties. The Migration’s negative effects on children’s adult outcomes appear driven by neighborhood factors, not changes in the characteristics of the average child. As early as the 1960s, the Migration led to greater white enrollment in private schools, increased spending on policing, and higher crime and incarceration rates. I estimate that the overall change in childhood environment induced by the Great Migration explains 43% of the upward mobility gap between black and white men in the region today.

43% is huge and, IMO, too big. However, the author may be on to something.

Lineage Specific Histories of Mycobacterium Tuberculosis Dispersal in Africa and Eurasia:

Mycobacterium tuberculosis (M.tb) is a globally distributed, obligate pathogen of humans that can be divided into seven clearly defined lineages. … We reconstructed M.tb migration in Africa and Eurasia, and investigated lineage specific patterns of spread. Applying evolutionary rates inferred with ancient M.tb genome calibration, we link M.tb dispersal to historical phenomena that altered patterns of connectivity throughout Africa and Eurasia: trans-Indian Ocean trade in spices and other goods, the Silk Road and its predecessors, the expansion of the Roman Empire and the European Age of Exploration. We find that Eastern Africa and Southeast Asia have been critical in the dispersal of M.tb.

I spend a surprising amount of time reading about mycobacteria.

Invasive Memes

 

Smallpox virus

Do people eventually grow ideologically resistant to dangerous local memes, but remain susceptible to foreign memes, allowing them to spread like invasive species?

And if so, can we find some way to memetically vaccinate ourselves against deadly ideas?

***

Memetics is the study of how ideas (“memes”) spread and evolve, using evolutionary theory and epidemiology as models. A “viral meme” is one that spreads swiftly through society, “infecting” minds as it goes.
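
If you take the epidemiology analogy literally, the standard toy model is the SIR model (susceptible / infected / recovered). Here is a minimal discrete-time sketch in Python, with made-up parameters, where “recovered” stands for people who have been exposed to a meme and become resistant to it:

```python
# Minimal discrete-time SIR sketch, with ideas in place of germs.
# All numbers are made up for illustration.
# S: never exposed to the meme, I: currently spreading it,
# R: exposed in the past and now resistant.

def simulate_meme(beta=0.3, gamma=0.1, population=1000, initial_carriers=1, steps=200):
    s, i, r = population - initial_carriers, initial_carriers, 0
    history = []
    for _ in range(steps):
        new_infections = beta * s * i / population   # susceptibles who pick up the meme
        new_recoveries = gamma * i                   # carriers who grow resistant (or bored)
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_meme()
peak_carriers = max(i for _, i, _ in history)
eventually_resistant = history[-1][2]
print(f"peak carriers: {peak_carriers:.0f}, eventually resistant: {eventually_resistant:.0f}")
```

With the transmission rate set above the recovery rate, the meme sweeps through the population; the pool of resistant people it leaves behind is the memetic analogue of herd immunity.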

Of course, most memes are fairly innocent (e.g. fashion trends) or even beneficial (“wash your hands before eating to prevent disease transmission”), but some ideas, like communism, kill people.

Ideologies consist of a big set of related ideas rather than a single one, so let’s call them memeplexes.

Almost all ideological memeplexes (and religions) sound great on paper–they have to, because that’s how they spread–but they are much more variable in actual practice.

Any idea that causes its believers to suffer is unlikely to persist–at the very least, because its believers die off.

Over time, in places where people have been exposed to ideological memeplexes, their worst aspects become known and people may learn to avoid them; the memeplexes themselves can evolve to be less harmful.

Over in epidemiology, diseases humans have been exposed to for a long time become less virulent as humans become adapted to them. Chickenpox, for example, is a fairly mild disease that kills few people because the virus has been infecting people for as long as people have been around (the ancestral Varicella-Zoster virus evolved approximately 65 million years ago and has been infecting animals ever since). Rather than kill you, chickenpox prefers to enter your nerves and go dormant for decades, reemerging later as shingles, ready to infect new people.

By contrast, smallpox (Variola major and Variola minor) probably evolved from a rodent-infecting virus about 16,000 to 68,000 years ago. That’s a big range, but either way, it’s much more recent than chickenpox. Smallpox made its first major impact on the historical record around the third century BC in Egypt, and thereafter became a recurring plague in Africa and Eurasia. Note that unlike chickenpox, which is old enough to have spread throughout the world with humanity, smallpox emerged long after major population splits occurred–like part of the Asian clade splitting off and heading into the Americas.

By 1400, Europeans had developed some immunity to smallpox (due to those who didn’t have any immunity dying), but when Columbus landed in the New World, folks here had never seen the disease before–and thus had no immunity. Diseases like smallpox and measles ripped through native communities, killing approximately 90% of the New World population.

If we extend this metaphor back to ideas–if people have been exposed to an ideology for a long time, they are more likely to have developed immunity to it or the ideology to have adapted to be relatively less harmful than it initially was. For example, the Protestant Reformation and subsequent Catholic counter-reformation triggered a series of European wars that killed 10 million people, but today Catholics and Protestants manage to live in the same countries without killing each other. New religions are much more likely to lead all of their followers in a mass suicide than old, established religions; countries that have just undergone a political revolution are much more likely to kill off large numbers of their citizens than ones that haven’t.

This is not to say that old ideas are perfect and never harmful–chickenpox still kills people and is not a fun disease–but that any bad aspects are likely to become more mild over time as people wise up to bad ideas (certain caveats apply).

But this process only works for ideas that have been around for a long time. What about new ideas?

You can’t stop new ideas. Technology is always changing. The world is changing, and it requires new ideas to operate. When these new ideas arrive, even terrible ones can spread like wildfire because people have no memetic antibodies to resist them. New memes, in short, are like invasive memetic species.

In the late 1960s, 15 million people still caught smallpox every year. In 1980, it was declared officially eradicated–not one case had been seen since 1977, due to a massive, world-wide vaccination campaign.

Humans can acquire immunity to disease in two main ways. The slow way is everyone who isn’t immune dying; everyone left alive happens to have adaptations that let them not die, which they can pass on to their children. As with chickenpox, over generations, the disease becomes less severe because humans become successively more adapted to it.

The fast way is to catch a disease, produce antibodies that recognize and can fight it off, and thereafter enjoy immunity. This, of course, assumes that you survive the disease.

Vaccination works by teaching the body’s immune system to recognize a disease without infecting it with a full-strength germ, using a weakened or harmless version of the germ instead. Early on, weakened germs taken from actual smallpox scabs or lesions were used to inoculate people–a risky method, since the germs often weren’t that weak. Later, people discovered that cowpox was similar enough to smallpox that its antibodies could also fight smallpox, but cowpox itself was too adapted to cattle hosts to seriously harm humans. (Today I believe the vaccine uses a different weakened virus, but the principle is the same.)

The good part about memes is that you do not actually have to inject a physical substance into your body in order to learn about them.

Ideologies are very difficult to evaluate in the abstract, because, as mentioned, they are all optimized to sound good on paper. It’s their actual effects we are interested in.

So if we want to learn whether an idea is good or not, it’s probably best not to learn about it by merely reading books written by its advocates. Talk to people in places where the ideas have already been tried and learn from their experiences. If those people tell you this ideology causes mass suffering and they hate it, drop it like a hot potato. If those people are practicing an “impure” version of the ideology, it’s probably an improvement over the original.

For example, “communism” as practiced in China today is quite different from “communism” as practiced there 50 years ago–so much so that the modern system really isn’t communism at all. There was never, to my knowledge, an official changeover from one system to another, just a gradual accretion of improvements. This speaks strongly against communism as an ideology, since no country has managed to be successful by moving toward ideological communist purity, only by moving away from it–though they may still find it useful to retain some of communism’s original ideas.

I think there is a similar dynamic occurring in many Islamic countries. Islam is a relatively old religion that has had time to adapt to local conditions in many different parts of the world. For example, in Morocco, where the climate is more favorable to raising pigs than in other parts of the Islamic world, the taboo against pigs isn’t as strongly observed. The burka is not an Islamic universal, but characteristic of central Asia (the similar niqab is from Yemen). Islamic head coverings vary by culture–such as this kurhars, traditionally worn by unmarried women in Ingushetia, north of the Caucasus, or this cap, popular in Xinjiang. Turkey has laws officially restricting burkas in some areas, and Syria discourages even hijabs. Women in Iran did not go heavily veiled prior to the Iranian Revolution. So the insistence on extensive veiling in many Islamic communities (like the territory conquered by ISIS) is not a continuation of old traditions, but the imposition of a new, idealized version of Islam.

Purity is counter to practicality.

Of course, this approach is hampered by the fact that what works in one place, time, and community may not work in a different one. Tilling your fields one way works in Europe, and tilling them a different way works in Papua New Guinea. But extrapolating from what works is at least a good start.

 

 

Did tobacco become popular because it kills parasites?

While reading about the conditions in a Burmese prison around the turn of the previous century (The History and Romance of Crime: Oriental Prisons, by Arthur Griffiths–not good), it occurred to me that there might have been some beneficial effect of the large amounts of tobacco smoke inside the prison. Sure, in the long run, tobacco is highly likely to give you cancer, but in the short run, is it noxious to fleas and other disease-bearing pests?

Meanwhile in Melanesia (Pygmies and Papuans), a group of ornithologists struggled up a river to reach an almost completely isolated tribe of Melanesians who barely practiced horticulture; even further up the mountain they met a band of pygmies (negritoes) whose existence had only been rumored. The pygmies cultivated tobacco, which they traded with their neighbors–who were otherwise not terribly interested in trading for worldly goods.

The homeless smoke at rates 3x higher than the rest of the population, though this might have something to do with the high correlation between schizophrenia and smoking–80% of schizophrenics smoke, compared to 20% of the general population. Obviously this correlation is best explained by tobacco’s well-noted psychological effects (including addiction,) but why is tobacco so ubiquitous in prisons that cigarettes are used as currency? Could they have, in unsanitary conditions, some healthful purpose?

From NPR: Pot For Parasites? Pygmy Men Smoke out Worms:

On average, the more THC byproduct that Hagen’s team found in an Aka man’s urine, the fewer worm eggs were present in his gut.

“The heaviest smokers, with everything else being equal, had about half the number of parasitic eggs in their stool, compared to everyone else,” Hagen says. …

THC — and nicotine — are known to kill intestinal worms in a Petri dish. And many worms make their way to the gut via the lungs. “The worms’ larval stage is in the lung,” Hagen says. “When you smoke you just blast them with THC or nicotine directly.”

Smithsonian reports that Birds Harness the Deadly Power of Nicotine to Poison Parasites:

Smoking kills. But if you’re a bird and if you want to kill parasites, that can be a good thing. City birds have taken to stuffing their nests with cigarette butts to poison potential parasites. Nature reports:

“In a study published today in Biology Letters, the researchers examined the nests of two bird species common on the North American continent. They measured the amount of cellulose acetate (a component of cigarette butts) in the nests, and found that the more there was, the fewer parasitic mites the nest contained.”

Out in the State of Nature, parasites are extremely common and difficult to get rid of (eg, hookworm elimination campaigns in the early 1900s found that 40% of school-aged children were infected); farmers can apparently use tobacco as a natural de-wormer (but be careful, as tobacco can be poisonous.)

In the pre-modern environment, when many people had neither shoes, toilets, nor purified water, parasites were very hard to avoid.

Befoundalive recommends eating the tobacco from a cigarette if you have intestinal parasites and no access to modern medicine.

Here’s a study comparing parasite rates in tobacco workers vs. prisoners in Ethiopia:

Overall, 8 intestinal parasite species have been recovered singly or in combinations from 146 (61.8 %) samples. The prevalence in prison population (88/121 = 72.7%) was significantly higher than that in tobacco farm (58/115 = 50.4%).
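
If you want to check the math on that comparison, here’s a quick back-of-the-envelope sketch in Python. The figures are the ones quoted above; the contingency-table layout and the choice of a chi-square test are mine, not the study’s.

```python
# Rough check of the quoted prevalence figures:
# prison: 88 of 121 infected (~72.7%); tobacco farm: 58 of 115 (~50.4%).
from scipy.stats import chi2_contingency

#                infected, not infected
prison       = [88, 121 - 88]
tobacco_farm = [58, 115 - 58]

chi2, p_value, dof, expected = chi2_contingency([prison, tobacco_farm])

print(f"Prison prevalence:       {88/121:.1%}")
print(f"Tobacco farm prevalence: {58/115:.1%}")
print(f"Chi-square p-value:      {p_value:.4f}")  # comes out well below 0.05
```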

In vitro anthelmintic effect of Tobacco (Nicotiana tabacum) extract on parasitic nematode, Marshallagia marshalli reports:

Because of developing resistance to the existing anthelmintic drugs, there is a need for new anthelmintic agents. Tobacco plant has alkaloid materials that have antiparasitic effect. We investigated the in vitro anthelminthic effect of aqueous and alcoholic extract of Tobacco (Nicotiana tabacum) against M. marshalli. … Overall, extracts of Tobacco possess considerable anthelminthic activity and more potent effects were observed with the highest concentrations. Therefore, the in vivo study on Tobacco in animal models is recommended.

(Helminths are parasites; anthelmintic=anti-parasites.)

So it looks like, at least in the days before sewers, toilets, and clean water, when people struggled to stay parasite-free, tobacco (and certain other drugs) may have offered people an edge over the pests. (I’ve noticed many bitter or noxious plants seem to have been useful for occasionally flushing out parasites, but you certainly don’t want to be in a state of “flush” all the time.)

It looks like it was only once regular sanitation got good enough that we didn’t have to worry much about parasites that people started getting really concerned with tobacco’s long-term negative effects on humans.

Is Crohn’s Disease Tuberculosis of the Intestines?

Source: Rise in Crohn’s Disease admission rates, Glasgow

Crohn’s is an inflammatory disease of the digestive tract involving diarrhea, vomiting, internal lesions, pain, and severe weight loss. Left untreated, Crohn’s can lead to death through direct starvation/malnutrition, infections caused by the intestinal walls breaking down and spilling feces into the rest of the body, or a whole host of other horrible symptoms, like pyoderma gangrenosum–basically your skin just rotting off.

Crohn’s disease has no known cause and no cure, though several treatments have proven effective at putting it into remission–at least temporarily.

The disease appears to be triggered by a combination of environmental, bacterial, and genetic factors–about 70 genes have been identified so far that appear to contribute to an individual’s chance of developing Crohn’s, but no gene has been found yet that definitely triggers it. (The siblings of people who have Crohn’s are more likely than non-siblings to also have it, and identical twins of Crohn’s patients have a 55% chance of developing it.) A variety of environmental factors, such as living in a first world country, (parasites may be somewhat protective against the disease), smoking, or eating lots of animal protein also correlate with Crohn’s, but since only 3.2/1000 people even in the West have it, these factors obviously don’t trigger the disease in most people.

Crohn’s appears to be a kind of over-reaction of the immune system, though not specifically an auto-immune disorder, which suggests that a pathogen of some sort is probably involved. Most people are probably able to fight off this pathogen, but people with a variety of genetic issues may have more trouble–according to Wikipedia, “There is considerable overlap between susceptibility loci for IBD and mycobacterial infections.[62]” Mycobacteria are a genus of bacteria that includes the species responsible for tuberculosis and leprosy. A variety of bacteria–including specific strains of E. coli, Yersinia, Listeria, and Mycobacterium avium subspecies paratuberculosis–are found in the intestines of Crohn’s sufferers at higher rates than in the intestines of non-sufferers (intestines, of course, are full of all kinds of bacteria.)

Source: The Gutsy Group

Crohn’s treatment depends on the severity of the case and specific symptoms, but often includes a course of antibiotics, (especially if the patient has abscesses,) tube feeding (in acute cases where the sufferer is having trouble digesting food,) and long-term immune-system suppressants such as prednisone, methotrexate, or infliximab. In severe cases, damaged portions of the intestines may be cut out. Before the development of immunosuppressant treatments, sufferers often progressively lost more and more of their intestines, with predictably unpleasant results, like no longer having a functioning colon. (70% of Crohn’s sufferers eventually have surgery.)

A similar disease, Johne’s, infects cattle. Johne’s is caused by Mycobacterium avium subspecies paratuberculosis, (hereafter just MAP). MAP typically infects calves at birth, transmitted via infected feces from their mothers, incubates for two years, and then manifests as diarrhea, malnutrition, dehydration, wasting, starvation, and death. Luckily for cows, there’s a vaccine, though any infectious disease in a herd is a problem for farmers.

If you’re thinking that “paratuberculosis” sounds like “tuberculosis,” you’re correct. When scientists first isolated it, they thought the bacteria looked rather like tuberculosis, hence the name, “tuberculosis-like.” The scientists’ instincts were correct, and it turns out that MAP is in the same bacterial genus as tuberculosis and leprosy (though it may be more closely related to leprosy than TB.) (“Genus” is one step up from “species”: our species is Homo sapiens; our genus, Homo, we share with Homo neanderthalensis, Homo erectus, etc, but chimps and gorillas are not in the genus Homo.)

Figure A: Crohn’s Disease in humans. Figure B: Johne’s Disease in animals. (Greenstein, Lancet Infectious Diseases, 2004; H/T Human Para Foundation)

The intestines of cattle who have died of MAP look remarkably like the intestines of people suffering from advanced Crohn’s disease.

MAP can actually infect all sorts of mammals, not just cows; it’s just more common and problematic in cattle herds. (Sorry, we’re not getting through this post without photos of infected intestines.)

So here’s how it could work:

The MAP bacterium–possibly transmitted via milk or meat products–is fairly common and infects a variety of mammals. Most people who encounter it fight it off with no difficulty (or perhaps have a short bout of diarrhea and then recover.)

A few people, though, have genetic issues that make it harder for them to fight off the infection. For example, Crohn’s sufferers produce less intestinal mucus, which normally acts as a barrier between the intestines and all of the stuff in them.

Interestingly, parasite infections can increase intestinal mucus (some parasites feed on mucus), which in turn is protective against other forms of infection; decreasing parasite load can increase the chance of other intestinal infections.

Once MAP enters the intestinal walls, the immune system attempts to fight it off, but a genetic defect in autophagy (the process by which cells break down intracellular invaders) results in the immune cells themselves getting infected. The body responds to the signs of infection by sending more immune cells to fight it, which subsequently also get infected with MAP, triggering the body to send even more immune cells. These lumps of infected cells become the characteristic ulcerations and lesions that mark Crohn’s disease and eventually leave the intestines riddled with inflamed tissue and holes.

The most effective treatments for Crohn’s, like infliximab, don’t target the infection but the immune system. They work by interrupting the immune system’s feedback cycle so that it stops sending more cells to the infected area, giving the already infected cells a chance to die. This doesn’t cure the disease, but it does give the intestines time to recover.

Unfortunately, this means infliximab raises your chance of developing TB:

There were 70 reported cases of tuberculosis after treatment with infliximab for a median of 12 weeks. In 48 patients, tuberculosis developed after three or fewer infusions. … Of the 70 reports, 64 were from countries with a low incidence of tuberculosis. The reported frequency of tuberculosis in association with infliximab therapy was much higher than the reported frequency of other opportunistic infections associated with this drug. In addition, the rate of reported cases of tuberculosis among patients treated with infliximab was higher than the available background rates.

This is because infliximab actively suppresses the immune system’s ability to fight diseases in the TB family.

Luckily, if you live in the first world and aren’t in prison, you’re unlikely to catch TB–only about 5-10% of the US population tests positive for TB, compared to 80% in many African and Asian countries. (In other words, increased immigration from these countries will absolutely put Crohn’s sufferers at risk of dying.)

There are a fair number of similarities between Crohn’s, TB, and leprosy: they are all very slow diseases that can take years to finally kill you. By contrast, other deadly diseases, like smallpox, cholera, and yersinia pestis (plague), spread and kill extremely quickly. Within about two weeks, you’ll definitely know whether your plague infection is going to kill you or not, whereas you can have leprosy for 20 years before you even notice it.

TB, like Crohn’s, creates granulomas:

Tuberculosis is classified as one of the granulomatous inflammatory diseases. Macrophages, T lymphocytes, B lymphocytes, and fibroblasts aggregate to form granulomas, with lymphocytes surrounding the infected macrophages. When other macrophages attack the infected macrophage, they fuse together to form a giant multinucleated cell in the alveolar lumen. The granuloma may prevent dissemination of the mycobacteria and provide a local environment for interaction of cells of the immune system.[63] However, more recent evidence suggests that the bacteria use the granulomas to avoid destruction by the host’s immune system. … In many people, the infection waxes and wanes.

Crohn’s also waxes and wanes. Many sufferers experience flare-ups of the disease, during which they may have to be hospitalized, tube fed, and put through another round of antibiotics or resection (surgical removal of part of the intestines) before they improve–until the disease flares up again.

Leprosy is also marked by lesions, though of course so are dozens of other diseases.

Note: Since Crohn’s is a complex, multi-factorial disease, there may be more than one bacterium or pathogen that could infect people and create similar results. Alternatively, Crohn’s sufferers may simply have intestines that are really bad at fighting off all sorts of diseases, as a side effect of Crohn’s, not a cause, resulting in a variety of unpleasant infections.

The MAP hypothesis suggests several possible treatment routes:

  1. Improving the intestinal mucus, perhaps via parasites or medicines derived from parasites
  2. Improving the intestinal microbe balance
  3. Treating the MAP infection with antibiotics
  4. Developing an anti-MAP vaccine similar to the one for Johne’s disease in cattle
  5. Eliminating MAP from the food supply

Here’s an article about parasites and Crohn’s:

To determine how the worms could be our frenemies, Cadwell and colleagues tested mice with the same genetic defect found in many people with Crohn’s disease. Mucus-secreting cells in the intestines malfunction in the animals, reducing the amount of mucus that protects the gut lining from harmful bacteria. Researchers have also detected a change in the rodents’ microbiome, the natural microbial community in their guts. The abundance of one microbe, an inflammation-inducing bacterium in the Bacteroides group, soars in the mice with the genetic defect.

The researchers found that feeding the rodents one type of intestinal worm restored their mucus-producing cells to normal. At the same time, levels of two inflammation indicators declined in the animals’ intestines. In addition, the bacterial lineup in the rodents’ guts shifted, the team reports online today in Science. Bacteroides’s numbers plunged, whereas the prevalence of species in a different microbial group, the Clostridiales, increased. A second species of worm also triggers similar changes in the mice’s intestines, the team confirmed.

To check whether helminths cause the same effects in people, the scientists compared two populations in Malaysia: urbanites living in Kuala Lumpur, who harbor few intestinal parasites, and members of an indigenous group, the Orang Asli, who live in a rural area where the worms are rife. A type of Bacteroides, the proinflammatory microbes, predominated in the residents of Kuala Lumpur. It was rarer among the Orang Asli, where a member of the Clostridiales group was plentiful. Treating the Orang Asli with drugs to kill their intestinal worms reversed this pattern, favoring Bacteroides species over Clostridiales species, the team documented.

This sounds unethical unless they were merely tagging along with another team of doctors who were de-worming the Orang Asli for ordinary health reasons, rather than setting out to potentially inflict Crohn’s on people. Nevertheless, it’s an interesting study.

At any rate, so far they haven’t managed to produce an effective medicine from parasites, possibly in part because people think parasites are icky.

But if parasites aren’t disgusting enough for you, there’s always the option of directly changing the gut bacteria: fecal microbiota transplants (FMT). A fecal transplant is exactly what it sounds like: you take the regular feces out of the patient and put in new, fresh feces from an uninfected donor. (When your other option is pooping into a bag for the rest of your life because your colon was removed, swallowing a few poop pills doesn’t sound so bad.) E.g., Fecal microbiota transplant for refractory Crohn’s:

Approximately one-third of patients with Crohn’s disease do not respond to conventional treatments, and some experience significant adverse effects, such as serious infections and lymphoma, and many patients require surgery due to complications. .. Herein, we present a patient with Crohn’s colitis in whom biologic therapy failed previously, but clinical remission and endoscopic improvement was achieved after a single fecal microbiota transplantation infusion.

Here’s a Chinese doctor who appears to have good success with FMTs to treat Crohn’s–improvement in 87% of patients one month after treatment and remission in 77%, though the effects may wear off over time. Note: even infliximab, considered a “wonder drug” for its amazing abilities, only works for about 50-75% of patients, must be administered via regular IV infusions for life (or until it stops working,) costs about $20,000 a year per patient, and has some serious side effects, like cancer. If fecal transplants can get the same results, that’s pretty good.

Little known fact: “In the United States, the Food and Drug Administration (FDA) has regulated human feces as an experimental drug since 2013.”

Antibiotics are another potential route. Redhill Biopharma is conducting a phase III clinical study of antibiotics designed to fight MAP in Crohn’s patients. Redhill is expected to release some of its results in April.

A Crohn’s MAP vaccine trial is underway in healthy volunteers:

Mechanism of action: The vaccine is what is called a ‘T-cell’ vaccine. T-cells are a type of white blood cell -an important player in the immune system- in particular, for fighting against organisms that hide INSIDE the body’s cells –like MAP does. Many people are exposed to MAP but most don’t get Crohn’s –Why? Because their T-cells can ‘see’ and destroy MAP. In those who do get Crohn’s, the immune system has a ‘blind spot’ –their T-cells cannot see MAP. The vaccine works by UN-BLINDING the immune system to MAP, reversing the immune dysregulation and programming the body’s own T-cells to seek out and destroy cells containing MAP. For general information, there are two informative videos about T Cells and the immune system below.

Efficacy: In extensive tests in animals (in mice and in cattle), 2 shots of the vaccine spaced 8 weeks apart proved to be a powerful, long-lasting stimulant of immunity against MAP. To read the published data from the trial in mice, click here. To read the published data from the trial in cattle, click here.

Before: fistula in the intestines of a 31-year-old Crohn’s patient. Source: Dr. Borody, “Combining infliximab, anti-MAP and hyperbaric oxygen therapy for resistant fistulizing Crohn’s disease”

Dr. Borody (who was influential in the discovery that ulcers are caused by the H. pylori bacteria and not stress,) has had amazing success treating Crohn’s patients with a combination of infliximab, anti-MAP antibiotics, and hyperbaric oxygen. Here are two of his before and after photos of the intestines of a 31-year-old Crohn’s sufferer:

Here are some more interesting articles on the subject:

Sources: Is Crohn’s Disease caused by a Mycobacterium? Comparisons with Tuberculosis, Leprosy, and Johne’s Disease.

What is MAP?

Researcher Finds Possible Link Between Cattle and Human Diseases:

Last week, Davis and colleagues in the U.S. and India published a case report in Frontiers of Medicine http://journal.frontiersin.org/article/10.3389/fmed.2016.00049/full . The report described a single patient, clearly infected with MAP, with the classic features of Johne’s disease in cattle, including the massive shedding of MAP in his feces. The patient was also ill with clinical features that were indistinguishable from the clinical features of Crohn’s. In this case though, a novel treatment approach cleared the patient’s infection.

The patient was treated with antibiotics known to be effective for tuberculosis, which then eliminated the clinical symptoms of Crohn’s disease, too.

After: the same intestines, now healed (the follow-up photo from Dr. Borody’s fistulizing Crohn’s case above)

Psychology Today: Treating Crohn’s Disease:

Through luck, hard work, good fortune, perseverance, and wonderful doctors, I seem to be one of the few people in the world who can claim to be “cured” of Crohn’s Disease. … In brief, I was treated for 6 years with medications normally used for multidrug resistant TB and leprosy, under the theory that a particular germ causes Crohn’s Disease. I got well, and have been entirely well since 2004. I do not follow a particular diet, and my recent colonoscopies and blood work have shown that I have no inflammation. The rest of these 3 blogs will explain more of the story.

What about removing MAP from the food supply? Assuming MAP is the culprit, this may be hard to do (it’s pretty contagious in cattle, can lie dormant for years, and survives cooking), but drinking ultrapasteurized milk may be protective, especially for people who are susceptible to the disease.

***

However… there are also studies that contradict the MAP theory. For example, a recent study of the rate of Crohn’s disease in people exposed to Johne’s disease found no correlation. (However, Crohn’s is a pretty rare condition, and the survey only found 7 total cases, which is small enough that random chance could be a factor, but we are talking about people who probably got very up close and personal with feces infected with MAP.)
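
To give a sense of how little statistical weight 7 cases carries, here’s a quick sketch of the exact 95% confidence interval around a count of 7, using the standard chi-square method. This is my own back-of-the-envelope illustration, not a calculation from the study.

```python
# How uncertain is an observed count of 7 cases?
# Exact (Garwood) 95% confidence interval for a Poisson count,
# computed via the chi-square method.
from scipy.stats import chi2

observed = 7
lower = chi2.ppf(0.025, 2 * observed) / 2          # ~2.8
upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2    # ~14.4

print(f"95% CI for a count of {observed}: ({lower:.1f}, {upper:.1f})")
# The underlying expected count could plausibly be anywhere from ~3 to ~14,
# so a study this small can't rule out even a ~2x difference in risk.
```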

Another study found a negative correlation between Crohn’s and milk consumption:

Logistic regression showed no significant association with measures of potential contamination of water sources with MAP, water intake, or water treatment. Multivariate analysis showed that consumption of pasteurized milk (per kg/month: odds ratio (OR) = 0.82, 95% confidence interval (CI): 0.69, 0.97) was associated with a reduced risk of Crohn’s disease. Meat intake (per kg/month: OR = 1.40, 95% CI: 1.17, 1.67) was associated with a significantly increased risk of Crohn’s disease, whereas fruit consumption (per kg/month: OR = 0.78, 95% CI: 0.67, 0.92) was associated with reduced risk.
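
For readers not used to per-unit odds ratios: each additional kilogram per month multiplies the odds of Crohn’s by the reported ratio, so the effects compound with intake. A rough illustration (the odds ratios are the ones quoted above; the intake levels are made up for the example, and the arithmetic assumes the model’s linear-in-log-odds form holds across that range):

```python
# How per-kg/month odds ratios from a logistic regression compound.
OR_MILK_PER_KG = 0.82  # pasteurized milk, per kg/month (quoted above)
OR_MEAT_PER_KG = 1.40  # meat, per kg/month (quoted above)

for kg_per_month in (1, 3, 5):  # hypothetical intake levels
    milk_effect = OR_MILK_PER_KG ** kg_per_month
    meat_effect = OR_MEAT_PER_KG ** kg_per_month
    print(f"{kg_per_month} kg/month: milk multiplies the odds by {milk_effect:.2f}, "
          f"meat by {meat_effect:.2f}")
# e.g. at 5 kg/month, milk multiplies the odds by ~0.37 (lower risk)
# and meat by ~5.4 (higher risk).
```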

So even if Crohn’s is caused by MAP or something similar, it appears that people aren’t catching it from milk.

There are other theories about what causes Crohn’s–these folks, for example, think it’s related to consumption of GMO corn. Perhaps MAP has only been found in the intestines of Crohn’s patients because people with Crohn’s are really bad at fighting off infections. Perhaps the whole thing is caused by weird gut bacteria, too few parasites, insufficient vitamin D, or industrial pollution.

The condition remains very much a mystery.