Short argument for vending machines full of experimental drugs

So I was thinking the other day about medication and Marilyn Manson’s “I don’t like the drugs but the drugs like me,” and it occurred to me that illegal drugs, generally speaking, are really good at what they do.

By contrast, take anti-depressants. Even the “really good” ones have abominable track records. Maybe a good drug works for 10–20% of the population–but you don’t know which 10–20%. Depressed people just have to keep trying different pills until they find one that works better than placebo.

Meanwhile, you’ll never hear someone say “Oh, yeah, crack just doesn’t do anything for me.” Crack works. Heroin works. Sure, they’ll fuck you up, but they work.

Illegal drugs are tried and tested in the almost-free black market of capitalism, where people do whatever they want with them–grind them up, snort them, inject them, put them up their butts–and stop taking them whenever they stop working. As a result, illegal drugs are optimized for being highly addictive, yes, but also for working really well. And through trial and error, people have figured out how much they need, how best to take it, and how often, for the optimal effects.

In other words, simply letting lots of people mess around with drugs results in really effective drugs.

The downside to the black-free-market refinement of drugs is that lots of people die in the process.

Most people don’t want to be killed by an experimental anti-depressant (is that ironic? It seems kind of ironic), so it makes sense to have safeguards in place to make sure that the latest incarnations won’t send you into cardiac arrest. But many medications are intended for people whose lives are otherwise over. People with Alzheimer’s, pancreatic cancer, glioblastoma, ALS, fatal familial insomnia, etc., are going to die. (Especially the ones with fatal familial insomnia. I mean, it’s got “fatal” in the name.) They have been handed death sentences and they know it, so their only possible hope is to speed up drug/treatment development as much as possible.

I am quite certain that something similar to what I am proposing already exists in some form. I am just proposing that we ramp it up: all patients with essentially incurable death sentences get access to whatever experimental drugs (or non-experimental drugs) they want, with a few obvious caveats about price–but really, price tends to come down with increased demand, so just stock everything in vending machines and charge 75¢ a dose.

Of course, the end result might just be that Alzheimer’s meds come to closely resemble heroin, but hey, at least sick people will feel better as they die.

Since this is a short post, let me append a quick description of fatal familial insomnia: 

Fatal insomnia is a rare disorder that results in trouble sleeping.[2] The problems sleeping typically start out gradually and worsen over time.[3] Other symptoms may include speech problems, coordination problems, and dementia.[4][5] It results in death within a few months to a few years.[2]

It is a prion disease of the brain.[2] It is usually caused by a mutation to the protein PrPC.[2] It has two forms: fatal familial insomnia (FFI), which is autosomal dominant, and sporadic fatal insomnia (sFI), which is due to a noninherited mutation. Diagnosis is based on a sleep study, PET scan, and genetic testing.[1]

Fatal insomnia has no known cure and involves progressively worsening insomnia, which leads to hallucinations, delirium, confusional states like that of dementia, and eventually death.[6] The average survival time from onset of symptoms is 18 months.[6] The first recorded case was an Italian man, who died in Venice in 1765.[7]




Be careful what you rationalize

The first few thousand years of “medicine” were pretty bad. We did figure out a few things–an herb that’ll make you defecate faster here, something to staunch bleeding there–but overall, we were idiots. Doctors used to stick leeches on people to make them bleed, because they were convinced that “too much blood” was a problem. A primitive form of CPR invented in the 1700s involved blowing tobacco smoke up a drowned person’s rectum (it didn’t work). And, of course, people have periodically taken it into their heads that consuming mercury is a good idea.

Did pre-modern (i.e., before 1900 or so) doctors even benefit their patients, on net? Consider this account of ancient Egyptian medicine:

The ancient Egyptians had a remarkably well-organized medical system, complete with doctors who specialized in healing specific ailments. Nevertheless, the cures they prescribed weren’t always up to snuff. Lizard blood, dead mice, mud and moldy bread were all used as topical ointments and dressings, and women were sometimes dosed with horse saliva as a cure for an impaired libido.

Most disgusting of all, Egyptian physicians used human and animal excrement as a cure-all remedy for diseases and injuries. According to 1500 B.C.’s Ebers Papyrus, donkey, dog, gazelle and fly dung were all celebrated for their healing properties and their ability to ward off bad spirits. While these repugnant remedies may have occasionally led to tetanus and other infections, they probably weren’t entirely ineffective—research shows the microflora found in some types of animal dung contain antibiotic substances.

Bed rest, nurturing care, a bowl of hot soup–these are obviously beneficial. Dog feces, not so much.

Very ancient medicine and primitive shamanism seem inherently linked–early medicine can probably be divided into “secret knowledge” (i.e., useful herbs); magical rites, like painting a patient suffering from yellow fever with yellow paint and then washing it off to “wash away” the disease; and outright charlatanry.

It’s amazing that medicine persisted as a profession for centuries despite its terrible track record; you’d think disgruntled patients–or their relatives–would have put a quick and violent end to physicians bleeding patients.

The Christian Scientists got their start when a sickly young woman observed that she felt better when she didn’t go to the doctor than when she did, because this was the 1800s and medicine in those days did more harm than good. Yet the Christian Scientists were (and remain) an exception. Society at large never (to my knowledge) revolted against the “expertise” of supposed doctors.

Our desire for answers in the face of the unknown–our desire to do something when the optimal course is actually to do nothing and hope you don’t die–has overwhelmed medicine’s terrible track record for centuries.

Modern medicine is remarkably good. We can set bones, cure bubonic plague, prevent smallpox, and transplant hearts. There are still lots of things we can’t do–we can’t cure the common cold, for example–but modern medicine is, on the whole, positive. So this post is not about modern medicine.

But our tendency to trust too much, to trust the guy who offers answers and solutions over the guy who says “We don’t know, we can’t know, you’re probably best off doing nothing and hoping for the best,” is still with us. It’s probably a cognitive bias, and very hard to combat without purposefully setting out to do so.

So be careful what you rationalize.

Bio-thermodynamics and aging

I suspect nature is constrained by basic physics/chemistry/thermodynamics in a variety of interesting ways.

For example, chemical reactions (and thus biological processes) proceed more quickly when they are warm than when they are cold–this is pretty much a tautology, since temperature is just molecular motion–and thus it seems reasonable to expect certain biological processes to proceed more slowly in colder places/seasons than in warmer ones.
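The standard quantitative version of this claim, for what it’s worth, is the Arrhenius equation from basic chemistry (my sketch, not something the sources quoted below invoke): the rate constant k of a reaction depends exponentially on absolute temperature T,

```latex
k = A \, e^{-E_a / (R T)}
```

where A is a prefactor, E_a is the activation energy, and R is the gas constant. A common rule of thumb derived from it is that reaction rates roughly double for every 10 °C rise in temperature, so “colder runs slower” is more than hand-waving.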

The Greenland Shark, which lives in very cold waters, lives to be about 300-500 years old. It’s no coincidence:

Temperature is a basic and essential property of any physical system, including living systems. Even modest variations in temperature can have profound effects on organisms, and it has long been thought that as metabolism increases at higher temperatures so should rates of ageing. Here, we review the literature on how temperature affects longevity, ageing and life history traits. From poikilotherms to homeotherms, there is a clear trend for lower temperature being associated with longer lifespans both in wild populations and in laboratory conditions. Many life-extending manipulations in rodents, such as caloric restriction, also decrease core body temperature.

This implies, in turn, that people (or animals) who overeat will tend to die younger, not necessarily due to any particular effects of having extra lumps of fat around, but because they burn hotter and thus faster.

Weighing more may trigger certain physiological changes–like menarche–to begin earlier due to the beneficial presence of fat–you don’t want to menstruate if you don’t have at least a little weight to spare–which may in turn speed up certain other parts of aging. But there could be an additional effect on aging just from the presence of more cells in the body, each requiring additional metabolic processes to maintain.

Increased human height (due to better nutrition) over the past century could have a similar effect–shorter men do seem to live longer than taller men, e.g.:

Observational study of 8,003 American men of Japanese ancestry from the Honolulu Heart Program/Honolulu-Asia Aging Study (HHP/HAAS), a genetically and culturally homogeneous cohort followed for over 40 years. …

A positive association was found between baseline height and all-cause mortality (RR = 1.007; 95% CI 1.003–1.011; P = 0.002) over the follow-up period. Adjustments for possible confounding variables reduced this association only slightly (RR = 1.006; 95% CI 1.002–1.010; P = 0.007). In addition, height was positively associated with all cancer mortality and mortality from cancer unrelated to smoking. A Cox regression model with time-dependent covariates showed that relative risk for baseline height on mortality increased as the population aged. Comparison of genotypes of a longevity-associated single nucleotide polymorphism in FOXO3 showed that the longevity allele was inversely associated with height. This finding was consistent with prior findings in model organisms of aging. Height was also positively associated with fasting blood insulin level, a risk factor for mortality. Regression analysis of fasting insulin level (mIU/L) on height (cm) adjusting for the age both data were collected yielded a regression coefficient of 0.26 (95% CI 0.10–0.42; P = 0.001).

The more of you there is, the more of you there is to age.

Interesting: lots of data on human height.

But there’s another possibility involving internal temperature–since internal body temperature requires calories to maintain, people who “run hot” (that is, are naturally warmer) may burn more calories and tend to be thinner than people who run cool, who burn fewer calories and thus tend to weigh more. E.g., low body temperature linked to obesity in a new study:

A new study has found that obese people (BMI >30) have lower body temperature during the day than normal weight people. The obese people had an average body temperature that was 0.63 degrees F cooler than normal weight people. The researchers calculated that this lower body temperature—which reflects a lower metabolic rate—would result in a body fat accumulation of approximately 160 grams per month, or four to five pounds a year, enough for the creeping weight gain many people experience.
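As a quick sanity check on the quoted arithmetic (a sketch of mine; the 160 grams/month figure is the study’s, the unit conversion is the standard one):

```python
# Convert the study's claimed fat accumulation of 160 g/month into pounds
# per year, to check it against the quoted "four to five pounds a year".
GRAMS_PER_POUND = 453.592

fat_per_month_g = 160  # figure quoted from the study
fat_per_year_lb = fat_per_month_g * 12 / GRAMS_PER_POUND

print(round(fat_per_year_lb, 1))  # ~4.2 lb/year, consistent with the quote
```

So the numbers in the quote do hang together, at least on the arithmetic.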

There’s an interesting discussion in the link on thyroid issues that cause people to run cold and thus gain weight, and how some people lose weight with thyroid treatment.

On the other hand, this study found the opposite, and maybe the whole thing just washes out to women and men having different internal temperatures?

Obese people are–according to one study–more likely to suffer mood or mental disorders, which could also be triggered by an underlying health problem. They also suffer faster functional decline in old age:

Women had a higher prevalence of reported functional decline than men at the upper range of BMI categories (31.4% vs 14.3% for BMI > or =40). Women (odds ratio (OR) = 2.61, 95% confidence interval (CI) = 1.39-4.95) and men (OR = 3.32, 95% CI = 1.29-8.46) exhibited increased risk for any functional decline at BMI of 35 or greater. Weight loss of 10 pounds and weight gain of 20 pounds were also risk factors for any functional decline.

Note that gaining weight and losing weight were also related to decline, probably due to health problems that caused the weight fluctuations in the first place.

Of course, general physical decline and mental decline go hand-in-hand. Whether obesity causes declining health, declining health causes obesity, or some underlying third factor, like biological aging, underlies both, I don’t know.

Anyway, I know this thought is a bit disjointed; it’s mostly just food for thought.

Can Autism Be Cured via a Gluten-Free Diet?

I’d like to share a story from a friend and her son–let’s call them Heidi and Sven.

Sven was always a sickly child, delicate and underweight. (Heidi did not seem neglectful.) Once Sven started school, Heidi started receiving concerned notes from his teachers. He wasn’t paying attention in class. He wasn’t doing his work. They reported repetitious behavior like walking slowly around the room and tapping all of the books. Conversation didn’t quite work with Sven. He was friendly, but rarely responded when spoken to and often completely ignored people. He moved slowly.

Sven’s teachers suggested autism. Several doctors later, he’d been diagnosed.

Heidi began researching everything she could about autism. Thankfully she didn’t fall down any of the weirder rabbit holes, but when Sven started complaining that his stomach hurt, she decided to try a gluten-free diet.

And it worked. Not only did Sven’s stomach stop hurting, but his school performance improved. He stopped laying his head down on his desk every afternoon. He started doing his work and responding to classmates.

Had a gluten free diet cured his autism?


A gluten free diet cured his celiac disease (aka coeliac disease). Sven’s troublesome behavior was most likely caused by anemia, caused by long-term inflammation, caused by gluten intolerance.

When we are sick, our bodies sequester iron to prevent whatever pathogen is infecting us from using it. This is a sensible response to short-term pathogens that we can easily defeat, but in long-term sicknesses, leads to anemia. Since Sven was sick with undiagnosed celiac disease for years, his intestines were inflamed for years–and his body responded by sequestering iron for years, leaving him continually tired, spacey, and unable to concentrate in school.

The removal of gluten from his diet allowed his intestines to heal and his body to finally start releasing iron.

Whether or not Sven had (or has) autism is a matter of debate. What is autism? It’s generally defined by a list of symptoms/behaviors, not a list of causes. So very different causes could nonetheless trigger similar symptoms in different people.

Saying that Sven’s autism was “cured” by this diet is somewhat misleading, since gluten-free diets clearly won’t work for the majority of people with autism–those folks don’t have celiac disease. But by the same token, Sven was diagnosed with autism and his diet certainly did work for him, just as it might for other people with similar symptoms. We just don’t have the ability right now to easily distinguish between the many potential causes for the symptoms lumped together under “autism,” so parents are left trying to figure out what might work for their kid.

Interestingly, the overlap between “autism” and feeding problems/gastrointestinal disorders is huge. Now, when I say things like this, I often notice that people are confused about the scale of the problems. Nearly every parent swears, at some point, that their child is terribly picky. This is normal pickiness that goes away with time and isn’t a real problem. The problems autistic children face are not normal.

Parent of normal child: “My kid is so picky! She won’t eat peas!”

Parent of autistic child: “My kid only eats peas.”

See the difference?

Let’s cut to Wikipedia, which has a nice summary:

Gastrointestinal problems are one of the most commonly associated medical disorders in people with autism.[80] These are linked to greater social impairment, irritability, behavior and sleep problems, language impairments and mood changes, so the theory that they are an overlap syndrome has been postulated.[80][81] Studies indicate that gastrointestinal inflammation, immunoglobulin E-mediated or cell-mediated food allergies, gluten-related disorders (celiac disease, wheat allergy, non-celiac gluten sensitivity), visceral hypersensitivity, dysautonomia and gastroesophageal reflux are the mechanisms that possibly link both.[81]

A 2016 review concludes that enteric nervous system abnormalities might play a role in several neurological disorders, including autism. Neural connections and the immune system are a pathway that may allow diseases originated in the intestine to spread to the brain.[82] A 2018 review suggests that the frequent association of gastrointestinal disorders and autism is due to abnormalities of the gut–brain axis.[80]

The “leaky gut” hypothesis is popular among parents of children with autism. It is based on the idea that defects in the intestinal barrier produce an excessive increase of the intestinal permeability, allowing substances present in the intestine, including bacteria, environmental toxins and food antigens, to pass into the blood. The data supporting this theory are limited and contradictory, since both increased intestinal permeability and normal permeability have been documented in people with autism. Studies with mice provide some support to this theory and suggest the importance of intestinal flora, demonstrating that the normalization of the intestinal barrier was associated with an improvement in some of the ASD-like behaviours.[82] Studies on subgroups of people with ASD showed the presence of high plasma levels of zonulin, a protein that regulates permeability opening the “pores” of the intestinal wall, as well as intestinal dysbiosis (reduced levels of Bifidobacteria and increased abundance of Akkermansia muciniphila, Escherichia coli, Clostridia and Candida fungi) that promotes the production of proinflammatory cytokines, all of which produces excessive intestinal permeability.[83] This allows passage of bacterial endotoxins from the gut into the bloodstream, stimulating liver cells to secrete tumor necrosis factor alpha (TNFα), which modulates blood–brain barrier permeability. Studies on ASD people showed that TNFα cascades produce proinflammatory cytokines, leading to peripheral inflammation and activation of microglia in the brain, which indicates neuroinflammation.[83] In addition, neuroactive opioid peptides from digested foods have been shown to leak into the bloodstream and permeate the blood–brain barrier, influencing neural cells and causing autistic symptoms.[83] (See Endogenous opiate precursor theory)

Here is an interesting case report of psychosis caused by gluten sensitivity:

 In May 2012, after a febrile episode, she became increasingly irritable and reported daily headache and concentration difficulties. One month after, her symptoms worsened presenting with severe headache, sleep problems, and behavior alterations, with several unmotivated crying spells and apathy. Her school performance deteriorated… The patient was referred to a local neuropsychiatric outpatient clinic, where a conversion somatic disorder was diagnosed and a benzodiazepine treatment (i.e., bromazepam) was started. In June 2012, during the final school examinations, psychiatric symptoms, occurring sporadically in the previous two months, worsened. Indeed, she began to have complex hallucinations. The types of these hallucinations varied and were reported as indistinguishable from reality. The hallucinations involved vivid scenes either with family members (she heard her sister and her boyfriend having bad discussions) or without (she saw people coming off the television to follow and scare her)… She also presented weight loss (about 5% of her weight) and gastrointestinal symptoms such as abdominal distension and severe constipation.

So she’s hospitalized and they do a bunch of tests. Eventually she’s put on steroids, which helps a little.

Her mother recalled that she did not return a “normal girl”. In September 2012, shortly after eating pasta, she presented crying spells, relevant confusion, ataxia, severe anxiety and paranoid delirium. Then she was again referred to the psychiatric unit. A relapse of autoimmune encephalitis was suspected and treatment with endovenous steroid and immunoglobulins was started. During the following months, several hospitalizations were done, for recurrence of psychotic symptoms.

Again, more testing.

In September 2013, she presented with severe abdominal pain, associated with asthenia, slowed speech, depression, distorted and paranoid thinking and suicidal ideation up to a state of pre-coma. The clinical suspicion was moving towards a fluctuating psychotic disorder. Treatment with a second-generation anti-psychotic (i.e., olanzapine) was started, but psychotic symptoms persisted. In November 2013, due to gastro-intestinal symptoms and further weight loss (about 15% of her weight in the last year), a nutritionist was consulted, and a gluten-free diet (GFD) was recommended for symptomatic treatment of the intestinal complaints; unexpectedly, within a week of gluten-free diet, the symptoms (both gastro-intestinal and psychiatric) dramatically improved. Despite her efforts, she occasionally experienced inadvertent gluten exposures, which triggered the recurrence of her psychotic symptoms within about four hours. Symptoms took two to three days to subside again.

Note: she has non-celiac gluten sensitivity.

One month after [beginning the gluten free diet] AGA IgG and calprotectin resulted negative, as well as the EEG, and ferritin levels improved.

Note: those are tests of inflammation and anemia–that means she no longer has inflammation and her iron levels are returning to normal.

She returned to the same neuro-psychiatric specialists that now reported a “normal behavior” and progressively stopped the olanzapine therapy without any problem. Her mother finally recalled that she was returned a “normal girl”. Nine months after definitely starting the GFD, she is still symptoms-free.

This case is absolutely crazy. That poor girl. Here she was in constant pain, had constant constipation, was losing weight (at an age when children should be growing), and the idiot adults thought she had a psychiatric problem.

This is not the only case of gastro-intestinal disorder I have heard of that presented as psychosis.

Speaking of stomach pain, did you know Kurt Cobain suffered frequent stomach pain that was so severe it made him vomit and want to commit suicide, and he started self-medicating with heroin just to stop the pain? And then he died.

Back to autism and gastrointestinal issues other than gluten, here is a fascinating new study on fecal transplants (h/t WrathofGnon):

Many studies have reported abnormal gut microbiota in individuals with Autism Spectrum Disorders (ASD), suggesting a link between gut microbiome and autism-like behaviors. Modifying the gut microbiome is a potential route to improve gastrointestinal (GI) and behavioral symptoms in children with ASD, and fecal microbiota transplant could transform the dysbiotic gut microbiome toward a healthy one by delivering a large number of commensal microbes from a healthy donor. We previously performed an open-label trial of Microbiota Transfer Therapy (MTT) that combined antibiotics, a bowel cleanse, a stomach-acid suppressant, and fecal microbiota transplant, and observed significant improvements in GI symptoms, autism-related symptoms, and gut microbiota. Here, we report on a follow-up with the same 18 participants two years after treatment was completed. Notably, most improvements in GI symptoms were maintained, and autism-related symptoms improved even more after the end of treatment.

Fecal transplant is exactly what it sounds like. The doctors clear out a person’s intestines as best they can, then put in new feces, from a donor, via a tube (up the butt or through the stomach; either direction works).

Unfortunately, it wasn’t a double-blind study, but the authors are hopeful that they can get funding for a double-blind placebo controlled study soon.

I’d like to quote a little more from this study:

Two years after the MTT was completed, we invited the 18 original subjects in our treatment group to participate in a follow-up study … Two years after treatment, most participants reported GI symptoms remaining improved compared to baseline … The improvement was on average 58% reduction in Gastrointestinal Symptom Rating Scale (GSRS) and 26% reduction in % days of abnormal stools… The improvement in GI symptoms was observed for all sub-categories of GSRS (abdominal pain, indigestion, diarrhea, and constipation, Supplementary Fig. S2a) as well as for all sub-categories of DSR (no stool, hard stool, and soft/liquid stool, Supplementary Fig. S2b), although the degree of improvement on indigestion symptom (a sub-category of GSRS) was reduced after 2 years compared with weeks 10 and 18. This achievement is notable, because all 18 participants reported that they had had chronic GI problems (chronic constipation and/or diarrhea) since infancy, without any period of normal GI health.

Note that these children were chosen because they had both autism and lifelong gastrointestinal problems. This treatment may do nothing at all for people who don’t have gastrointestinal problems.

The families generally reported that ASD-related symptoms had slowly, steadily improved since week 18 of the Phase 1 trial… Based on the Childhood Autism Rating Scale (CARS) rated by a professional evaluator, the severity of ASD at the two-year follow-up was 47% lower than baseline (Fig. 1b), compared to 23% lower at the end of week 10. At the beginning of the open-label trial, 83% of participants rated in the severe ASD diagnosis per the CARS (Fig. 2a). At the two-year follow-up, only 17% were rated as severe, 39% were in the mild to moderate range, and 44% of participants were below the ASD diagnostic cut-off scores (Fig. 2a). … The Vineland Adaptive Behavior Scale (VABS) equivalent age continued to improve (Fig. 1f), although not as quickly as during the treatment, resulting in an increase of 2.5 years over 2 years, which is much faster than typical for the ASD population, whose developmental age was only 49% of their physical age at the start of this study.

Important point: their behavior matured faster than it normally does in autistic children.

This is a really interesting study, and I hope the authors can follow it up with a solid double-blind.

Of course, not all autists suffer from gastrointestinal complaints. Many eat and digest without difficulty. But the connection between physical complaints and mental disruption across a variety of conditions is fascinating. How many conditions that we currently believe are psychological might actually be caused by an untreated biological illness?

Does the DSM need to be re-written?

I recently came across an interesting paper that looked at the likelihood that a person, once diagnosed with one mental disorder, would be diagnosed with another. (Exploring Comorbidity Within Mental Disorders Among a Danish National Population, by Oleguer Plana-Ripoll.)

This was a remarkable study in two ways. First, it had a sample size of 5,940,778, followed up for 83.9 million person-years–basically, the entire population of Denmark over 15 years. (Big Data indeed.)

Second, it found that for virtually every disorder, one diagnosis increased your chances of being diagnosed with a second disorder. (“Comorbid” is a fancy word for “two diseases or conditions occurring together,” not “dying at the same time.”) Some diseases were particularly likely to co-occur–in particular, people diagnosed with “mood disorders” had a 30% chance of also being diagnosed with “neurotic disorders” during the 15 years covered by the study.

Mood disorders include bipolar disorder, depression, and SAD;

Neurotic disorders include anxieties, phobias, and OCD.

Those chances were considerably higher for people diagnosed at younger ages, and decreased significantly for the elderly–those diagnosed with mood disorders before the age of 20 had more than a 40% chance of also being diagnosed with a neurotic disorder, while those diagnosed after 80 had only a 5% chance.

I don’t find this terribly surprising, since I know someone with at least five different psychological diagnoses (nor is it surprising that many people with “intellectual disabilities” also have “developmental disorders”), but it’s interesting just how pervasive comorbidity is across conditions that are ostensibly separate diseases.

This suggests to me that either many people are being mis-diagnosed (perhaps diagnosis itself is very difficult), or what look like separate disorders are often actually one, single disorder. While it is certainly possible, of course, for someone to have both a phobia of snakes and seasonal affective disorder, the person I know with five diagnoses most likely has only one “true” disorder that has just been diagnosed and treated differently by different clinicians. It seems likely that some people’s depression also manifests itself as deep-rooted anxiety or phobias, for example.

While this is a bit of a blow for many psychiatric diagnoses (and I am quite certain that many diagnostic categories will need a fair amount of revision before all is said and done), autism recently got a validity boost–How brain scans can diagnose Autism with 97% accuracy.

The title is overselling it, but it’s interesting anyway:

Lead study author Marcel Just, PhD, professor of psychology and director of the Center for Cognitive Brain Imaging at Carnegie Mellon University, and his team performed fMRI scans on 17 young adults with high-functioning autism and 17 people without autism while they thought about a range of different social interactions, like “hug,” “humiliate,” “kick” and “adore.” The researchers used machine-learning techniques to measure the activation in 135 tiny pieces of the brain, each the size of a peppercorn, and analyzed how the activation levels formed a pattern. …

So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.

N=34 is not quite as impressive as N=Denmark, but it’s a good start.

Reminder: Hunter-Gatherers were not Peace Loving Pacifists

From Balancing Selection at the Prion Protein Gene Consistent with Prehistoric Kurulike Epidemics:

Kuru is an acquired prion disease largely restricted to the Fore linguistic group of the Papua New Guinea Highlands, which was transmitted during endocannibalistic feasts. Heterozygosity for a common polymorphism in the human prion protein gene (PRNP) confers relative resistance to prion diseases. Elderly survivors of the kuru epidemic, who had multiple exposures at mortuary feasts, are, in marked contrast to younger unexposed Fore, predominantly PRNP 129 heterozygotes. Kuru imposed strong balancing selection on the Fore, essentially eliminating PRNP 129 homozygotes. Worldwide PRNP haplotype diversity and coding allele frequencies suggest that strong balancing selection at this locus occurred during the evolution of modern humans.

Our ancestors–the ancestors of all humans–ate each other so often that they actually evolved resistance to prion diseases.

(H/T Littlefoot)

Of course, they weren’t necessarily hunting each other for the calories (humans are not a very good source of calories compared to other common food sources.) They might have just had a habit of eating the dead from their own communities–which is still pretty gruesome.

Of course, cannibalism didn’t stop when people adopted agriculture. The Aztecs were cannibals–“Indigenous Culture Day” celebrates genocidal cannibals who were even worse than Columbus. The Anasazi were cannibals. The word “cannibal” itself comes from the language of the Carib Indians. And of course, there are still-living folks in many other parts of the world who have cannibalized others.

But the idea that ancient humans were some kind of angels is absurd.


I have some hopefully good, deep stuff I am working on, but in the meanwhile, here is a quick, VERY SPECULATIVE thread on my theory for why refined sugars are probably bad for you:

First, refined sugars are evolutionarily novel. Unless you’re a Hadza, your ancient ancestors never had this much sugar.

Pick up a piece of raw sugar cane and gnaw on it. Raw sugar cane has such a high fiber-to-sugar ratio that you can use it as a toothbrush after chewing it for a bit.

According to the internet, a stick of raw sugar cane has 10 grams of sugar in it. A can of Coke has 39. Even milk–whole or skim–contains about 12 grams of natural milk sugar (lactose) per glass. Your body has no problem handling the normal amounts of unrefined sugars in regular foods, but to get the amount of sugar found in a single soda, you’d have to eat almost four whole stalks of sugarcane, which you certainly aren’t going to do in a few minutes.
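(For what it’s worth, the “almost four stalks” figure is just the ratio of the numbers above. Here’s a quick back-of-the-envelope sketch, using the rough per-serving gram counts cited here–internet figures, not lab measurements:)

```python
# Rough per-serving sugar figures cited in the post (internet numbers, not USDA data).
SUGAR_PER_CANE_STICK_G = 10   # grams of sugar in one stick of raw sugar cane
SUGAR_PER_COKE_CAN_G = 39     # grams of sugar in one can of Coke
SUGAR_PER_MILK_GLASS_G = 12   # grams of lactose in one glass of milk

# How much sugar cane would you have to chew through to match one soda?
sticks_per_coke = SUGAR_PER_COKE_CAN_G / SUGAR_PER_CANE_STICK_G
print(f"One can of Coke = {sticks_per_coke:.1f} sticks of raw sugar cane")
# → One can of Coke = 3.9 sticks of raw sugar cane
```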

It’s when we extract all of the sugar and throw away the rest of the fiber, fat, and protein in the food that we run into trouble.

(The same is probably also true of fat, though I am rather fond of butter.)

In my opinion, all forms of heavily refined sugar are suspect, including fruit juice, which is essentially refined fructose. People think that fruit juice is “healthy” because it comes from fruit, which is a plant and therefore “natural” and “good for you,” unlike, say, sugar, which comes from sugar cane, which is… also a plant. Or HFCS, which is totally unnatural because it comes from… corn. Which is a plant.

“They actually did studies on the sugar plantations back in the early 1900s. All of the workers were healthy and lived longer than the sugar executives who got the refined, processed product.”

I don’t know if I agree with everything he has to say, but refined fructose is no more natural than any other refined sugar. Again, the amount of sugar you get from eating an apple is very different from the amount you get from a cup of apple juice.

Now people are talking about reducing childhood obesity by eliminating the scourge of 100% fruit juice:

Excessive fruit juice consumption is associated with increased risk for obesity… sucrose consumption without the corresponding fiber, as is commonly present in fruit juice, is associated with the metabolic syndrome, liver injury, and obesity.

Regular fruit is probably good for you. Refined is not.

Here’s another study on the problems with fructose:

If calcium levels in the blood are low, our bodies produce more parathyroid hormone, stimulating the absorption of calcium by the kidneys, as well as the production of vitamin D (calcitriol), also in the kidneys. Calcitriol stimulates the absorption of calcium in the intestine, decreases the production of PTH and stimulates the release of calcium from the bone. …

… Ferraris fed rats diets with high levels of glucose, fructose or starch. He and his team studied three groups of lactating rats and three groups of non-pregnant rats (the control group).

“Since the amounts of calcium channels and of binding proteins depend on the levels of the hormone calcitriol, we confirmed that calcitriol levels were much greater in lactating rats,” said Ferraris.  … “However, when the rat mothers were consuming fructose, there were no increases in calcitriol levels,” Ferraris added. “The levels remained the same as those in non-pregnant rats, and as a consequence, there were no increases in intestinal and renal calcium transport.”

Your body then has two options: trigger food cravings until you eat enough to balance the nutrients, or strip calcium from your bones. The latter is what triggers tooth decay.

Sugar not only feeds the bacteria on your teeth (I think), it also weakens your teeth to pay the piper for sugar digestion. (Also, there may be something about sugar-fed bacteria lowering the pH in your mouth.)

The second thing that happens is that your taste buds acclimate to excessive sugar. Soon “sweet” tastes “normal.”

Now when you try to stop eating sugar, normal food tastes “boring,” “sour,” “bitter,” etc.
This is where you just have to bite the bullet and cut sugar anyway. If you keep eating normal food, eventually it will start tasting good again.

It just takes time for your brain to change its assumptions about what food tastes like.
But if you keep sweetening your food with “artificial” sweeteners, then you never give yourself a chance to recalibrate what food should taste like. You will keep craving sugar.
And it is really hard to stop eating sugar and let your body return to normal when you crave sugar.

If artificial sweeteners help you reduce your sugar consumption and eventually quit altogether, then they’re probably a good idea, but don’t fall into the trap of thinking you’re going to get just as much cake and ice cream as always, only without the consequences. No. Nature doesn’t work like that. Nature has consequences.

So I feel like I’ve been picking on fructose a lot in this post. I didn’t mean to. I am suspicious of all refined sugars; these are just the sources I happened across while researching today.

I am not sure about honey. I don’t eat a lot of honey, but maybe it’s okay. The Hadza of Tanzania eat a great deal of honey and they seem fine, but maybe they’re adapted to their diet in ways that we aren’t.

So what happens when you eat too much sugar? Aside from, obviously, food cravings, weight gain, mineral depletion, and tooth decay…

So here’s a theory:

Our bodies naturally cycle between winter and summer states. At least they do if you hail from a place that historically had winter; I can’t speak for people in radically different climates.

In the summer, plant matter (carbohydrates, fiber) is widely available, and any animal that can takes as much advantage of it as possible. As omnivores, we gorge on berries, leaves, fruits, tubers–really, whatever we can. When we are satiated–when we have enough fat stores to last for the winter–our bodies start shutting down insulin production. That’s enough. We don’t need it anymore.

In the winter, there’s very little plant food naturally available, unless you’re a farmer (farming is relatively recent in areas with long winters.)

In the winter, you hunt animals for meat and fat. This is what the Inuit and Eskimo did almost all year round.

The digestion of meat and fat does not require insulin, but runs on the ketogenic pathways, which–long story short–also turn food into energy and keep people alive.

The real beauty of ketosis is that, apparently, it ramps up your energy production–that is, you feel physically warmer when running entirely off of meat and fat than when running off carbs. Given that ketosis is the winter digestive cycle, this is amazingly appropriate.

By spring, chances are you’ve lost a lot of the weight from last summer. Winters are harsh. With the fat gone, the body starts producing insulin again.

At this point, you go from hyperglycemia (too much sugar in your bloodstream whenever you eat anything sweet, because there’s no insulin) to hypoglycemia–your body produces a lot of insulin to transform any plants you eat into energy FAST. (Remember the discussion above about how your body transforms fructose into fat? Back in our ancestral environment, that was a feature, not a bug!)

This lets you put on pounds quickly in the spring and summer, using now-available plants as your primary food source.

The difficulty with our society is we’ve figured out how to take the energy part out of the plants, refine it, and store up huge quantities of it so we can eat it any time we want, which is all the time.

Evolution makes us want to eat, obviously. Ancestors who didn’t have a good “eat now” drive didn’t eat whatever good food was available and didn’t become ancestors.

But now we’ve hacked that, and as a result we never go into the sugar-free periods we were built to occasionally endure.

I don’t think you need to go full keto or anti-bread or something to make up for this. Just cutting down on refined sugars (and most refined oils, btw) is probably enough for most people.

Note: Humans have been eating grains since long before plants were domesticated–there’s a reason we thought it was a good idea to domesticate grains in the first place, and it wasn’t because they were a random, un-eaten weed. If your ancestors ate bread, then there’s a good chance that you can digest bread just fine.

But if bread causes you issues, then by all means, avoid it. Different people thrive on different foods.

Please remember that this thread is speculative.




Tapeworm-cancer-AIDS is a real thing

Tapeworm Spreads Deadly Cancer to Human:

A Colombian man’s lung tumors turned out to have an extremely unusual cause: The rapidly growing masses weren’t actually made of human cells, but were from a tapeworm living inside him, according to a report of the case.

This is the first known report of a person becoming sick from cancer cells that developed in a parasite, the researchers said.

“We were amazed when we found this new type of disease—tapeworms growing inside a person, essentially getting cancer, that spreads to the person, causing tumors,” said study researcher Dr. Atis Muehlenbachs, a staff pathologist at the Centers for Disease Control and Prevention’s Infectious Diseases Pathology Branch (IDPB).

The man had HIV, which weakens the immune system and likely played a role in allowing the development of the parasite cancer, the researchers said.

There’s not a lot I can add to this.

But there are probably more cases like this, if only because gay men seem to contract a lot of parasites:

Fast forward to the spring of 2017. PreP had recently ushered in the second sexual revolution and everyone was now fucking each other like it was 1979. My wonderful boyfriend and I enjoyed a healthy sex life inside and outside our open relationship. Then he started experiencing stomach problems: diarrhea, bloating, stomach aches, nausea. All too familiar with those symptoms, I recommended he go to the doctor and ask for a stool test. …

His results came back positive for giardia. …

Well, just a few months later, summer of 2017, my boyfriend started experiencing another bout of diarrhea and stomach cramps. … This time the results came back positive for entamoeba histolytica. What the fuck is entamoeba histolytica?! I knew giardia. Giardia and I were on a first name basis. But entamoeba, what now?

Entamoeba histolytica, as it turns out, is another parasite common in developing countries spread through contaminated drinking water, poor hygiene when handling food, and…rimming. The PA treating him wasn’t familiar with entamoeba histolytica or how to treat it, so she had to research (Google?) how to handle the infection. The medical literature (Google search results?) led us back to metronidazole, the same antibiotic used to treat giardia.

When your urge to lick butts is so strong that this keeps happening, you’ve got to consider an underlying condition like toxoplasmosis or kamikaze horsehair worm.

Greatest Hits: Can Ice Packs Help Stop a Seizure in Humans?



Over the years, a few posts have proven to be surprise hits–Can Ice packs help stop a seizure (in humans)?, Turkey: Not very Turkic, Why do Native Americans Have so much Neanderthal DNA?, and Do Black Babies have Blue Eyes?

It’s been a while since these posts aired, so I thought it was time to revisit the material and see if anything new has turned up.

First, Ice packs and Epilepsy

Ice packs (cold packs) applied to the lower back at the first sign of a seizure may be able to halt or significantly decrease the severity of a seizure in humans.

I consider this one of the most important posts I’ve written, because it is the only one that offers useful, real-life advice: if someone is having a seizure, grab an ice pack or two and press them against the person’s back/neck. There is very little you can do for someone who is already having a seizure besides making sure they don’t accidentally hurt themselves, but using ice packs may help decrease the duration and severity of the seizure.

I have received some very positive responses to the post, including this one, by Tom Coventry:

We have been using an ice pack on our 13 yr old Son’s neck to stop seizures for nearly a year now and it works without fail to bring the seizures to an end within seconds of applying the ice. This is an old technique used before medications were invented, you can read about it at The Meridian Foundation papers on Edgar Case and Abdominal epilepsy.

Here is a relevant quote from Cayce’s paper on abdominal epilepsy:

… Also note that the reflex from the abdomen was mediated through the medulla oblongata, a important nerve center at the upper portion of the spinal cord where it enters the skull.  This is significant because Cayce sometimes recommended that a piece of ice be placed at this area during the aura or at the beginning of the seizure.  This simple technique has proven effective in several contemporary cases where Cayce’s therapeutic model has been utilized. Incidentally, this technique for preventing seizures was also used by osteopathic physicians during the early decades of this century and is included in the therapeutic model developed by the Meridian Institute. …

If the subject is currently experiencing seizures and can sense the beginning of the episode, they are encouraged to use a piece of ice at the base of the brain for one to two minutes.

I encountered the ice packs trick on forums where people were talking about treating seizures in dogs. (Yes, there are dogs with epilepsy.) There are many accounts of people successfully stopping or preventing their dogs from going into a seizure by grabbing a cold pack at the first warning signs and putting it directly onto the dog’s lower back:

We have been using ice packs to help manage our girl’s seizures for over a year now. From what I have heard first hand from others is that it either doesn’t work at all or it works fabulously. With our girl it “works fabulously”. It is not the miracle cure and it does not prevent future seizures but it definitely stops her grand mal right in its tracks. It is the most amazing thing I have ever seen. … If we get the ice pack on her within the first 15 seconds or so, the grand mal just suddenly stops. Like a light switch. All motor movement comes to a halt. She continues to be incoherent for a bit but all movements stop.

Oddly, though, I haven’t found much discussion of the use of ice packs on humans. But if it works on dogs, why wouldn’t it work on people? On the grand evolutionary scale, our nervous systems are pretty similar–we’re both mammals with neocortexes, after all.

From The Hidden Genetics of Epilepsy

My epileptic friend has also reported continued good success with the technique; her husband says he can feel an immediate change in the pattern of the seizure.

My original post outlines some of the scientific evidence in favor of the technique; I’ll just quote one bit:

The Journal of American Holistic Veterinary Medical Association published an article on the use of ice packs to stop seizures in dogs, A Simple, Effective Technique for Arresting Canine Epileptic Seizures, back in 2004. You can read it for a mere $95, or check out the highlights on Dawg Business’s blog:

Fifty-one epileptic canine patients were successfully treated during an epileptic seizure with a technique involving the application of ice on the back (T10 to L4). This technique was found to be effective in aborting or shortening the duration of the seizure.

I suspect the “ice trick” was fairly well-known back before there were medications for preventing seizures, but modern doctors are only taught about the medications. And ice packs, to be clear, can’t cure epilepsy. But they can help people who are in the midst of a seizure.

Any doctors out there, please do some research on this. I think a lot of people could benefit.

A mercifully short note on Lice and the Invention of Clothes

Lice apparently come in three varieties: head, body, and pubic. The body louse’s genome was published in 2010 and is the smallest known insect genome. (Does parasitism reduce genome size?) According to Wikipedia:

Pediculus humanus humanus (the body louse) is indistinguishable in appearance from Pediculus humanus capitis (the head louse) but will interbreed only under laboratory conditions. In their natural state, they occupy different habitats and do not usually meet. In particular, body lice have evolved to attach their eggs to clothes, whereas head lice attach their eggs to the base of hairs.

So when did the clothes-infesting body louse decide to stop associating with its hair-clinging cousins?

The body louse diverged from the head louse at around 100,000 years ago, hinting at the time of the origin of clothing.[7][8][9]

So, did Neanderthals have clothes? Or did they survive winters in ice age Europe by being really hairy?

Behavioral modernity–such as intentional burials and cave painting–is thought to have emerged around 50,000 years ago. Some people push this date back to 80,000 years ago, possibly just before the Out of Africa event (something that made people smarter and better at making tools may have been necessary for OOA to succeed.)

But perhaps we should consider the invention of clothing alongside other technological breakthroughs that made us modern–after all, I don’t think we hairless apes could have had much success at conquering the planet without clothes.

(On the other hand, other Wikipedia pages give other estimates for the origin of clothing, some even also citing louse studies, so I’m not sure of the 100k YA date, but surely clothes were invented before we went anywhere cold.)

Oddly, though, there appears to have been at least one human group that managed to survive in a cold climate without much in the way of clothes: the Yaghan people of Tierra del Fuego. In fact, the whole reason the region got named Tierra del Fuego (translation: Land of Fire) is that the nearly-naked locals carried fire with them wherever they went to stay warm.

Only 100-1,600 Yaghans remain; their language is an isolate with only one native speaker, and she’s 89 years old.

Unfortunately, searching for “people with no clothes” does not return any useful information about other groups that might have led similar lifestyles.

PS: Pubic lice evolved from gorilla lice about 3 million years ago. I bet you didn’t want to know that. Someone should look for that introgression event.

Native Americans also appear to carry a strain of head lice that previously occupied Homo erectus’s hair, suggesting that H. erectus and the ancestors of today’s Native Americans once met. Since these lice aren’t found anywhere else, this is evidence that H. erectus might have survived somewhere out there until fairly recently.