Can Autism be Cured via a Gluten-Free Diet?

I’d like to share a story from a friend and her son–let’s call them Heidi and Sven.

Sven was always a sickly child, delicate and underweight. (Heidi did not seem neglectful.) Once Sven started school, Heidi started receiving concerned notes from his teachers. He wasn’t paying attention in class. He wasn’t doing his work. They reported repetitious behavior like walking slowly around the room and tapping all of the books. Conversation didn’t quite work with Sven. He was friendly, but rarely responded when spoken to and often completely ignored people. He moved slowly.

Sven’s teachers suggested autism. Several doctors later, he’d been diagnosed.

Heidi began researching everything she could about autism. Thankfully she didn't fall down any of the weirder rabbit holes, but when Sven started complaining that his stomach hurt, she decided to try a gluten-free diet.

And it worked. Not only did Sven’s stomach stop hurting, but his school performance improved. He stopped laying his head down on his desk every afternoon. He started doing his work and responding to classmates.

Had a gluten free diet cured his autism?

Wait.

A gluten free diet cured his celiac disease (aka coeliac disease). Sven’s troublesome behavior was most likely caused by anemia, caused by long-term inflammation, caused by gluten intolerance.

When we are sick, our bodies sequester iron to prevent whatever pathogen is infecting us from using it. This is a sensible response to short-term pathogens that we can easily defeat, but in long-term sicknesses, it leads to anemia. Since Sven was sick with undiagnosed celiac disease for years, his intestines were inflamed for years–and his body responded by sequestering iron for years, leaving him continually tired, spacey, and unable to concentrate in school.

The removal of gluten from his diet allowed his intestines to heal and his body to finally start releasing iron.

Whether or not Sven had (or has) autism is a matter of debate. What is autism? It’s generally defined by a list of symptoms/behaviors, not a list of causes. So very different causes could nonetheless trigger similar symptoms in different people.

Saying that Sven’s autism was “cured” by this diet is somewhat misleading, since gluten-free diets clearly won’t work for the majority of people with autism–those folks don’t have celiac disease. But by the same token, Sven was diagnosed with autism and his diet certainly did work for him, just as it might for other people with similar symptoms. We just don’t have the ability right now to easily distinguish between the many potential causes for the symptoms lumped together under “autism,” so parents are left trying to figure out what might work for their kid.

Interestingly, the overlap between "autism" and feeding problems/gastrointestinal disorders is huge. Now, when I say things like this, I often notice that people are confused about the scale of problems. Nearly every parent swears, at some point, that their child is terribly picky. This is normal pickiness that goes away with time and isn't a real problem. The problems autistic children face are not normal.

Parent of normal child: “My kid is so picky! She won’t eat peas!”

Parent of autistic child: “My kid only eats peas.”

See the difference?

Let’s cut to Wikipedia, which has a nice summary:

Gastrointestinal problems are one of the most commonly associated medical disorders in people with autism.[80] These are linked to greater social impairment, irritability, behavior and sleep problems, language impairments and mood changes, so the theory that they are an overlap syndrome has been postulated.[80][81] Studies indicate that gastrointestinal inflammation, immunoglobulin E-mediated or cell-mediated food allergies, gluten-related disorders (celiac disease, wheat allergy, non-celiac gluten sensitivity), visceral hypersensitivity, dysautonomia and gastroesophageal reflux are the mechanisms that possibly link both.[81]

A 2016 review concludes that enteric nervous system abnormalities might play a role in several neurological disorders, including autism. Neural connections and the immune system are a pathway that may allow diseases originated in the intestine to spread to the brain.[82] A 2018 review suggests that the frequent association of gastrointestinal disorders and autism is due to abnormalities of the gut–brain axis.[80]

The "leaky gut" hypothesis is popular among parents of children with autism. It is based on the idea that defects in the intestinal barrier produce an excessive increase of the intestinal permeability, allowing substances present in the intestine, including bacteria, environmental toxins and food antigens, to pass into the blood. The data supporting this theory are limited and contradictory, since both increased intestinal permeability and normal permeability have been documented in people with autism. Studies with mice provide some support to this theory and suggest the importance of intestinal flora, demonstrating that the normalization of the intestinal barrier was associated with an improvement in some of the ASD-like behaviours.[82] Studies on subgroups of people with ASD showed the presence of high plasma levels of zonulin, a protein that regulates permeability opening the "pores" of the intestinal wall, as well as intestinal dysbiosis (reduced levels of Bifidobacteria and increased abundance of Akkermansia muciniphila, Escherichia coli, Clostridia and Candida fungi) that promotes the production of proinflammatory cytokines, all of which produces excessive intestinal permeability.[83] This allows passage of bacterial endotoxins from the gut into the bloodstream, stimulating liver cells to secrete tumor necrosis factor alpha (TNFα), which modulates blood–brain barrier permeability. Studies on ASD people showed that TNFα cascades produce proinflammatory cytokines, leading to peripheral inflammation and activation of microglia in the brain, which indicates neuroinflammation.[83] In addition, neuroactive opioid peptides from digested foods have been shown to leak into the bloodstream and permeate the blood–brain barrier, influencing neural cells and causing autistic symptoms.[83] (See Endogenous opiate precursor theory)

Here is an interesting case report of psychosis caused by gluten sensitivity:

 In May 2012, after a febrile episode, she became increasingly irritable and reported daily headache and concentration difficulties. One month after, her symptoms worsened presenting with severe headache, sleep problems, and behavior alterations, with several unmotivated crying spells and apathy. Her school performance deteriorated… The patient was referred to a local neuropsychiatric outpatient clinic, where a conversion somatic disorder was diagnosed and a benzodiazepine treatment (i.e., bromazepam) was started. In June 2012, during the final school examinations, psychiatric symptoms, occurring sporadically in the previous two months, worsened. Indeed, she began to have complex hallucinations. The types of these hallucinations varied and were reported as indistinguishable from reality. The hallucinations involved vivid scenes either with family members (she heard her sister and her boyfriend having bad discussions) or without (she saw people coming off the television to follow and scare her)… She also presented weight loss (about 5% of her weight) and gastrointestinal symptoms such as abdominal distension and severe constipation.

So she’s hospitalized and they do a bunch of tests. Eventually she’s put on steroids, which helps a little.

Her mother recalled that she did not return a “normal girl”. In September 2012, shortly after eating pasta, she presented crying spells, relevant confusion, ataxia, severe anxiety and paranoid delirium. Then she was again referred to the psychiatric unit. A relapse of autoimmune encephalitis was suspected and treatment with endovenous steroid and immunoglobulins was started. During the following months, several hospitalizations were done, for recurrence of psychotic symptoms.

Again, more testing.

In September 2013, she presented with severe abdominal pain, associated with asthenia, slowed speech, depression, distorted and paranoid thinking and suicidal ideation up to a state of pre-coma. The clinical suspicion was moving towards a fluctuating psychotic disorder. Treatment with a second-generation anti-psychotic (i.e., olanzapine) was started, but psychotic symptoms persisted. In November 2013, due to gastro-intestinal symptoms and further weight loss (about 15% of her weight in the last year), a nutritionist was consulted, and a gluten-free diet (GFD) was recommended for symptomatic treatment of the intestinal complaints; unexpectedly, within a week of gluten-free diet, the symptoms (both gastro-intestinal and psychiatric) dramatically improved. Despite her efforts, she occasionally experienced inadvertent gluten exposures, which triggered the recurrence of her psychotic symptoms within about four hours. Symptoms took two to three days to subside again.

Note: she has non-celiac gluten sensitivity.

One month after [beginning the gluten free diet] AGA IgG and calprotectin resulted negative, as well as the EEG, and ferritin levels improved.

Note: those are tests of inflammation and anemia–that means she no longer has inflammation and her iron levels are returning to normal.

She returned to the same neuro-psychiatric specialists that now reported a “normal behavior” and progressively stopped the olanzapine therapy without any problem. Her mother finally recalled that she was returned a “normal girl”. Nine months after definitely starting the GFD, she is still symptoms-free.

This case is absolutely crazy. That poor girl. Here she was in constant pain, had constant constipation, was losing weight (at an age when children should be growing,) and the idiot adults thought she had a psychiatric problem.

This is not the only case of gastro-intestinal disorder I have heard of that presented as psychosis.

Speaking of stomach pain, did you know Kurt Cobain suffered frequent stomach pain that was so severe it made him vomit and want to commit suicide, and he started self-medicating with heroin just to stop the pain? And then he died.

Back to autism and gastrointestinal issues other than gluten, here is a fascinating new study on fecal transplants (h/t WrathofGnon):

Many studies have reported abnormal gut microbiota in individuals with Autism Spectrum Disorders (ASD), suggesting a link between gut microbiome and autism-like behaviors. Modifying the gut microbiome is a potential route to improve gastrointestinal (GI) and behavioral symptoms in children with ASD, and fecal microbiota transplant could transform the dysbiotic gut microbiome toward a healthy one by delivering a large number of commensal microbes from a healthy donor. We previously performed an open-label trial of Microbiota Transfer Therapy (MTT) that combined antibiotics, a bowel cleanse, a stomach-acid suppressant, and fecal microbiota transplant, and observed significant improvements in GI symptoms, autism-related symptoms, and gut microbiota. Here, we report on a follow-up with the same 18 participants two years after treatment was completed. Notably, most improvements in GI symptoms were maintained, and autism-related symptoms improved even more after the end of treatment.

Fecal transplant is exactly what it sounds like. The doctors clear out a person’s intestines as best they can, then put in new feces, from a donor, via a tube (up the butt or through the stomach; either direction works.)

Unfortunately, it wasn’t a double-blind study, but the authors are hopeful that they can get funding for a double-blind placebo controlled study soon.

I’d like to quote a little more from this study:

Two years after the MTT was completed, we invited the 18 original subjects in our treatment group to participate in a follow-up study … Two years after treatment, most participants reported GI symptoms remaining improved compared to baseline … The improvement was on average 58% reduction in Gastrointestinal Symptom Rating Scale (GSRS) and 26% reduction in % days of abnormal stools… The improvement in GI symptoms was observed for all sub-categories of GSRS (abdominal pain, indigestion, diarrhea, and constipation, Supplementary Fig. S2a) as well as for all sub-categories of DSR (no stool, hard stool, and soft/liquid stool, Supplementary Fig. S2b), although the degree of improvement on indigestion symptom (a sub-category of GSRS) was reduced after 2 years compared with weeks 10 and 18. This achievement is notable, because all 18 participants reported that they had had chronic GI problems (chronic constipation and/or diarrhea) since infancy, without any period of normal GI health.

Note that these children were chosen because they had both autism and lifelong gastrointestinal problems. This treatment may do nothing at all for people who don’t have gastrointestinal problems.

The families generally reported that ASD-related symptoms had slowly, steadily improved since week 18 of the Phase 1 trial… Based on the Childhood Autism Rating Scale (CARS) rated by a professional evaluator, the severity of ASD at the two-year follow-up was 47% lower than baseline (Fig. 1b), compared to 23% lower at the end of week 10. At the beginning of the open-label trial, 83% of participants rated in the severe ASD diagnosis per the CARS (Fig. 2a). At the two-year follow-up, only 17% were rated as severe, 39% were in the mild to moderate range, and 44% of participants were below the ASD diagnostic cut-off scores (Fig. 2a). … The Vineland Adaptive Behavior Scale (VABS) equivalent age continued to improve (Fig. 1f), although not as quickly as during the treatment, resulting in an increase of 2.5 years over 2 years, which is much faster than typical for the ASD population, whose developmental age was only 49% of their physical age at the start of this study.

Important point: their behavior matured faster than it normally does in autistic children.

This is a really interesting study, and I hope the authors can follow it up with a solid double-blind.

Of course, not all autists suffer from gastrointestinal complaints. Many eat and digest without difficulty. But the connection between physical complaints and mental disruption across a variety of conditions is fascinating. How many conditions that we currently believe are psychological might actually be caused by an untreated biological illness?


Everything I’ve Read about Food, Summed up in One Graph:

A few years ago I went through a nutrition kick and read about a dozen books about food. Today I came across a graph that perfectly represents what I learned:

Basically, everything will kill you.

There are three major schools of thought on what’s wrong with modern diets: 1. fats, 2. carbs (sugars,) or 3. proteins.

Unfortunately, all food is composed of fats+carbs+proteins.

Ultimately, the best advice I came across was just to stop stressing out. We don’t really know the best foods to eat, and a lot of official health advice that people have tried to follow actually turned out to be quite bad, but we have a decent intuition that you shouldn’t eat cupcakes for lunch.

Dieting doesn’t really do much for the vast majority of people, but it’s a huge industry that sucks up a ton of time and money. How much you weigh has a lot more to do with factors outside of your control, like genetics or whether there’s a famine going on in your area right now.

You’re probably not going to do yourself any favors stressing out about food or eating a bunch of things you don’t like.

Remember the 20/80 rule: 80% of the effect comes from 20% of the effort, and vice versa. Eating reasonable quantities of good food and avoiding junk will do far more good than substituting chicken breast for chicken thighs in everything you cook.

There is definitely an ethnic component to diet–eg, people whose ancestors historically ate grain are better adapted to it than people whose ancestors didn't. So if you're eating a whole bunch of stuff your ancestors didn't and you don't feel so good, that may be the problem.

Personally, I am wary of refined sugars in my foods, but I am very sensitive to sugars. (I don’t even drink juice.) But this may just be me. Pay attention to your body and how you feel after eating different kinds of food, and eat what makes you feel good.

Why is our Society so Obsessed with Salads?

It’s been a rough day. So I’m going to complain about something totally mundane: salads.

I was recently privy to a conversation between two older women on why it is so hard to stay thin in the South: lack of good salads. Apparently when you go to a southern restaurant, they serve a big piece of meat (often deep-fried steak), a lump of mashed potatoes and gravy, and a finger-bowl with 5 pieces of iceberg lettuce, an orange tomato, and a slathering of dressing.

Sounds good to me.

Now, if you like salads, that’s fine. You’re still welcome here. Personally, I just don’t see the point. The darn things don’t have any calories!

From an evolutionary perspective, obviously food provides two things: calories and nutrients. There may be some foods that are mostly calories but few nutrients (eg, honey) and some foods that are all nutrients but no calories (salt isn't exactly a food, but it otherwise fits the bill.)

Food doesn’t seem like it should be that complicated–surely we’ve evolved to eat effectively by now. So any difficulties we have (besides just getting the food) are likely us over-thinking the matter. There’s no problem getting people to eat high-calorie foods, because they taste good. It’s also not hard to get people to eat salt–it also tastes good.

But people seem to have this ambivalent relationship with salads. What's so important about eating a bunch of leaves with no calories and a vaguely unpleasant flavor? Can't I just eat a nice potato? Or some corn? Or asparagus?

Don’t get me wrong. I don’t hate vegetables. Just everything that goes in a salad. Heck, I’ll even eat most salad fixins if they’re cooked. I won’t turn down fried green tomatoes, you know.

While there's nothing wrong with enjoying a bowl of lettuce if that's your thing, I think our society has gone down a fundamentally wrong collective path when it comes to nutrition wisdom. The idea here is that your hunger drive is this insatiable beast that will force you to consume as much food as possible, making you overweight and giving you a heart attack, and so the only way to save yourself is to trick the beast by filling your stomach with fluffy, zero-calorie plants until there isn't any more room.

This seems to me like the direct opposite of what you should be doing. See, I assume your body isn’t an idiot, and can figure out whether you’ve just eaten something full of calories, and so should go sleep for a bit, or if you just ate some leaves and should keep looking for food.

I recently tried increasing the amount of butter I eat each day, and the result was I felt extremely full and didn't want to eat dinner. Butter is a great way to almost arbitrarily increase the amount of calories per volume of food.

If you’re wondering about my weight, well, let’s just say that despite the butter, never going on a diet, and abhorring salads, I’m still not overweight–but this is largely genetic. (I should note though that I don’t eat many sweets at all.)

Obviously I am not a nutritionist, a dietician, nor a doctor. I'm not a good source for health advice. But it seems to me that increasing or decreasing the number of sweets you eat per day probably has a bigger impact on your overall weight than adding or subtracting a salad.

But maybe I’m missing something.

Why do women love cupcakes?

Seriously.

One of my kids enjoys watching YouTube cooking videos, and they’re nearly 100% women making cakes.

Women’s magazines focus exclusively on 4 topics: men, fashion, diets, and cupcakes. You might think that diets and cupcakes are incompatible, but women’s magazines believe otherwise:

[Images: women's magazine covers, one featuring what appears to be a watermelon.]

Just in case it's not clear, that is not a watermelon. It is cake, cleverly disguised as a watermelon.

(YouTube has videos that show you how to make much better cake watermelons–for starters, you want red velvet cake for the middle, not just frosting…)

Magazines specifically aimed at "people who want to make cakes" are also overwhelmingly feminine. Whether we're talking wedding cakes or chocolate cravings, apple pastries or donuts, sweets and women just seem to go together.

If men's magazines ever feature food, I bet it's steak and BBQ. (*Image searches*)

[Images: men's magazine covers featuring steak and BBQ.]

Yup.

The meat-related articles do appear to be a little more gender-neutral than the cupcake-related articles–probably because men don’t tend to decorate their steaks with tiny baseball bats cut out of steak the way women like to decorate their cakes with tiny flowers made out of frosting.

It’s almost as if women have some kind of overwhelming craving for fats and sugars that men don’t really share.

I was talking with a friend recently about their workplace, where, “All of the women are on diets, but none of them can stay on their diets because they are all constantly eating at their workstations.” Further inquiries revealed that yes, they are eating sweets and pastries, not cashews and carrots, and that there is some kind of “office culture” of all of the women eating pastries together.

The irony here is pretty obvious.

Even many (most?) specialty “diet” foods are designed to still taste sweet. “Fat-free” yogurt is marketed as a health food even though it has as much sugar in it as a bowl of ice cream. Women are so attracted to the taste of sweet sodas, they drink disgusting Diet Coke. Dieting websites advise us that cake topped with fruit is “healthy.”

When men diet, they think "eat nothing but protein until ketosis kicks in" sounds like a great idea. When women diet, they want fat-free ice cream.

I don’t think it is just “women lack willpower.” (Or at least, not willpower in the sense of something people have much control over.) Rather, I think that men and women actually have substantially different food cravings.

So do children, for that matter.

Throughout most of human history, from hunter-gatherers to agriculturalists, the vast majority of women have specialized in obtaining (gathering, tending, harvesting,) plants. (The only exceptions are societies where people don’t eat plants, like the Inuit and the Masai, and our modern society, where most of us aren’t involved in food production.) By contrast, men have specialized in hunting, raising, and butchering animals–not because they were trying to hog the protein or had some sexist ideas about food production, but because animals tend to be bigger and heavier than women can easily lift. Dragging home and butchering large game requires significant strength.

I am inventing a “Just So” story, of course. But it seems sensible enough that each gender evolved a tendency to crave the particular kinds of foods it was most adept at obtaining.

Exercise wears down muscles; protein is necessary to build them back up, so active lifestyles require protein. Our male ancestors' most important activities were most likely heavy labor (eg, building huts, hauling firewood, butchering game,) and defending the tribe. Our female ancestors' most important activities were giving birth and nursing children (we would not exist had they not, after all.) For these activities, women want to be fat. It's not good enough to put on weight after you get pregnant, when the growing fetus is already dependent on its mother for nutrients. Far better for a woman to be plump before she gets pregnant (and to stay that way long after.)

Of course, this is “fat” by historical standards, not modern American standards.

I suspect, therefore, that women are naturally inclined to eat as much as possible of sweet foods in order to put on weight in preparation for pregnancy and lactation–only today, the average woman has 2 pregnancies instead of 12, and so instead of turning that extra weight into children and milk, it just builds up.

Obviously we are talking about a relatively small effect on food preferences, both because our ancestors could not afford to be too picky about what they ate, and because the genetic difference between men and women is slight–not like the difference between humans and lizards, say.

Interestingly, gender expression in humans appears to basically be female by default. If, by random chance, you are born with only one X chromosome, (instead of the normal XX or XY,) you can still survive. Sure, you’ll be short, you probably won’t menstruate, and you’ll likely have a variety of other issues, but you’ll be alive. By contrast, if you received only a Y chromosome from your parents and no accompanying X, you wouldn’t be here reading this post. You can’t survive with just a Y. Too many necessary proteins are encoded on the X.

Gender differences show up even in fetuses, but don’t become a huge deal until puberty, when the production of androgens and estrogens really cranks up.

Take muscle development: muscle development relies on the production of androgens (eg, testosterone.) Grownups produce more androgens than small children, and men produce more than women. Children can exercise and certainly children who do daily farm chores are stronger than children who sit on their butts watching TV all day, but children can’t do intense strength-training because they just don’t produce enough androgens to build big muscles. Women, likewise, produce fewer androgens, and so cannot build muscles at the same rate as men, though obviously they are stronger than children.

At puberty, boys begin producing the androgens that allow them to build muscles and become significantly stronger than girls.

Sans androgens, even XY people develop as female. (See Androgen Insensitivity Syndrome, in which people with XY chromosomes cannot absorb the androgens their bodies create, and so develop as female.) Children produce some androgens (obviously,) but not nearly as many as adults. Pre-pubescent boys, therefore, are more “feminine,” biologically, than post-pubescent men; puberty induces maleness.

All children seem pretty much obsessed with sweets, far more than adults. If allowed, they will happily eat cake until they vomit.

Even though food seems like a realm where evolution would heavily influence our tastes, it’s pretty obvious that culture has a huge effect. I doubt Jews have a natural aversion to pork or Hindus to beef. Whether you think chicken hearts are tasty or vomitous is almost entirely dependent on whether or not they are a common food in your culture.

But small children are blissfully less attuned to culture than grownups. Like little id machines, they spit out strained peas and throw them on the floor. They do not care about our notion that “vegetables are good for you.” This from someone who’ll eat bird poop if you let them.

The child’s affection for sweets, therefore, I suspect is completely natural and instinctual. Before the invention of refined sugars and modern food distribution systems, it probably kept them alive and healthy. Remember that the whole reason grownups try to eat more vegetables is that vegetables are low in calories. Grownups have larger stomachs and so can eat more than children, allowing them to extract adequate calories from low-calorie foods, but small children do not and cannot. In developing countries, children still have trouble getting enough calories despite abundant food in areas where that food is low-calorie plants, which they just cannot physically eat enough of. Children, therefore, are obsessed with high-calorie foods.
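
To put very rough numbers on that point, here's a minimal back-of-the-envelope sketch. The calorie densities, the daily calorie target, and the "stomach capacity" figure are all ballpark assumptions of mine for illustration, not data from any source quoted in this post.

```python
# Back-of-the-envelope sketch (all figures are rough assumptions): how much
# food a small child would need to eat per day to hit a calorie target,
# depending on how calorie-dense the food is.

daily_need_kcal = 1300      # ballpark daily requirement for a small child
stomach_capacity_g = 500    # very rough guess at what fits in one sitting

# approximate calorie densities, kcal per 100 g
foods = {
    "leafy salad greens": 20,
    "boiled potatoes": 90,
    "cooked rice": 130,
    "honey": 300,
    "butter": 720,
}

for food, kcal_per_100g in foods.items():
    grams_needed = daily_need_kcal / kcal_per_100g * 100
    sittings = grams_needed / stomach_capacity_g
    print(f"{food:20s} ~{grams_needed:6.0f} g/day  (~{sittings:4.1f} stomachs full)")
```

With numbers anywhere in this neighborhood, a small child simply cannot chew through several kilos of greens a day, while a few hundred grams of calorie-dense food gets the job done.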

At puberty, this instinct changes for boys–orienting them more toward protein sources, which they are going to have to expend a lot of energy trying to haul back to their families for the rest of their lives, but stays basically unchanged in females.

ETA: I have found two more sources/items of relevance:

Calorie information effects on consumers’ food choices: Sources of observed gender heterogeneity, by Heiman and Lowengart:

When it comes to what we eat, men and women behave differently: Men consume more beef, eggs, and poultry; while women eat more fruits and vegetables and consume less fat than do men. … The gender differences in preferences for healthier foods begin in childhood. Previous literature has found that girls choose healthier food and are fonder of fruits and vegetables than are boys. Boys rated beef, processed meat, and eggs as more desirable than did girls. …

Sensory (taste) differences between the genders are the second most widely ventured explanation for the differences in food choices, although it is not clear that such genetic differences actually exist. While the popular media argue that females prefer sweetness and dislike bitterness, while males may enjoy bitterness, academic literature on this matter is less conclusive. The bitter taste receptor, gene TAS2R38, has been associated with the ability to taste PROP (6-n-propylthiouracil), one source of genetic variation in PROP and PTC taste. Individuals who experience bitterness strongly are assumed to also experience sweetness strongly relative to those who experience PROP as only slightly bitter. While previous studies found that inherited taste-blindness to bitter compounds such as PROP may be a risk factor for obesity, this literature has been hotly disputed.

The distribution of perceived bitterness of PROP differs among women and men, as does the correlation between genetic taste measures and acceptance of sweetness. A higher percentage of women are PROP and PTC tasters, sensing bitterness above threshold. It has been suggested that women are more likely to be supertasters, or those who taste with far greater intensity than average.

(I have removed the in-line citations for ease of reading; please refer to the original if you want them.)

Also:

[Graph: testosterone levels by age for males and females.]

Well, I don't remember where this graph came from, but it looks like my intuitions were pretty good: males and females both have very low levels of testosterone during childhood, and during puberty their levels become radically different.

Is there a correlation between intelligence and taste?

(I am annoyed by the lack of bands between 1200 and 1350)

De gustibus non disputandum est. — Confucius

We’re talking about foods, not whether you prefer Beethoven or Lil’ Wayne.

Certainly there are broad correlations between the foods people enjoy and their ethnicity/social class. If you know whether I chose fried okra, chicken feet, gefilte fish, escargot, or grasshoppers for dinner, you can make a pretty good guess about my background. (Actually, I have eaten all of these things. The grasshoppers were over-salted, but otherwise fine.) The world’s plethora of tasty (and not-so-tasty) cuisines is due primarily to regional variations in what grows well where (not a lot of chili peppers growing up in Nunavut, Canada,) and cost (the rich can always afford fancier fare than the poor,) with a side dish of seemingly random cultural taboos like “don’t eat pork” or “don’t eat cows” or “don’t eat grasshoppers.”

But do people vary in their experience of taste? Does intelligence influence how you perceive your meal, driving smarter (or less-smart) people to seek out particular flavor profiles or combinations? Or could there be other psychological or neurological factors at play in people's eating decisions?

This post was inspired by a meal my husband, an older relative and I shared recently at McDonald's. It had been a while since we'd last patronized McDonald's, but the older relative likes their burgers, so we went and ordered some new-to-us variety of meat-on-a-bun. As my husband and I sat there, deconstructing the novel taste experience and comparing it to other burgers, the older relative gave us this look of "Jeez, the idiots are discussing the flavor of a burger! Just eat it already!"

As we dined later that evening at my nemesis, Olive Garden, I began wondering whether we actually experienced the food the same way. Perhaps there is something in people that makes them prefer bland, predictable food. Perhaps some people are better at discerning different flavors, and the people who cannot discern them end up with worse food because they can’t tell?

Unfortunately, it appears that not a lot of people have studied whether there is any sort of correlation between IQ and taste (or smell.) There’s a fair amount of research on taste (and smell,) like “do relatives of schizophrenics have impaired senses of smell?” (More on Schizophrenics and their decreased ability to smell) or “can we get fat kids to eat more vegetables?” Oh, and apparently the nature of auditory hallucinations in epileptics varies with IQ (IIRC.) But not much that directly addresses the question.

I did find two references that, somewhat in passing, noted that they found no relationship between taste and IQ, but these weren’t studies designed to test for that. For example, in A Food Study of Monotony, published in 1958 (you know I am really looking for sources when I have to go back to 1958,) researchers restricted the diets of military personnel employed at an army hospital to only 4 menus to see how quickly and badly they’d get bored of the food. They found no correlation between boredom and IQ, but people employed at an army hospital are probably pre-selected for being pretty bright (and having certain personality traits in common, including ability to stand army food.)

Interestingly, three traits did correlate with (or against) boredom:

Fatter people got bored fastest (the authors speculate that they care the most about their food,) while depressed and feminine men (all subjects in the study were men) got bored the least. Depressed people are already uninterested in food, so it is hard to get less interested, but no explanation was given of what they meant by "femininity" or how this might affect food preferences. (Also, the hypochondriacs got bored quickly.)

Some foods inspire boredom (or even disgust) quickly, while others are virtually immune. Milk and bread, for example, can be eaten every day without complaint (though you might get bored if bread were your only food.) Potted meat, by contrast, gets old fast.

Likewise, Personality Traits and Eating Habits (warning PDF) notes that:

Although self-reported eating practices were not associated with educational level, intelligence, nor various indices of psychopathology, they were related to the demographic variables of gender and age: older participants reported eating more fiber in their diets than did younger ones, and women reported more avoidance of fats from meats than did men.

Self-reported eating habits may not be all that reliable, though.

Autistic children do seem to be worse at distinguishing flavors (and smells) than non-autistic children, eg Olfaction and Taste Processing in Autism:

Participants with autism were significantly less accurate than control participants in identifying sour tastes and marginally less accurate for bitter tastes, but they were not different in identifying sweet and salty stimuli. … Olfactory identification was significantly worse among participants with autism. … True differences exist in taste and olfactory identification in autism. Impairment in taste identification with normal detection thresholds suggests cortical, rather than brainstem dysfunction.

(Another study of the eating habits of autistic kids found that the pickier ones were rated by their parents as more severely impaired than the less picky ones, but then severe food aversions are a form of life impairment. By the way, do not tell the parents of an autistic kid, “oh, he’ll eat when he’s hungry.” They will probably respond politely, but mentally they are stabbing you.)

On brainstem vs. cortical function–it appears that we do some of our basic flavor identification way down in the most instinctual part of the brain, as Facial Expressions in Response to Taste and Smell Stimulation explores. The authors found that pretty much everyone makes the same faces in response to sweet, sour, and bitter flavors–whites and blacks, old people and newborns, retarded people and blind people, even premature infants, blind infants, and infants born missing most of their brains. All of which is another point in favor of my theory that disgust is real. (And if that is not enough science of taste for you, I recommend Place and Taste Aversion Learning, in which animals with brain lesions lost their fear of new foods.)

Genetics obviously plays a role in taste. If you are one of the 14% or so of people who think cilantro tastes like soap (and I sympathize, because cilantro definitely tastes like soap,) then you’ve already discovered this in a very practical way. Genetics also obviously determine whether you continue producing the enzyme for milk digestion after infancy (lactase persistence). According to Why are you a picky eater? Blame genes, brains, and breastmilk:

In many cases, mom and dad have only themselves to blame for unwittingly passing on the genes that can govern finicky tastes. Studies show that genes play a major role in determining who becomes a picky eater, including recent research on a group of 4- to 7-year-old twins. Part of the pickiness can be attributed to specific genes that govern taste. Variants of the TAS2R38 gene, for example, have been found to encode for taste receptors that determine how strongly someone tastes bitter flavors.

Researchers at Philadelphia’s Monell Chemical Senses Center, a scientific institute dedicated to the study of smell and taste, have found that this same gene also predicts the strength of sweet-tooth cravings among children. Kids who were more sensitive to bitterness preferred sugary foods and drinks. However, adults with the bitter receptor genes remained picky about bitter foods but did not prefer more sweets, the Monell study found. This suggests that sometimes age and experience can override genetics.

I suspect that there is actually a sound biological, evolutionary reason why kids crave sweets more than grownups, and this desire for sweets is somewhat “turned off” as we age.


From a review of Why some like it hot: Food, Genetics, and Cultural Diversity:

Ethnobotanist Gary Paul Nabhan suggests that diet had a key role in human evolution, specifically, that human genetic diversity is predominately a product of regional differences in ancestral diets. Chemical compounds found within animals and plants varied depending on climate. These compounds induced changes in gene expression, which can vary depending on the amount within the particular food and its availability. The Agricultural Age led to further diet-based genetic diversity. Cultivation of foods led to the development of novel plants and animals that were not available in the ancestral environment. …

There are other fascinating examples of gene-diet interaction. Culturally specific recipes, semi-quantitative blending of locally available foods and herbs, and cooking directions needed in order to reduce toxins present in plants, emerged over time through a process of trial-and error and were transmitted through the ages. The effects on genes by foods can be extremely complex given the range of plant-derived compounds available within a given region. The advent of agriculture is suggested to have overridden natural selection by random changes in the environment. The results of human-driven selection can be highly unexpected. …

In sedentary herding societies, drinking water was frequently contaminated by livestock waste. The author suggests in order to avoid contaminated water, beverages made with fermented grains or fruit were drunk instead. Thus, alcohol resistance was selected for in populations that herded animals, such as Europeans. By contrast, those groups which did not practice herding, such as East Asians and Native Americans, did not need to utilize alcohol as a water substitute and are highly sensitive to the effects of alcohol.

Speaking of genetics:

From Eating Green could be in your Genes

Indians and Africans are much more likely than Europeans and native South Americans to have an allele that lets them eat a vegetarian diet:

The vegetarian allele evolved in populations that have eaten a plant-based diet over hundreds of generations. The adaptation allows these people to efficiently process omega-3 and omega-6 fatty acids and convert them into compounds essential for early brain development and controlling inflammation. In populations that live on plant-based diets, this genetic variation provided an advantage and was positively selected in those groups.

In Inuit populations of Greenland, the researchers uncovered that a previously identified adaptation is opposite to the one found in long-standing vegetarian populations: While the vegetarian allele has an insertion of 22 bases (a base is a building block of DNA) within the gene, this insertion was found to be deleted in the seafood allele.

Of course, this sort of thing inspires a wealth of pop-psych investigations like Dr. Hirsch’s What Flavor is your Personality?  (from a review:

Dr. Hirsh, neurological director of the Smell and Taste Research and Treatment Foundation in Chicago, stands by his book that is based on over 24 years of scientific study and tests on more than 18,000 people’s food choices and personalities.)

that nonetheless may have some basis in fact, eg: Personality may predict if you like spicy foods:

Byrnes assessed the group using the Arnett Inventory of Sensation Seeking (AISS), a test for the personality trait of sensation-seeking, defined as desiring novel and intense stimulation and presumed to contribute to risk preferences. Those in the group who score above the mean AISS score are considered more open to risks and new experiences, while those scoring below the mean are considered less open to those things.

The subjects were given 25 micrometers of capsaicin, the active component of chili peppers, and asked to rate how much they liked a spicy meal as the burn from the capsaicin increased in intensity. Those in the group who fell below the mean AISS rapidly disliked the meal as the burn increased. People who were above the mean AISS had a consistently high liking of the meal even as the burn increased. Those in the mean group liked the meal less as the burn increased, but not nearly as rapidly as those below the mean.

And then there are the roughly 25% of us who are “supertasters“:

A supertaster is a person who experiences the sense of taste with far greater intensity than average. Women are more likely to be supertasters, as are those from Asia, South America and Africa.[1] The cause of this heightened response is unknown, although it is thought to be related to the presence of the TAS2R38 gene, the ability to taste PROP and PTC, and at least in part, due to an increased number of fungiform papillae.[2]

Perhaps the global distribution of supertasters is related to the distribution of vegetarian-friendly alleles. It’s not surprising that women are more likely to be supertasters, as they have a better sense of smell than men. What may be surprising is that supertasters tend not to be foodies who delight in flavoring their foods with all sorts of new spices, but instead tend toward more restricted, bland diets. Because their sense of taste is essentially on overdrive, flavors that taste “mild” to most people taste “overwhelming” on their tongues. As a result, they tend to prefer a much more subdued palette–which is, of course, perfectly tasty to them.

A French study, Changes in Food Preferences and Food Neophobia during a Weight Reduction Session, measured kids' ability to taste flavors, then the rate at which they became accustomed to new foods. The more sensitive the kids were to flavors, the less likely they were to adopt a new food; the less adept they were at tasting flavors, the more likely they were to start eating vegetables.

Speaking of pickiness again:

“During research back in the 1980s, we discovered that people are more reluctant to try new foods of animal origin than those of plant origin,” Pelchat says. “That’s ironic in two ways. As far as taste is concerned, the range of flavors in animal meat isn’t that large compared to plants, so there isn’t as much of a difference. And, of course, people are much more likely to be poisoned by eating plants than by animals, as long as the meat is properly cooked.” …

It’s also possible that reward mechanisms in our brain can drive changes in taste. Pelchat’s team once had test subjects sample tiny bits of unfamiliar food with no substantial nutritional value, and accompanied them with pills that contained either nothing or a potent cocktail of caloric sugar and fat. Subjects had no idea what was in the pills they swallowed. They learned to like the unfamiliar flavors more quickly when they were paired with a big caloric impact—suggesting that body and brain combined can alter tastes more easily when unappetizing foods deliver big benefits.

So trying to get people to adopt new foods while losing weight may not be the best idea.

(For all that people complain about kids’ pickiness, parents are much pickier. Kids will happily eat playdoh and crayons, but one stray chicken heart in your parents’ soup and suddenly it’s “no more eating at your house.”)

Of course, you can't talk about food without encountering meddlers who are convinced that everyone else should eat whatever they consider the perfect diet, like these probably well-meaning folks trying to get Latinos to eat fewer snacks:

Latinos are the largest racial and ethnic minority group in the United States and bear a disproportionate burden of obesity related chronic disease. Despite national efforts to improve dietary habits and prevent obesity among Latinos, obesity rates remain high. …

there is a need for more targeted health promotion and nutrition education efforts on the risks associated with soda and energy-dense food consumption to help improve dietary habits and obesity levels in low-income Latino communities.

Never mind that Latinos are one of the healthiest groups in the country, with longer life expectancies than whites! We’d better make sure they know that their food ways are not approved of!

I have been saving this graph for just such an occasion.
Only now I feel bad because I forgot to write down who made this graph so I can properly credit them. If you know, please tell me!

(Just in case it is not clear already: different people are adapted to and will be healthy on different diets. There is no magical, one-size-fits-all diet.)

And finally, to bring this full circle, it’s hard to miss the folks claiming that Kids Who Eat Fast Food Have Lower IQs:

4,000 Scottish children aged 3-5 years old were examined to compare the intelligence dampening effects of fast food consumption versus "from scratch" fare prepared with only fresh ingredients.

Higher fast food consumption by the children was linked with lower intelligence and this was even after adjustments for wealth and social status were taken into account.

It’d be better if they controlled for parental IQ.

The conclusions of this study confirm previous research which shows long lasting effects on IQ from a child’s diet. An Australian study from the University of Adelaide published in August 2012 showed that toddlers who consume junk food grow less smart as they get older. In that study, 7000 children were examined at the age of 6 months, 15 months, 2 years to examine their diet.

When the children were examined again at age 8, children who were consuming the most unhealthy food had IQs up to 2 points lower than children eating a wholesome diet.

 

 

Kabloona Friday

(Part of a series on de Poncins’s Kabloona, an ethnography of the Eskimo/Inuit.)

How’s winter treating you?

Up near the North Pole, I hear it gets really cold. Like, really cold:

That journey homeward in darkness was an unrelieved agony. I was cold; I was freezing; not only in the flesh, but my soul was frozen. As I sat on the swaying and creaking sled the cold became an obsession, almost an hallucination, and soon I was in a delirium of cold. … My brain had shrunk to the dimensions of a dried raisin. Stubbornly, painfully, almost maliciously, it clung to a single thought, made room for no other image: “I am cold!” I was not cold as people Outside are cold. I was not shivering. I was in the cold, dipped into a trough where the temperature was thirty degrees below zero…

During this same journey across the frozen polar sea, the Eskimo, dressed in the same clothes and just as many layers, experienced no such hypothermic delusions. Undoubtedly this is at least in part due to evolutionary adaptations that help them withstand the cold, but a few pages earlier, de Poncins had vividly (and unknowingly) described another reason the Eskimos were much warmer than he:

I do not know what the hour was, but I who had dozed off woke up. Under my eye were the three Eskimos, three silhouettes lit up from behind by the uncertain glow of a candle that threw on the walls of the igloo a mural of fantastically magnified shadows. All three men were down on the floor in the same posture… They were eating, and whether it was that the smell of the seal had been irresistible, or that the idea of the hunt had stimulated their appetites, they had embarked upon a feast. Each had a huge chunk of meat in his hands and mouth, and by the soundless flitting of their arms made immeasurably long in the shadows on the wall, I could see that even before one piece had been wholly gobbled their hands were fumbling in the basin for the next quarter. The smell in the igloo was of seal and of savages hot and gulping. …

I have seen astonishing things, in remote places and not merely in circuses. In the New Hebrides, for example, I have unpacked my own meat in a circle of cannibals and have seen in their eyes a gleam that was perhaps more intense than comforting. Here, in this igloo, all that I had seen before was now surpassed. There were three men, and there must have been fifty pounds of meat. The three men attacked that meat with the rumbling and growling of animals warning their kind away from their private prey. They ground their teeth and their jaws cracked as they ate, and they belched… The walls of the igloo were horrid with the ruddy dripping of bloody spittle and still they ate on, and still they put out simian arms and turned over with indescribable hands morsels in the beginning disdained and now become dainties greedily swallowed. And still, like beasts, they picked up chunks and flung them almost instantly down again in order to put their teeth into other and perhaps more succulent bits. They had long since stopped cutting the meat with their circular knives: their teeth sufficed, and the very bones of the seal cracked and splintered in their faces. What those teeth could do, I already knew. When the cover of a gasoline drum could not be pried off with the fingers, an Eskimo would take it between his teeth and it would come easily away. When a strap made of seal skin freezes hard–and I know nothing tougher than seal skin–an Eskimo will put it in his mouth and chew it soft again. And those teeth were hardly to be called teeth. Worn down to the gums, they were sunken and unbreakable stumps of bone. If I were to fight with an Eskimo, my greatest fear would be lest he crack my skull with his teeth.

But on this evening their hands were even more fantastic than their teeth. … Their capacity of itself was fascinating to observe, and it was clear that like animals they were capable of absorbing amazing quantities of food, quite ready to take their chances with hunger a few days later.

The traditional Eskimo diet contains little to no vegetable matter, because very few plants grow up near the North Pole, especially in winter. It consists primarily of fish, seal, polar bear, foxes, and other meats, but by calorie, it is mostly fat. (This is because you can’t actually survive on a majority-protein diet.)
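
A quick sketch of the arithmetic behind "mostly fat by calorie," using the standard 4 kcal per gram for protein and 9 kcal per gram for fat; the gram amounts below are invented for illustration, not actual measurements of anyone's diet.

```python
# Why a no-carb, meat-and-blubber diet ends up "mostly fat by calorie":
# standard energy densities (Atwater factors), with illustrative gram amounts.

KCAL_PER_G = {"protein": 4, "fat": 9, "carbohydrate": 4}

def calorie_shares(grams):
    """Return each macronutrient's share of total calories."""
    kcal = {k: g * KCAL_PER_G[k] for k, g in grams.items()}
    total = sum(kcal.values())
    return {k: v / total for k, v in kcal.items()}

# hypothetical day of fatty seal meat: equal weights of protein and fat, no carbs
day = {"protein": 200, "fat": 200, "carbohydrate": 0}
for nutrient, share in calorie_shares(day).items():
    print(f"{nutrient:12s} {share:6.1%} of calories")
# -> protein ~30.8%, fat ~69.2%: even a 50/50 split by weight is ~2/3 fat by calorie
```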

To run through the dietary science quickly, de Poncins has throughout the book been generally eating white-man's food, which includes things like bread and beans. This is not to say that he disdained fish and seals–he does not make much mention of whether he ate those, but he does talk about bread, potatoes, beans, etc. So de Poncins is eating what you'd call a "normal" diet, one that runs primarily on glucose for energy. The Eskimo, by contrast, are eating the "Atkins" diet, running on ketosis.

No plants = no carbs; no carbs = no glucose.

But the brain cannot run without glucose, so luckily your body can make it out of protein (a process called gluconeogenesis).

Interestingly, you will die without proteins and fats in your diet, but you can survive without carbs.

Anyway, one of the side effects of a high-protein, ketogenic diet is (at least occasionally,) increased body heat:

Karst H, Steiniger J, Noack R, Steglich HD: Diet-induced thermogenesis in man: thermic effects of single proteins, carbohydrates and fats depending on their energy amount. Ann Nutr Metab 1984, 28(4):245-252.

Abstract: The diet-induced thermogenesis of 12 healthy males of normal body weight was measured by means of indirect calorimetry over 6 h after test meals of 1, 2 or 4 MJ protein (white egg, gelatin, casein), carbohydrate (starch, hydrolyzed starch) or fat (sunflower oil, butter). The effect of 1 MJ protein was at least three times as large as that of an isocaloric carbohydrate supply. [bold mine]

(isocaloric = having similar caloric values)

In other words, the Inuits’ low-carb diet probably increased their internal body temperature, keeping them warmer than our author.
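
To put illustrative numbers on that, here is a minimal sketch. The thermic-effect fractions below are typical textbook ranges (roughly 20-30% for protein, 5-10% for carbohydrate, 0-3% for fat), assumed for illustration rather than taken from Karst et al.'s measurements.

```python
# Diet-induced thermogenesis (DIT) sketch: how much of a 1 MJ test meal is
# given off as heat, under assumed (textbook-range) thermic-effect fractions.

MEAL_ENERGY_KJ = 1000  # a 1 MJ meal, as in the abstract quoted above

dit_fraction = {          # assumed fractions, not measured values
    "protein": 0.25,
    "carbohydrate": 0.07,
    "fat": 0.02,
}

for macro, frac in dit_fraction.items():
    heat_kj = MEAL_ENERGY_KJ * frac
    print(f"{macro:12s} meal: ~{heat_kj:.0f} kJ released as heat")

# With these assumptions, the protein meal throws off roughly 3-4x the heat of
# an isocaloric carbohydrate meal, in line with the finding quoted above.
```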

I have attempted a low-carb diet, (solely for health reasons–I have never wanted to lose weight,) and one of the things I remember about it is that I would suddenly feel completely, ravenously hungry. There were times when, had I not been able to get food, I would have begun devouring anything even remotely chewable. Of course, that may have just been a personal digestive quirk.

I feel compelled to note that this post is not advocating any particular diet; you are most likely not an Eskimo and there is no particular reason to believe, a priori, that you are better adapted to their diet than to the diet of your ancestors (whatever that happens to be.)

Unfortunately, this also holds true for the Eskimo, who probably are adapted to their ancestral diet and not adapted to the white man’s foods, which explains why diabetes and obesity are becoming epidemic among them:

Age-standardized rates of T2D show 17.2% prevalence of Type 2 Diabetes among First Nations individuals living on reserves, compared to 5.0% in the non-Aboriginal population; … First Nations women in particular suffer from diabetes, especially between ages 20–49. They have a 4 times higher incidence of diabetes than non-first nation women[3] as well as experiencing higher rates of gestational diabetes than non-Aboriginal females, 8-18% compared to 2-4%.[1]

“First nations” is Canadian for “Indian”.

In Greenland (majority Inuit):

The age-standardized prevalences of diabetes and IGT were 10.8 and 9.4% among men and 8.8 and 14.1% among women, respectively.

I am reminded here of the chapter in Dr. Price’s Nutrition and Physical Degeneration (copyright 1939) on the Eskimo (which is, alas, too long to quote in full):

During the rise and fall of historic and prehistoric cultures that have often left their monuments and arts following each other in succession in the same location, one culture, the Eskimo, living on until today, bring us a robust sample of the Stone Age people. … The Eskimo face has remained true to ancestral type to give us a living demonstration of what Nature can do in the building of a race competent to withstand for thousands of years the rigors of an Arctic climate. Like the Indian, the Eskimo thrived as long as he was not blighted by the touch of modern civilization, but with it, like all primitives, he withers and dies.

In his primitive state he has provided an example of physical excellence and dental perfection such as has seldom been excelled by any race in the past or present. … It is a sad commentary that with the coming of the white man the Eskimos and Indians are rapidly reduced both in numbers and physical excellence by the white man’s diseases. …

Bethel is the largest settlement on the Kuskokwim, and contains in addition to the white residents many visiting Eskimos from the nearby Tundra country surrounding it.

From this population, Dr. Price noted:

88 Eskimos and mixed-race people, with 2,490 teeth.

27 lived on the traditional Eskimo diet. Of their 796 teeth, one had a cavity.

21 lived on a mixed Eskimo/white diet. Of their 600 teeth, 38–6.3%–had cavities.

40 lived on imported white foods. Of their 1,094 teeth, 252–or 21.1%–had cavities.

In another location, 28 people eating a traditional Eskimo diet had one cavity.

13 people on traditional Eskimo diet: 0 cavities.

72 people on Eskimo diet: 2 cavities.

81 people eating white foods: 394 cavities.

20 people eating white foods: 175 cavities.

(Yes, Dr. Price was a dentist.)

It is a common belief around the world that childbearing makes women lose teeth (my own grandmother lost two teeth while pregnant); Dr. Price notes the case of an Eskimo woman who had borne 20 children without losing a single tooth or developing any cavities.

One does not get a conception of the magnificent dental development of the more primitive Eskimos simply by learning that they have freedom from dental caries [cavities]. The size and strength of the mandible, the breadth of the face and the strength of the muscles of mastication all reach a degree of excellence that is seldom seen in other races. …

Much has been reported in the literature of the excessive wear of the Eskimo’s teeth, which in the case of the women has been ascribed to the chewing of the leather in the process of tanning. [de Poncins also makes note of the frequent chewing of hides–evX.] It is of interest that while many of the teeth studied gave evidence of excessive wear involving the crowns down to a depth that in many individuals would have exposed the pulps, there was in no case an open pulp chamber. They were always filled with secondary dentin.

Chinchillas

Photo credit Melissa Wolf (no, it’s not my birthday.)

Chinchillas are probably the cutest of the rodents.

They hail from the desert of the high Andes, where it is simultaneously cold and dry. They are very well adapted to their native habitat, which unfortunately results in them being not very well adapted to places like the US. Some common problems that therefore plague chinchillas kept as pets:

  1. You can’t get them wet. Chinchilla fur is actually so thick and fluffy that it can’t dry out properly on its own, so a wet chinchilla quickly becomes a moldy chinchilla. (Chinchillas take dust baths to get clean.)
  2. They can’t take heat, or even warmth. Our “room temperature” is their “oh god it’s hot.” They prefer to be below 60 degrees F; if the temp heads north of 75, they’ll probably die.
  3. Too many raisins will kill them. Chinchillas love raisins, but unfortunately for them, they’re only adapted to digest dry, brittle, nutrient-poor desert plants. A chinchilla can easily eat a couple of raisins a day without trouble, but if allowed to eat raisins to its heart’s content, its intestines will get all blocked up and the poor creature will die. (At least according to all of the chinchilla-related websites I have read; I have never personally killed a chinchilla.)

(Even though they are cute and fluffy, I don’t get the impression that chinchillas make very good pets, both because they don’t really bond with humans and because they poop constantly. If you really want a rodent, I hear that rats are rather sociable, though honestly, you could just get a dog.)

When I look at modern humans, I can’t help but think of the humble chinchilla, gorging itself to death on raisins. Sometimes we just don’t know what’s bad for us. With us, it’s not just the food–it’s pretty much everything. Find a cute cat picture on the internet? Next thing you know, you’ve just wasted three hours looking at pictures of cats. There are massive internet empires devoted to peoples’ love of looking at a picture of a cat for about two seconds. Sure, you could use that time to interact with a real cat, but that would require getting off your butt.

Facebook is worse than cat pictures. Do you really need to know that your Aunt Susie “likes” IHOP? Or exactly what your Uncle Joe thinks of Obamacare? Or where your vague acquaintance from three years ago had lunch today? No, but you’ll scroll through all of that crap, anyway, rather than face the horrifying prospect of actually interacting with another human being.

I swear, next time I go to a family gathering where people have flown over a thousand miles just to be there, and someone whips out their phone in the middle of a conversation just to check Twitter or FB, I am going to… well actually I will probably just be politely annoyed, but I will definitely be imagining stomping all over that phone.

Modernity is a drug. It tastes great. It’s wonderful. It’s fun. You get TVs and air conditioning and you don’t die of plague. Frankly, it’s awesome. But in the meanwhile, fertility drops. You end up inside, isolated, no longer talking to other humans, simply because that’s more work than clicking on another cake picture. Communities wither. So we get replaced by people who resist modernity, people who still have children and build communities.

Are you here for the long haul? Or are you just here for the raisins?

And if you’re just here for the raisins, why aren’t you enjoying them more?

Is Acne an Auto-Immune Disorder?

Like our lack of fur, acne remains an evolutionary mystery to me.

Do other furless mammals get acne? Like elephants or whales? Or even chimps; their faces don’t have fur. If so, everyone’s keeping it a secret–I’ve never even seen an ad for bonobo anti-acne cream, and with bonobos’ social lives, you know they’d want it. :)

So far, Google has returned no reports of elephants or whales with acne.

Now, a few skin blemishes here and there are not terribly interesting or mysterious. The weird thing about acne (IMO) is that it pops up at puberty*, and appears to have a genetic component.

Considering that kids with acne tend to feel rather self-conscious about it, I think it reasonable to assume that people with more severe acne have more difficulty with dating than people without. (Remember, some people have acne well into their 30s or beyond.)

Wouldn’t the non-acne people quickly out-compete the acne people, resulting in less acne among humans? (Okay, now I really want to know if someone has done a study on whether people with more acne have fewer children.) Since acne is extremely common and shows up right as humans reach puberty, this seems like a pretty easy thing to study–and to detect an effect for, if there is one.

Anyway, I totally remember a reference to acne in Dr. Price’s Nutrition and Physical Degeneration (one of my favorite books ever), but can’t find it now. Perhaps I am confusing it with Nutrition and Western Disease or a book with a similar title. At any rate, I recall a picture of a young woman’s back with a caption to the effect that none of the people in this tropical locale had acne, which the author could tell rather well, since this was one of those tropical locales where people typically walk around with rather little clothing.

The Wikipedia has this to say about the international incidence of acne:

“Rates appear to be lower in rural societies. While some find it affects people of all ethnic groups, it may not occur in the non-Westernized people of Papua New Guinea and Paraguay.

Acne affects 40 to 50 million people in the United States (16%) and approximately 3 to 5 million in Australia (23%). In the United States, acne tends to be more severe in Caucasians than people of African descent.”

I consider these more “hints” than “conclusive proof of anything.”

Back when I was researching hookworms, I ran across these bits:

“The [Hygiene Hypothesis] was first proposed by David P. Strachan who noted that hay fever and eczema were less common in children who belonged to large families. Since then, studies have noted the effect of gastrointestinal worms on the development of allergies in the developing world. For example, a study in Gambia found that eradication of worms in some villages led to increased skin reactions to allergies among children. … [bold mine.]

Moderate hookworm infections have been demonstrated to have beneficial effects on hosts suffering from diseases linked to overactive immune systems. … Research at the University of Nottingham conducted in Ethiopia observed a small subset of people with hookworm infections were half as likely to experience asthma or hay fever. Potential benefits have also been hypothesized in cases of multiple sclerosis, Crohn’s Disease and diabetes.”

So I got to thinking: if allergies and eczema are auto-immune reactions (I know someone in real life, at least, whose skin cracks to the point of bleeding if they eat certain foods, but who is otherwise fine if they don’t), why not acne?

Acne is generally considered a minor problem, so people haven’t necessarily spent a ton of time researching it. Googling “acne autoimmune” gets me some Paleo-Dieter folks talking about curing severe cases with a paleo-variant (they’re trying to sell books, so they didn’t let on the details, but I suspect the details have to do with avoiding refined sugar, milk, and wheat).

While I tend to caution against over-enthusiastic embrace of a diet one’s ancestors most likely haven’t eaten in thousands or tens of thousands of years, if some folks are reporting a result, then I’d love to see scientists actually test it and try to confirm or disprove it.

The problem with dietary science is that it is incredibly complicated, full of confounds, and most of the experiments you might think up in your head are completely illegal and impractical.

For example, scientists figured out that pellagra is caused by nutritional deficiency–rather than an infectious agent–by feeding prisoners an all-corn diet until they started showing signs of gross malnutrition. (For the record, the prisoners joined the program voluntarily. “All the corn you can eat” sounded pretty good for the first few months.) Likewise, there was a program during WWII to study the effects of starvation–on volunteer subjects–to figure out the best way to save starving people, started because the Allies knew they would soon have a lot of very real starvation victims on their hands.

These sorts of human experiments are no longer allowed. What a scientist can do to a human being is pretty tightly controlled, because no one wants to accidentally kill their test subjects, and universities and the like don’t like getting sued. Even things like the Milgram experiments would have trouble getting authorized today.

So most of the time with scientific studies, you’re left with using human analogs, which means rats. And rats don’t digest food the exact same way we do–Europeans and Chinese don’t digest food the exact same way, so don’t expect rats to do it the same way, either. One obvious pitfall of relying on animal models is that most animals can synthesize Vitamin C, but humans can’t. This made figuring out this whole Vitamin C thing a lot trickier.

Primates are probably a little closer, digestively, to humans, but people get really squeamish about monkey research, and besides, they eat a pretty different diet than we do, too. Gorillas are basically vegan (I bet they eat small bugs by accident all the time, of course), and chimps have almost no body fat–this is quite remarkable, actually. Gorillas and orangutans have quite a bit of body fat, “normal” levels by human standards. Hunter-gatherers, agriculturalists, and sedentary butt-sitters like us have different amounts, but they all still have some. But chimps and bonobos have vanishingly little; male chimps and bonobos have almost zero body fat, even after being raised in zoos and fed as much food as they want.

Which means that if you’re trying to study diet, chimps and bonobos are probably pretty crappy human analogs.

(And I bet they’re really expensive to keep, relative to mice or having humans fill out surveys and promise to eat more carbs.)

So you’re left with trying to figure out what people are eating and tinkering with it in a non-harmful, non-invasive way. You can’t just get a bunch of orphans and raise them from birth on two different diets and see what happens. You get people to fill out questionnaires about what they eat and then see if they happen to drop dead in the next 40 or 50 years.

And that doesn’t even take into account the fact that “corn” can mean a dozen different things to different people. Someone whose ancestors were indigenous to North and South America may digest corn differently than someone from Europe, Africa, or Asia. Different people cook corn differently–we don’t typically use the traditional method of mixing it with lime (the mineral), which frees up certain nutrients and traditionally protected people from pellagra. We don’t all eat corn in the same combinations with other foods (look at the interaction between the calcium in milk and Vitamin D for one of the ways in which combining foods can complicate matters). And we aren’t necessarily even cooking the same “corn”: modern hybrid corns may not digest in exactly the same way as the corn people were growing a hundred or two hundred years ago.

Small differences are sometimes quite important, as we discovered when we realized the artificially-created trans fats we’d stuck into our foods to replace saturated fats were harming us (most clearly by raising the risk of heart disease)–our bodies were trying to use these fats like normal fats, but when we stuck them into our cell membranes, their wonky shapes fucked up the structure of the cells they were in. (On a chemical level, the differences between kinds of fats mostly come down to their shapes, and trans fats have been artificially modified into a shape they wouldn’t otherwise have.)
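
To see how badly confounding can bite the questionnaire-and-wait approach, here’s a tiny simulation–all numbers invented–in which a lurking factor (general health-consciousness, say) both reduces corn consumption and reduces disease risk. Corn does nothing at all in this toy world, yet a naive comparison makes it look harmful:

```python
import random

random.seed(0)

# Toy model: "health-consciousness" independently (a) lowers corn consumption
# and (b) lowers disease risk. Corn itself has NO effect on disease here,
# yet a naive comparison of corn-eaters vs. non-corn-eaters will show one.
def simulate_person():
    health_conscious = random.random() < 0.5
    eats_lots_of_corn = random.random() < (0.2 if health_conscious else 0.6)
    disease_risk = 0.05 if health_conscious else 0.20
    gets_disease = random.random() < disease_risk
    return eats_lots_of_corn, gets_disease

people = [simulate_person() for _ in range(100_000)]

def disease_rate(corn_status):
    """Fraction of people with the given corn habit who got the disease."""
    group = [sick for corn, sick in people if corn == corn_status]
    return sum(group) / len(group)

print(disease_rate(True))   # corn-eaters: roughly 16% get the disease
print(disease_rate(False))  # non-corn-eaters: roughly 10%
```

The corn-eaters come out with a noticeably higher disease rate (about 16% vs. 10%) even though corn has zero effect here–exactly the kind of trap observational diet studies fall into unless the confounders are measured and adjusted for.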

In short, this research is really hard, but I still encourage people to go do it and do it well.

 

Anyway, back on topic, here’s another quote from the Wikipedia, on the subject of using parasites to treat autoimmune disorders:

“While it is recognized that there is probably a genetic disposition in certain individuals for the development of autoimmune diseases, the rate of increase in incidence of autoimmune diseases is not a result of genetic changes in humans; the increased rate of autoimmune-related diseases in the industrialized world is occurring in too short a time to be explained in this way. There is evidence that one of the primary reasons for the increase in autoimmune diseases in industrialized nations is the significant change in environmental factors over the last century. …

Genetic research on the interleukin genes (IL genes) shows that helminths [certain kinds of parasites] have been a major selective force on a subset of these human genes. In other words, helminths have shaped the evolution of at least parts of the human immune system, especially the genes responsible for Crohn’s disease, ulcerative colitis, and celiac disease — and provides further evidence that it is the absence of parasites, and in particular helminths, that has likely caused a substantial portion of the increase in incidence of diseases of immune dysregulation and inflammation in industrialized countries in the last century. …

Studies conducted on mice and rat models of colitis, multiple sclerosis, type 1 diabetes, and asthma have shown helminth-infected subjects to display protection from the disease.”

 

Right, so I’m curious if acne falls into this category, too.