Re Nichols: Times the Experts were Wrong

In preparation for our review of The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters, I wanted to make a list of “times the experts were wrong.” Professor Nichols, if you ever happen to read this, I hope it gives you some insight into where we, the common people, are coming from. If you don’t happen to read it, it still gives me a baseline before reading your book.

Nichols devotes a chapter to the subject–expert failures are, he claims, “rare but spectacular when they do happen, like plane crashes.” (I may be paraphrasing slightly.)

How often are the experts wrong? (And how would we measure that?)

For starters, we have to define what “experts” are. Nichols might define experts as, “anyone who has a PhD in a thing or has worked in that field for 10 years,” but the general layman is probably much laxer in his definitions.

Now, Nichols’s argument that “experts” are correct most of the time is probably right, at least if we use a conservative definition of “expert”. We live in a society that is completely dependent on the collective expertise of thousands if not millions of people, and yet that society keeps running. For example, I do not know how to build a road, but road-building experts do, and our society has thousands of miles of functional roads. They’re not perfect, but they’re a huge improvement over dirt paths. I don’t know how to build a car, but car-building experts do, and so society is full of cars. From houses to skyscrapers, smartphones to weather satellites, electricity to plumbing: most of the time, these complicated systems get built and function perfectly well. Even airplanes, incredibly, don’t fall out of the sky most of the time (and according to Steven Pinker, they’re getting even better at it).

But these seem like the kind of experts that most people don’t second-guess too often (“I think you should only put three wheels on the car–and make them titanium,”) nor is this the sort of questioning that I think Nichols is really concerned about. Rather, I think Nichols is concerned about people second-guessing experts like himself whose opinions bear not on easily observed, physical objects like cars and roads but on abstract policies like “What should our interest rates be?” or “Should we bomb Syria?”

We might distinguish here between practical experts employed by corporations, whose expertise must be “proven” via production of actual products that people actually use, and academic experts whose products are primarily ideas that people can’t touch, test, or interact with.

For ordinary people, though, we must include another kind of expert: the writers of newspapers, magazines, TV programs, and textbooks, and even some well-respected bloggers. Most people read neither academic journals nor policy papers. They read Cosmo and watch daytime talk shows, not because they “hate experts” but because this is the level of information they can understand.

In other words, most people probably think Cosmo’s “style expert” and Donald Trump are as much “experts” as Tom Nichols. Trump is a “business expert” who is so expert he not only has a big tower with his name on it, they even let him hire and fire people on TV! Has anyone ever trusted Nichols’s expertise enough to give him a TV show about it?

Trump Tower is something people can touch–the kind of expertise that people trust. Nichols’s expertise is the Soviet Union (now Russia) and how the US should approach the threat of nuclear war and deterrence–not things you can easily build, touch, and test.

Nichols’s idea of “experts” is probably different from the normal person’s idea of “experts.” Nichols probably uses metrics like “How long has this guy been in the field?” and “Which journals has he been published in?” while normal people use metrics like “Did CNN call him an expert?” and “Did I read it in a magazine?” (I have actually witnessed people citing margarine advertisements as “nutrition advice.”)

If anything, I suspect the difference between “normal people’s idea of expert” and “Nichols’s idea of experts” is part of the tension Nichols is feeling, as for the first time, ordinary people like me who would in the past have been limited largely to discussing the latest newspaper headlines with friends can now pull up any academic’s CV and critique it online. “The people,” having been trained on daytime TV and butter ads, can now critique foreign policy advisers…

Let’s sort “people who distrust experts” into three main categories:

  1. Informed dissenters: People who have read a lot on a particular topic and have good reason to believe the expert consensus is wrong, eg, someone involved in nutrition research who began sounding warning bells about the dangers of partially hydrogenated fats in the ’80s.
  2. General contrarians: Other people are wrong. Music has been downhill ever since the Beatles. The schools are failing because teachers are dumb. Evolution isn’t real. Contrarians like to disagree with others and sometimes they’re correct.
  3. Tinfoil hatters: CHEMTRAILS POISON YOU. The Tinfoil hatters don’t think other people are dumb; they think others are actively conspiring against them.

People can fall into more than one category–in fact, being a General Contrarian by nature probably makes it much easier to be an Informed Dissenter. Gregory Cochran, for example, probably falls into both categories. (Scott Alexander, by contrast, is an informed dissenter but not contrarian.)

Tinfoil hatters get the most scorn, but even they are sometimes correct. If a Jew in 1930s Germany had said, “Gee, I think those Germans have it out for us,” they’d have been correct. A white South African today who thinks the black South Africans have it out for them is probably also correct.

So the first question is whether more people actually distrust experts, or if the spread of the internet has simply caused Nichols to interact with more people who distrust experts. For example, far more people in the ’80s were vocally opposed to the entire concept of “evolution” than are today, but they didn’t have the internet to post on. Nichols, a professor at the US Naval War College and the Harvard Extension School, probably doesn’t interact in real life with nearly as many people who are actively hostile to the entire edifice of modern science as the Kansas State Board of Education does, and thus he may have been surprised to finally encounter these people online.

But let’s get on with our point: a few cases where “the experts” have failed:

Part 1: Medicine and Doctors

Trans Fats

Artificially created trans (or partially hydrogenated) fats entered the American diet in large quantities in the 1950s. Soon nutrition experts, dieticians, healthcare philanthropists, and the federal government itself were all touting the trans fat mantra: trans fats like margarine or Crisco were healthier and better for you than the animal fats, like butter or lard, traditionally used in cooking.

Unfortunately, the nutrition experts were wrong. Trans fats are deadly. According to a study published in 1993 by the Harvard School of Public Health, trans fats are probably responsible for about 100,000 deaths a year–or a million every decade. (And that’s not counting the people who had heart attacks and survived because of modern medical care.)

The first people to question the nutritional orthodoxy on trans fats (in any quantity) were probably the General Contrarians: “My grandparents ate lard and my parents ate lard and I grew up eating lard and we turned out just fine! We didn’t have ‘heart attacks’ back in the ’30s.” After a few informed dissenters started publishing studies questioning the nutritional orthodoxy, nutrition’s near-endless well of tinfoil hatters began promoting their findings (if any field is perfect for paranoia about poisons and contaminants, well, it’s food.)

And in this case, the tinfoil hatters were correct: corporations really were promoting the consumption of something they by then knew was killing people, just because it made them money.

Tobacco

If you’re old enough, you remember not only the days of Joe Camel, but also Camel’s ads heavily implying that doctors endorsed smoking. Dentists recommended Viceroys, the filtered cigarettes. Camels were supposed to “calm the nerves” and “aid the digestion.” Physicians recommended “mell-o-wells,” the “health cigar.” Some brands were even supposed to cure coughs and asthma.

Now, these weren’t endorsements from actual doctors–if anything, the desire to give cigarettes a healthy sheen was probably driven by the accumulating evidence that they weren’t healthy–but when my grandmother took up smoking, do you think she was reading medical journals? No, she trusted that nice doctor in that Camel ad.

Chesterfield, though, claimed that actual doctors had confirmed that their cigarettes had no adverse health effects.

In the 70s, the tobacco companies found doctors willing to testify not that tobacco was healthy, but that there was no proof–or not enough data–to accuse it of being unhealthy.

Even when called before Congress in the 90s, tobacco companies kept insisting their products weren’t damaging. If the CEO of Philip Morris isn’t an expert on cigarettes, I don’t know who is.

The CDC estimates that 480,000 Americans die due to cigarettes per year, making them one of our leading killers.

Freudianism, recovered memories, multiple personality disorder, and Satanic Daycares

In retrospect, Freudian Psychoanalysis is so absurd, it’s amazing it ever became a widely-believed, mainstream idea. And yet it was.

For example:

In the early 1890s, Freud used a form of treatment based on the one that Breuer had described to him, modified by what he called his “pressure technique” and his newly developed analytic technique of interpretation and reconstruction. According to Freud’s later accounts of this period, as a result of his use of this procedure most of his patients in the mid-1890s reported early childhood sexual abuse. He believed these stories, which he used as the basis for his seduction theory, but then he came to believe that they were fantasies. He explained these at first as having the function of “fending off” memories of infantile masturbation, but in later years he wrote that they represented Oedipal fantasies, stemming from innate drives that are sexual and destructive in nature.[121]

Another version of events focuses on Freud’s proposing that unconscious memories of infantile sexual abuse were at the root of the psychoneuroses in letters to Fliess in October 1895, before he reported that he had actually discovered such abuse among his patients.[122] In the first half of 1896, Freud published three papers, which led to his seduction theory, stating that he had uncovered, in all of his current patients, deeply repressed memories of sexual abuse in early childhood.[123] In these papers, Freud recorded that his patients were not consciously aware of these memories, and must therefore be present as unconscious memories if they were to result in hysterical symptoms or obsessional neurosis. The patients were subjected to considerable pressure to “reproduce” infantile sexual abuse “scenes” that Freud was convinced had been repressed into the unconscious.[124] Patients were generally unconvinced that their experiences of Freud’s clinical procedure indicated actual sexual abuse. He reported that even after a supposed “reproduction” of sexual scenes the patients assured him emphatically of their disbelief.[125]

To sum up: Freud became convinced that patients had suffered sexual abuse.

The patients replied emphatically that they had not.

Freud made up a bunch of sexual abuse scenarios.

The patients insisted they remembered nothing of the sort.

Freud decided the memories must just be repressed.

Later, Freud decided the sexual abuse never actually happened, but that the repressed, inverted memories were of children masturbating to the thought of having sex with their parents.

So not only was Freud’s theory derived from nothing–directly contradicted by the patients he supposedly based it on–he took it a step further and actually denied the stories of patients who had been sexually abused as children.

Freud’s techniques may have been kinder than the psychology of the 1800s, which AFAIK involved locking insane people in asylums and stomping them to death, but there remains a cruel perversity to insisting that people have memories of horrible experiences they swear they don’t, and then turning around and saying that horrible things they clearly remember never happened.

Eventually Freudian psychoanalysis and its promise of “recovering repressed memories” morphed into the recovered traumatic memory movement of the 1980s, in which psychologists used hypnosis to convince patients they had been the victims of a vast world-wide Satanic conspiracy and that they had multiple, independent personalities that could only be accessed via hypnosis.

The Satanic Daycare conspiracy hysteria resulted in the actual conviction and imprisonment of real people for crimes like riding broomsticks and sacrificing elephants, despite a total lack of local dead elephants. Judges, lawyers, juries, and prosecutors found the testimony of “expert” doctors and psychologists (and children) convincing enough to put people in prison for running an underground, global network of “Satanic Daycares” that were supposedly raping and killing children. Eventually the hysteria got so bad that the FBI got involved, investigated, and found a big fat nothing. No sacrificial altars. No secret basements full of Satanic paraphernalia and torture devices. No dead elephants or giraffes. No magic brooms. No dead infants.

Insurance companies began investigating the extremely expensive claims of psychologists treating women with “multiple personality disorder” (many of whom had so degenerated while in the psychologists’ care that they had gone from employed, competent people to hospitalized mental patients). Amazingly, immediately after insurance companies decided the whole business was a scam and stopped paying for the treatment, the patients got better. Several doctors were sued for malpractice, and “multiple personality disorder” was removed from the DSM, the official list of psychological conditions; it has been replaced with DID, or dissociative identity disorder.

I wrote about the whole sordid business at length in Satanic Daycares: the scandal that should have never been, Part Two, and Part Three.

(Ironically, people attack psychiatry’s use of medications like Prozac, but if anything, these are the most evidence-based parts of mental care. At least you can collect data on things like “Does Prozac work better than placebo for making people feel better?” unlike Freudian psychoanalysis, which contained so many levels of “repression” and “transference” that there was always a ready excuse for why it wasn’t working–or for why “the patient got worse” was actually exactly what was supposed to happen.)

All Doctors pre-1900

One of West Hunter’s frequent themes is just how bad pre-modern medicine was:

Between 1839 and 1847, the First Clinic at the Vienna General Hospital had 20,204 births and 1,989 maternal deaths. The Second Clinic, attended by midwives, had 17,791 births and 691 maternal deaths. An MD’s care conferred an extra 6% chance of death. Births at home were even safer, with maternal mortality averaging about 0.5%.

In that period, MDs caused about 1200 extra deaths. …

We know that wounded men in the Civil War had a better chance of surviving when they managed to hide from Army surgeons. Think how many people succumbed to bloodletting, over the centuries.
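
Those figures check out if you run the arithmetic yourself. Here is a quick sketch (Python, using only the counts quoted above; the variable names are mine):

```python
# Re-deriving the Vienna General Hospital figures quoted above (1839-1847).
first_clinic_births, first_clinic_deaths = 20_204, 1_989      # clinic attended by MDs
second_clinic_births, second_clinic_deaths = 17_791, 691      # clinic attended by midwives

md_rate = first_clinic_deaths / first_clinic_births           # ~9.8% maternal mortality
midwife_rate = second_clinic_deaths / second_clinic_births    # ~3.9% maternal mortality

extra_risk = md_rate - midwife_rate                           # ~6 percentage points
extra_deaths = extra_risk * first_clinic_births               # ~1,200 extra deaths

print(f"MDs: {md_rate:.1%}, midwives: {midwife_rate:.1%}, extra deaths: {extra_deaths:.0f}")
```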

Ever wondered why Christian Scientists, who are otherwise quite pro-science, avoid doctors? It’s because their founder, Mary Baker Eddy (born in 1821) was often sick as a child. Her concerned parents dragged her to every doctor they could find, but poor Mary found that she got better when she stopped going to the doctors.

West Hunter gives a relevant description of pre-modern medicine:

Back in the good old days, Charles II, age 53, had a fit one Sunday evening, while fondling two of his mistresses.

Monday they bled him (cupping and scarifying) of eight ounces of blood. Followed by an antimony emetic, vitriol in peony water, purgative pills, and a clyster. Followed by another clyster after two hours. Then syrup of blackthorn, more antimony, and rock salt. Next, more laxatives, white hellebore root up the nostrils. Powdered cowslip flowers. More purgatives. Then Spanish Fly. They shaved his head and stuck blistering plasters all over it, plastered the soles of his feet with tar and pigeon-dung, then said good-night.

Tuesday. Ten more ounces of blood, a gargle of elm in syrup of mallow, and a julep of black cherry, peony, crushed pearls, and white sugar candy.

Wednesday. Things looked good: only senna pods infused in spring water, along with white wine and nutmeg.

Thursday. More fits. They gave him a spirituous draft made from the skull of a man who had died a violent death. Peruvian bark, repeatedly, interspersed with more human skull. Didn’t work.

Friday. The king was worse. He tells them not to let poor Nelly starve. They try the Oriental Bezoar Stone, and more bleeding. Dies at noon.

Homeopathy has a similar history: old medicines were so often poisonous that even if some of them worked, on average, you were probably better off eating sugar pills (which did nothing) than taking “real” medicines. But since people can’t market “pills with nothing in them,” homeopathy’s strange logic of “diluting medicine makes it stronger” was used to give the pills a veneer of doing something. (Freudian psychotherapy, to the extent that it “helped” anyone, was probably similar. Not that the practitioner himself brought anything to the table, but the idea of “I am having treatment so I will get better” plus the opportunity to talk about your problems probably helped some people.)

Today, “alternative” medical treatments like homeopathy and “faith healing” are less effective than conventional medicine, but for most of the past 2,000 years or so, you’d have been better off distrusting the “experts” (ie doctors) than trusting them.

It was only in the 20th century that doctors (or researchers) developed enough technology (vaccines, antibiotics, the germ theory of disease, nutrition science, insulin, trauma care, etc.) that doctors began saving more lives than they cost, but the business was still fraught.


Disclaimer: I have had the whole birth trifecta: natural birth without medication, vaginal birth with medication, and c-section. Natural birth was horrifically painful and left me traumatized. The c-section, while medically necessary, was almost as terrible. Recovery from natural (and medicated) birth was almost instant–within minutes I felt better; within days I was back on my feet and regaining mobility. The c-section left me in pain for a month, trying to nurse a new baby and care for my other children while on pain killers that made me feel awful and put me to sleep. Without the pain killers, I could barely sit up and get out of bed.

Medically necessary c-sections save lives, perhaps mine. I support them, but I do NOT support medically unnecessary c-sections.

The “international healthcare community” recommends a c-section rate of 10-15% (maybe 19%.) The US rate is over 30%. Half of our c-sections are unnecessary traumas inflicted on women.

In cases where c-sections are not medically necessary (low-risk pregnancies), c-sections carry more than triple the risk of maternal death (13 per 100,000 for c-sections vs. 3.5 per 100,000 for vaginal births). Medically necessary c-sections, of course, save more lives than they take.

Given 1,258,581 c-sections in the US in 2016, if half of those were unnecessary, then I estimate about 60 women per year die from unnecessary c-sections. That’s not the kind of death rate Semmelweis was fighting against when he tried to convince doctors they needed to wash their hands between dissecting corpses and delivering babies. (For his efforts he was branded a guy who didn’t believe the wisdom of experts, called crazy, and eventually put in an insane asylum, where he was literally stomped to death by the guards. Freudianism looks really good by comparison.)
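
For what it’s worth, here is the back-of-the-envelope arithmetic behind that “about 60” figure (a sketch only, assuming half of 2016’s c-sections were unnecessary and using the per-100,000 risk figures above; the variable names are mine):

```python
# Back-of-the-envelope estimate of maternal deaths from unnecessary c-sections.
# Assumptions: half of 2016's c-sections were unnecessary; risk figures as quoted above.
total_c_sections_2016 = 1_258_581
unnecessary_c_sections = total_c_sections_2016 / 2

risk_c_section = 13 / 100_000     # maternal deaths per low-risk c-section
risk_vaginal = 3.5 / 100_000      # maternal deaths per low-risk vaginal birth

excess_deaths = unnecessary_c_sections * (risk_c_section - risk_vaginal)
print(round(excess_deaths))       # ~60 deaths per year
```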

C-sections have other effects besides just death: they are more expensive, can get infected, and delay recovery. (I’ve also seen data linking them to an increased chance of post-partum depression.) For women who want to have more children, a c-section increases the chances of problems during subsequent pregnancies and deliveries.

Why do we do so many c-sections? Because in the event of misfortune, a doctor is more likely to get sued if he didn’t do a c-section (“He could have done more to save the baby’s life but chose to ignore the signs of fetal distress!”) than if he does do one (“We tried everything we could to save mother and baby.”) Note that this is not in the mother’s best interests, but the doctor’s.

Although I am obviously not a fan of natural childbirth (I favor epidurals), I am sympathetic to the movement’s principal logic: avoiding unnecessary c-sections by avoiding the doctors who perform them. These women are anti-experts, and I can’t exactly blame them.

At the intersection of the “natural food” and “natural birth” communities we find the anti-vaxers.

Now, I am unabashedly pro-vaccine (though I reserve the right to criticize any particular vaccine), but I still understand where the anti-vax crew is coming from. If doctors were wrong about bloodletting, are wrong about many c-sections (or push them on unsuspecting women to protect their own bottom lines), and were just plain wrong for decades about dangerous but lucrative artificial fats that they actively encouraged people to eat, who’s to say they’re right about everything else? Maybe some of the other chemicals we’re being injected with are actually harmful.

We can point to (and I do) massive improvements in public health and life expectancies as a result of vaccinations, but (anti-vaxers counter) how do we know these outcomes weren’t caused by other things, like the development of water treatment systems and sewers that ensured people weren’t drinking fecal-contaminated water anymore?

(I am also pro-not drinking contaminated water.)

Like concerns about impurities in one’s food, concerns about vaccinations make a certain instinctual sense: it is kind of creepy to inject people (mostly infants) with a serum composed of, apparently, dead germs and “chemicals.” The idea that exposing yourself to germs will somehow make you healthier is counter-intuitive, and hypodermic needles are a well-publicized disease vector.

So even though I think anti-vaxers are wrong, I don’t think they’re completely irrational.

 

This is the end of Part 1. We’ll continue with Part 2 on Wed.


Everything I’ve Read about Food, Summed up in One Graph:

A few years ago I went through a nutrition kick and read about a dozen books about food. Today I came across a graph that perfectly represents what I learned:

Basically, everything will kill you.

There are three major schools of thought on what’s wrong with modern diets: 1. fats, 2. carbs (sugars,) or 3. proteins.

Unfortunately, all food is composed of fats+carbs+proteins.

Ultimately, the best advice I came across was just to stop stressing out. We don’t really know the best foods to eat, and a lot of official health advice that people have tried to follow actually turned out to be quite bad, but we have a decent intuition that you shouldn’t eat cupcakes for lunch.

Dieting doesn’t really do much for the vast majority of people, but it’s a huge industry that sucks up a ton of time and money. How much you weigh has a lot more to do with factors outside of your control, like genetics or whether there’s a famine going on in your area right now.

You’re probably not going to do yourself any favors stressing out about food or eating a bunch of things you don’t like.

Remember the 20/80 rule: 80% of the effect comes from 20% of the effort, and vice versa. Eating reasonable quantities of good food and avoiding junk will do far more good than substituting chicken breast for chicken thighs in everything you cook.

There is definitely an ethnic component to diet–eg, people whose ancestors historically ate grain are better adapted to it than people whose ancestors didn’t. So if you’re eating a whole bunch of stuff your ancestors didn’t and you don’t feel so good, that may be the problem.

Personally, I am wary of refined sugars in my foods, but I am very sensitive to sugars. (I don’t even drink juice.) But this may just be me. Pay attention to your body and how you feel after eating different kinds of food, and eat what makes you feel good.

Why is our Society so Obsessed with Salads?

It’s been a rough day. So I’m going to complain about something totally mundane: salads.

I was recently privy to a conversation between two older women on why it is so hard to stay thin in the South: lack of good salads. Apparently when you go to a southern restaurant, they serve a big piece of meat (often deep-fried steak), a lump of mashed potatoes and gravy, and a finger-bowl with 5 pieces of iceberg lettuce, an orange tomato, and a slathering of dressing.

Sounds good to me.

Now, if you like salads, that’s fine. You’re still welcome here. Personally, I just don’t see the point. The darn things don’t have any calories!

From an evolutionary perspective, food obviously provides two things: calories and nutrients. There may be some foods that are mostly calories but few nutrients (eg, honey) and some that are nutrients but no calories (salt isn’t exactly a food, but it otherwise fits the bill).

Food doesn’t seem like it should be that complicated–surely we’ve evolved to eat effectively by now. So any difficulties we have (besides just getting the food) are likely us over-thinking the matter. There’s no problem getting people to eat high-calorie foods, because they taste good. It’s not hard to get people to eat salt, either–it also tastes good.

But people seem to have this ambivalent relationship with salads. What’s so important about eating a bunch of leaves with no calories and a vaguely unpleasant flavor? Can’t I just eat a nice potato? Or some corn? Or asparagus?

Don’t get me wrong. I don’t hate vegetables. Just everything that goes in a salad. Heck, I’ll even eat most salad fixins if they’re cooked. I won’t turn down fried green tomatoes, you know.

While there’s nothing wrong with enjoying a bowl of lettuce if that’s your thing, I think our society has gone down a fundamentally wrong collective path when it comes to nutrition wisdom. The idea here is that your hunger drive is this insatiable beast that will force you to consume as much food as possible, making you overweight and giving you a heart attack, and so the only way to save yourself is to trick the beast by filling your stomach with fluffy, zero-calorie plants until there isn’t any more room.

This seems to me like the direct opposite of what you should be doing. See, I assume your body isn’t an idiot, and can figure out whether you’ve just eaten something full of calories, and so should go sleep for a bit, or if you just ate some leaves and should keep looking for food.

I recently tried increasing the amount of butter I eat each day, and the result was I felt extremely full and didn’t want to eat dinner. Butter is a great way to almost arbitrarily increase the number of calories per volume of food.

If you’re wondering about my weight, well, let’s just say that despite the butter, never going on a diet, and abhorring salads, I’m still not overweight–but this is largely genetic. (I should note though that I don’t eat many sweets at all.)

Obviously I am not a nutritionist, a dietician, nor a doctor. I’m not a good source for health advice. But it seems to me that increasing or decreasing the number of sweets you eat per day probably has a bigger impact on your overall weight than adding or subtracting a salad.

But maybe I’m missing something.

Is Acne an Auto-Immune Disorder?

Like our lack of fur, acne remains an evolutionary mystery to me.

Do other furless mammals get acne? Like elephants or whales? Or even chimps; their faces don’t have fur. If so, everyone’s keeping it a secret–I’ve never even seen an ad for bonobo anti-acne cream, and with bonobos’ social lives, you know they’d want it. :)

So far, Google has returned no reports of elephants or whales with acne.

Now, a few skin blemishes here and there are not terribly interesting or mysterious. The weird thing about acne (IMO) is that it pops up at puberty*, and appears to have a genetic component.

Considering that kids with acne tend to feel rather self-conscious about it, I think it reasonable to assume that people with more severe acne have more difficulty with dating than people without. (Remember, some people have acne well into their 30s or beyond.)

Wouldn’t the non-acne people quickly out-compete the acne people, resulting in less acne among humans? (Okay, now I really want to know if someone has done a study on whether people with more acne have fewer children.) Since acne is extremely common and shows up right as humans reach puberty, this seems like a pretty easy thing to study, and a pretty easy place to find an effect if there is one.

Anyway, I totally remember a reference to acne in Dr. Price’s Nutrition and Physical Degeneration (one of my favorite books ever), but can’t find it now. Perhaps I am confusing it with Nutrition and Western Disease or a book with a similar title. At any rate, I recall a picture of a young woman’s back with a caption to the effect that none of the people in this tropical locale had acne, which the author could tell rather well, since this was one of those tropical locales where people typically walk around with rather little clothing.

The Wikipedia has this to say about the international incidence of acne:

“Rates appear to be lower in rural societies. While some find it affects people of all ethnic groups, it may not occur in the non-Westernized people of Papua New Guinea and Paraguay.

Acne affects 40 to 50 million people in the United States (16%) and approximately 3 to 5 million in Australia (23%). In the United States, acne tends to be more severe in Caucasians than people of African descent.”

I consider these more “hints” than “conclusive proof of anything.”

Back when I was researching hookworms, I ran across these bits:

“The [Hygiene Hypothesis] was first proposed by David P. Strachan who noted that hay fever and eczema were less common in children who belonged to large families. Since then, studies have noted the effect of gastrointestinal worms on the development of allergies in the developing world. For example, a study in Gambia found that eradication of worms in some villages led to increased skin reactions to allergies among children. … [bold mine.]

Moderate hookworm infections have been demonstrated to have beneficial effects on hosts suffering from diseases linked to overactive immune systems. … Research at the University of Nottingham conducted in Ethiopia observed a small subset of people with hookworm infections were half as likely to experience asthma or hay fever. Potential benefits have also been hypothesized in cases of multiple sclerosis, Crohn’s Disease and diabetes.”

So I got to thinking: if allergies and eczema are auto-immune reactions (I know someone in real life, at least, whose skin cracks to the point of bleeding if they eat certain foods, but who is otherwise fine if they don’t eat those foods), why not acne?

Acne is generally considered a minor problem, so people haven’t necessarily spent a ton of time researching it. Googling “acne autoimmune” gets me some Paleo-Dieter folks talking about curing severe cases with a paleo-variant (they’re trying to sell books, so they didn’t let on the details, but I suspect the details have to do with avoiding refined sugar, milk, and wheat.)

While I tend to caution against over-enthusiastic embrace of a diet one’s ancestors most likely haven’t eaten in thousands or tens of thousands of years, if some folks are reporting a result, then I’d love to see scientists actually test it and try to confirm or disprove it.

The problem with dietary science is that it is incredibly complicated, full of confounds, and most of the experiments you might think up in your head are completely illegal and impractical.

For example, scientists figured out that Pellagra is caused by nutritional deficiency–rather than an infectious agent–by feeding prisoners an all-corn diet until they started showing signs of gross malnutrition. (For the record, the prisoners joined the program voluntarily. “All the corn you can eat” sounded pretty good for the first few months.) Likewise, there was a program during WWII to study the effects of starvation–on voluntary subjects–and try to figure out the best way to save starving people, started because the Allies knew they would have a lot of very real starvation victims on their hands very soon.

These sorts of human experiments are no longer allowed. What a scientist can do to a human being is pretty tightly controlled, because no one wants to accidentally kill their test subjects and universities and the like don’t like getting sued. Even things like the Milgram Experiments would have trouble getting authorized today.

So most of the time with scientific studies, you’re left with using human analogs, which means rats. And rats don’t digest food the exact same way we do–Europeans and Chinese don’t digest food the exact same way, so don’t expect rats to do it the same way, either. An obvious oversight as a result of relying on animal models is that most animals can synthesize Vitamin C, but humans can’t. This made figuring out this whole Vitamin C thing a lot trickier.

Primates are probably a little closer, digestively, to humans, but people get really squeamish about monkey research, and besides, they eat a pretty different diet than we do, too. Gorillas are basically vegan (I bet they eat small bugs by accident all the time, of course,) and chimps have almost no body fat–this is quite remarkable, actually. Gorillas and orangutans have quite a bit of body fat, “normal” levels by human standards. Hunter-gatherers, agriculturalists, and sedentary butt-sitters like us have different amounts, but they still all have some. But chimps and bonobos have vanishingly little; male chimps and bonobos have almost zero body fat, even after being raised in zoos and fed as much food as they want.

Which means that if you’re trying to study diet, chimps and bonobos are probably pretty crappy human analogs.

(And I bet they’re really expensive to keep, relative to mice or having humans fill out surveys and promise to eat more carbs.)

So you’re left with trying to figure out what people are eating and tinkering with it in a non-harmful, non-invasive way. You can’t just get a bunch of orphans and raise them from birth on two different diets and see what happens. Instead, you get people to fill out questionnaires about what they eat and then see if they happen to drop dead in the next 40 or 50 years.

And that doesn’t even take into account the fact that “corn” can mean a dozen different things to different people. Someone whose ancestors were indigenous to North and South America may digest corn differently than someone from Europe, Africa, or Asia. Different people cook corn differently–we don’t typically use the traditional method of mixing it with lime (the mineral), which frees up certain nutrients and traditionally protected people from Pellagra. We don’t all eat corn in the same combinations with other foods (look at the interaction between the calcium in milk and Vitamin D for one of the ways combining foods can complicate matters). And we aren’t necessarily even cooking the same “corn”: modern hybrid corns may not digest in exactly the same way as the corn people were growing a hundred or two hundred years ago. Small differences are sometimes quite important, as we discovered when we realized the artificially-created trans fats we’d stuck in our foods to replace saturated fats were killing people. On a chemical level, the differences between kinds of fats mostly come down to their shapes, and trans fats have been artificially modified into a shape they wouldn’t otherwise have; our bodies tried to use them like normal fats, but when we stuck them into our cell walls, their wonky shapes fucked up the structure of the cells they were in.

In short, this research is really hard, but I still encourage people to go do it and do it well.

 

Anyway, back on topic, here’s another quote from the Wikipedia, on the subject of using parasites to treat autoimmune disorders:

“While it is recognized that there is probably a genetic disposition in certain individuals for the development of autoimmune diseases, the rate of increase in incidence of autoimmune diseases is not a result of genetic changes in humans; the increased rate of autoimmune-related diseases in the industrialized world is occurring in too short a time to be explained in this way. There is evidence that one of the primary reasons for the increase in autoimmune diseases in industrialized nations is the significant change in environmental factors over the last century. …

Genetic research on the interleukin genes (IL genes) shows that helminths [certain kinds of parasites] have been a major selective force on a subset of these human genes. In other words, helminths have shaped the evolution of at least parts of the human immune system, especially the genes responsible for Crohn’s disease, ulcerative colitis, and celiac disease — and provides further evidence that it is the absence of parasites, and in particular helminths, that has likely caused a substantial portion of the increase in incidence of diseases of immune dysregulation and inflammation in industrialized countries in the last century. …

Studies conducted on mice and rat models of colitis, multiple sclerosis, type 1 diabetes, and asthma have shown helminth-infected subjects to display protection from the disease.”

 

Right, so I’m curious if acne falls into this category, too.