Does the DSM need to be rewritten?

I recently came across an interesting paper that looked at the likelihood that a person, once diagnosed with one mental disorder, would later be diagnosed with another. (Exploring Comorbidity Within Mental Disorders Among a Danish National Population, by Oleguer Plana-Ripoll et al.)

This was a remarkable study in two ways. First, it had a sample size of 5,940,778, followed up for 83.9 million person-years–basically, the entire population of Denmark over 15 years. (Big Data indeed.)
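(A quick sanity check on what those person-years mean–my arithmetic, not the paper's:)

```python
# Average follow-up = total person-years / number of people
people = 5_940_778
person_years = 83.9e6

print(person_years / people)  # ≈ 14.1 years, consistent with a 15-year window
```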

Second, it found that for virtually every disorder, one diagnosis increased your chances of being diagnosed with a second disorder. (“Comorbid” is a fancy word for “two diseases or conditions occurring together,” not “dying at the same time.”) Some diseases were particularly likely to co-occur: people diagnosed with “mood disorders” had a 30% chance of also being diagnosed with “neurotic disorders” during the 15 years covered by the study.

Mood disorders include bipolar disorder, depression, and SAD (seasonal affective disorder);

Neurotic disorders include anxieties, phobias, and OCD.

Those chances were considerably higher for people diagnosed at younger ages, and decreased significantly for the elderly–those diagnosed with mood disorders before the age of 20 had over a 40% chance of also being diagnosed with a neurotic disorder, while those diagnosed after 80 had only a 5% chance.

I don’t find this terribly surprising–I know someone with at least five different psychological diagnoses–nor is it surprising that many people with “intellectual disabilities” also have “developmental disorders.” Still, it’s interesting just how pervasive comorbidity is across conditions that are ostensibly separate diseases.

This suggests to me that either many people are being misdiagnosed (perhaps diagnosis itself is very difficult), or what look like separate disorders are often actually a single disorder. While it is certainly possible, of course, for someone to have both a phobia of snakes and seasonal affective disorder, the person I know with five diagnoses most likely has only one “true” disorder that has simply been diagnosed and treated differently by different clinicians. It seems likely that some people’s depression also manifests itself as deep-rooted anxiety or phobias, for example.

While this is a bit of a blow to many psychiatric diagnoses (and I am quite certain that many diagnostic categories will need a fair amount of revision before all is said and done), autism recently got a validity boost–How brain scans can diagnose Autism with 97% accuracy.

The title is overselling it, but it’s interesting anyway:

Lead study author Marcel Just, PhD, professor of psychology and director of the Center for Cognitive Brain Imaging at Carnegie Mellon University, and his team performed fMRI scans on 17 young adults with high-functioning autism and 17 people without autism while they thought about a range of different social interactions, like “hug,” “humiliate,” “kick” and “adore.” The researchers used machine-learning techniques to measure the activation in 135 tiny pieces of the brain, each the size of a peppercorn, and analyzed how the activation levels formed a pattern. …

So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.

N=34 is not quite as impressive as N=Denmark, but it’s a good start.
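For the curious, here is roughly what “machine learning on 34 subjects” looks like in practice–a minimal sketch of leave-one-out classification on random stand-in data. The excerpt doesn’t specify the paper’s actual classifier, so logistic regression here is just a placeholder:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: 34 subjects x 135 brain-region activation levels.
# (The real study's features came from fMRI; these are random numbers.)
X = rng.normal(size=(34, 135))
y = np.array([0] * 17 + [1] * 17)  # 17 autistic, 17 neurotypical

# Leave-one-out: train on 33 subjects, predict the held-out 34th,
# repeat for every subject. 33/34 correct would be ~97% accuracy.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"accuracy: {scores.mean():.0%}")  # near chance on this random stand-in data
```

With only 34 subjects, leave-one-out is the natural choice: every subject gets a turn as the test case.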

Neanderthal DNA–hey!–what is it good for?

Quite a bit.

First, a bit of history:

[Map of Neanderthal DNA in humans]

It appears that there were (at least) 3 main cross-breeding events with Neanderthals. The first event most likely happened when one small band of humans had left Africa and ventured into the Middle East, where Neanderthals were living. The DNA acquired from that partnership can be found in all modern non-Africans, since they are all descended from this same group. (Since there has also been back-migration from the Middle East into Africa sometime in the past 70,000 years, many African groups also have a small amount of this DNA.)

Soon after, the group that became the Melanesians, Papuans, and Aborigines split off from the rest and headed east, where they encountered–and interbred with–the mysterious Denisovans, a third human species that we know mostly from DNA. Various sources claim this happened before the second Neanderthal interbreeding event, but the amount of admixed Neanderthal DNA in Oceanians suggests this is wrong.

Meanwhile, the rest of the non-African humans, probably still living in the Middle East or Eurasian Steppe, encountered a second band of Neanderthals, resulting in a second admixture event, shared by all Asians and Europeans, but not Melanesians &c. Then the Asians and Europeans went their separate ways, and the Asians encountered yet a third group of Neanderthals, giving them the highest rates of Neanderthal ancestry.

[Figure: human family tree showing Neanderthal and Denisovan admixture events (Nature, 2016)]

During their wanderings, some of these Asians encountered Melanesians, resulting in a little Denisovan DNA in today’s south Asians (especially Tibetans, who appear to have acquired some useful adaptations to Tibet’s high altitude from ancient Denisovans.)

There were other interbreeding events, including a much older one that left Homo sapiens DNA in Neanderthals, and one that produced Denny, a Neanderthal/Denisovan hybrid. There were also interbreeding events in Africa, involving as-yet unidentified hominins. (In the human family tree above, Melanesians are included within the greater Asian clade.)

Who married whom? So far, we’ve found no evidence of Neanderthal mitochondrial DNA–passed from mothers to their children–in modern humans, so the pairings most likely involved Neanderthal men and human women. But we have also found extremely little Neanderthal DNA on the Y chromosome–so it is likely that these pairings either produced only female children, or that their male children were infertile.

[Map: Denisovan allele distribution, via Anthropogenesis]

Interestingly, we find higher amounts of Neanderthal DNA in older skeletons, like the 40,000-year-old Tianyuan Man, or this fellow from Romania with 10% Neanderthal DNA, than in modern humans. Two potential explanations for the decrease: later mixing with groups that didn’t have Neanderthal DNA resulted in dilution, or people with more Neanderthal DNA just got out-competed by people with less.
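If the second explanation is right, the required selection pressure is surprisingly mild. A back-of-the-envelope sketch, with assumed numbers (roughly 10% ancestry 40,000 years ago falling to roughly 2% today, 25-year generations, constant selection):

```python
f0, f_now = 0.10, 0.02       # assumed ancestry fractions, then and now
generations = 40_000 / 25    # ~1,600 generations at 25 years each

# If ancestry decays by a constant factor (1 - s) per generation:
#   f_now = f0 * (1 - s) ** generations   =>   solve for s
s = 1 - (f_now / f0) ** (1 / generations)
print(f"s ≈ {s:.4%} per generation")  # ≈ 0.1% -- weak but persistent selection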

Gradual selection against Neanderthal DNA in humans seems likely, given the dearth of Neanderthal DNA on the Y chromosome; the number of diseases linked to Neanderthal DNA, including lupus, Crohn’s, cirrhosis, and type-2 diabetes; the fact that the morphological differences between Sapiens and Neanderthals are large enough that we classify them as different species; and the fact that Neanderthals had larger craniums than Sapiens, while Sapiens women attempting to give birth to hybrid children still had regular old Sapiens pelvises.

However, the Neanderthals and Denisovans probably contributed some useful DNA that has been sorted out of the general mix and come down the ages to us. For example, the trait that allows Tibetans to live at high altitudes likely came from a Denisovan ancestor:

Researchers discovered in 2010 that Tibetans have several genes that help them use smaller amounts of oxygen efficiently, allowing them to deliver enough of it to their limbs while exercising at high altitude. Most notable is a version of a gene called EPAS1, which regulates the body’s production of hemoglobin. They were surprised, however, by how rapidly the variant of EPAS1 spread—initially, they thought it spread in 3000 years through 40% of high-altitude Tibetans, which is the fastest genetic sweep ever observed in humans—and they wondered where it came from.
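To see why that counts as fast, here is a rough deterministic estimate of the selection coefficient involved. The starting frequency (1%) and the simple additive model are my assumptions for illustration, not numbers from the study:

```python
import math

p0, pt = 0.01, 0.40            # assumed starting and observed final frequencies
generations = 3000 / 25        # 3,000 years ≈ 120 generations

# Simple additive selection: t = (1/s) * ln[ pt(1-p0) / (p0(1-pt)) ]
# Solve for the selection coefficient s:
s = math.log((pt * (1 - p0)) / (p0 * (1 - pt))) / generations
print(f"s ≈ {s:.3f}")  # ≈ 0.035 -- a few percent per generation
```

A selection coefficient of a few percent per generation is very strong by evolutionary standards; most observed sweeps are far weaker.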

Modern humans have Neanderthal DNA variants for keratin (a protein found in skin, nails, hair, etc.,) and UV-light adaptations that likely helped us deal with the lower light levels found outside Africa. There’s circumstantial evidence that microcephalin D could have Neanderthal origins (it appeared about 37,000 years ago and is located primarily outside of Africa,) but no one has found microcephalin D in a Neanderthal, so this has not been proven. (And, indeed, another study has found that Neanderthal DNA tends not to be expressed in the brain.)

On the other hand, Neanderthal admixture affected Sapiens’ skull shapes:

Here, using MRI in a large cohort of healthy individuals of European-descent, we show that the amount of Neanderthal-originating polymorphism carried in living humans is related to cranial and brain morphology. First, as a validation of our approach, we demonstrate that a greater load of Neanderthal-derived genetic variants (higher “NeanderScore”) is associated with skull shapes resembling those of known Neanderthal cranial remains, particularly in occipital and parietal bones. Next, we demonstrate convergent NeanderScore-related findings in the brain (measured by gray- and white-matter volume, sulcal depth, and gyrification index) that localize to the visual cortex and intraparietal sulcus. This work provides insights into ancestral human neurobiology and suggests that Neanderthal-derived genetic variation is neurologically functional in the contemporary population.

(Not too surprising, given Neanderthals’ enormous craniums.)

Homo sapiens also received Neanderthal genes affecting the immune system, which were probably quite useful when encountering new pathogens outside of Africa, and genes for the “lipid catabolic process,” which probably means they were eating new, fattier diets that Neanderthals were better adapted to digest.

Even Neanderthal-derived traits that today we cast as problems, like Type II Diabetes and depression, might have been beneficial to our ancestors:

“Depression risk in modern human populations is influenced by sunlight exposure, which differs between high and low latitudes, and we found enrichment of circadian clock genes near the Neanderthal alleles that contribute most to this association.”

Why would we find an association between Neanderthal DNA and circadian clock genes? Neanderthals had hundreds of thousands of years more exposure to Europe’s long nights and cold winters than Homo sapiens; it is unlikely that they developed these adaptations in order to become less well-adapted to their environment. It is more likely that Neanderthals downregulated their activity levels during the winter–to put it colloquially, they hibernated.

No problem for furry hunter-gatherers who lived in caves–much more problematic for information age workers who are expected to show up at the office at 9 am every day.

Type II diabetes impairs metabolism by decreasing the production of (and the body’s sensitivity to) insulin, which is necessary for transporting carbs (glucose) into cells so they can be transformed into energy. However, your body can make up for a total lack of carbs via ketosis–essentially converting fats into energy.

Our hunter-gatherer ancestors–whether Neanderthal or Sapiens–didn’t eat a lot of plants during the European and Siberian winters, because not a lot of plants grow during the winter. If they were lucky enough to eat at all, they ate meat and fat, like the modern Inuit and Eskimo.

And if your diet is meat and fat, then you don’t need insulin–you need ketosis and maybe some superior lipid digestion. (Incidentally, the data on ketogenic diets and type II diabetes looks pretty good.)

In sum, Neanderthal and Denisovan DNA, while not always useful, seems to have helped Homo sapiens adapt to colder winters, high altitudes, new pathogens, and new foods–and may even have changed how we think and perceive the world.


Homeostasis, personality, and life (part 2)

Warning: This post may get a little fuzzy, due to discussion of things like personality, psychology, and philosophy.

Yesterday we discussed homeostatic systems for normal organism/organization maintenance and defense, as well as pathological malfunctions of over- or under-response from the homeostatic systems.

But humans are not mere action-reaction systems; they have qualia, an inner experience of being.

One of my themes here is the idea that various psychological traits, like anxiety, guilt, depression, or disgust, might not be just random things we feel, but exist for evolutionary reasons. Each of these emotions, when experienced moderately, may have beneficial effects. Guilt (and its cousin, shame,) helps us maintain our social relationships with other people, aiding in the maintenance of large societies. Disgust protects us from disease and helps direct sexual interest at one’s spouse, rather than random people. Anxiety helps people pay attention to crucial, important details, and mild depression may help people concentrate, stay out of trouble, or–very speculatively–have helped our ancestors hibernate during the winter.

In excess, each of these traits is damaging, but a shortage of each trait may also be harmful.

I have commented before on the remarkable statistic that 25% of women are on anti-depressants, and if we exclude women over 60 (and under 20), the number of women with an “anxiety disorder” jumps to over 30%.

The idea that a full quarter of us are actually mentally ill is simply staggering. I see three potential causes for the statistic:

  1. Doctors prescribe anti-depressants willy-nilly to everyone who asks, whether they’re actually depressed or not;
  2. Something about modern life is making people especially depressed and anxious;
  3. Mental illnesses are side effects of common, beneficial conditions (similar to how sickle cell anemia is a side effect of protection from malaria.)

As you probably already know, the sickle cell mutation protects carriers from malaria. Imagine a population where 100% of people are sickle cell carriers–that is, they have one mutated gene and one regular gene. The next generation in this population will be roughly 25% people with two regular genes (who die of malaria), 50% people with one sickle cell gene and one regular gene (who are protected), and 25% people with two sickle cell genes (who die of sickle cell anemia). (I’m sure this is a very simplified scenario.)
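That 25/50/25 split is just a Punnett square over two carrier parents; a tiny sketch of the enumeration (standard Mendelian arithmetic, nothing from any particular study):

```python
from collections import Counter
from itertools import product

# Both parents are carriers: one normal allele 'A', one sickle allele 'S'.
parent = ("A", "S")

# A child inherits one allele from each parent; all four combos are equally likely.
kids = Counter("".join(sorted(pair)) for pair in product(parent, parent))

for genotype, count in sorted(kids.items()):
    print(f"{genotype}: {count / 4:.0%}")
# AA: 25% (vulnerable to malaria), AS: 50% (protected carriers), SS: 25% (anemia)
```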

So I consider it technically possible for 25% of people to suffer a pathological genetic condition, but unlikely–malaria is a particularly ruthless killer compared to being too cheerful.

Skipping to the point, I think there’s a little of all three going on. Each of us probably has some kind of personality “set point” that is basically determined by some combination of genetics, environmental assaults, and childhood experiences. People deviate from their set points due to random stuff that happens in their lives, (job promotions, visits from friends, car accidents, etc.,) but the way they respond to adversity and the mood they tend to return to afterwards is largely determined by their “set point.” This is all a fancy way of saying that people have personalities.

The influence of random chance on these genetic/environmental factors suggests that there should be variation in people’s emotional set points–we should see that some people are more prone to anxiety, some less prone, and some of average anxiousness.

Please note that this is a statistical should, in the same sense that, “If people are exposed to asbestos, some of them should get cancer,” not a moral should, as in, “If someone gives you a gift, you should send a thank-you note.”

Natural variation in a trait does not automatically imply pathology, but being more anxious or depressive or guilt-ridden than others can be highly unpleasant. I see nothing wrong, a priori, with people doing things that make their lives more pleasant and manageable (and don’t hurt others); this is, after all, why I enjoy a cup of coffee every morning. If you are a better, happier, more productive person with medication (or without it,) then carry on; this post is not intended as a critique of anyone’s personal mental health management, nor a suggestion for how to take care of your mental health.

Our medical/psychological health system, however, operates on the assumption that medications are for pathologies only. There is no form to fill out that says, “Patient would like anti-anxiety drugs in order to live a fuller, more productive life.”

That said, all of these emotions are obviously responses to actual stuff that happens in real life, and if 25% of women are coming down with depression or anxiety disorders, I think we should critically examine whether anxiety and depression are really the diseases we need to be treating, or whether they are the body’s responses to some external threat.

I am reminded here of Peter Frost’s On the Adaptive Value of “Aw Shucks”:

In a mixed group, women become quieter, less assertive, and more compliant. This deference is shown only to men and not to other women in the group. A related phenomenon is the sex gap in self-esteem: women tend to feel less self-esteem in all social settings. The gap begins at puberty and is greatest in the 15-18 age range (Hopcroft, 2009).

If more women enter the workforce–either because they think they ought to or because circumstances force them to–and the workforce triggers depression, then as the percent of women formally employed goes up, we should see a parallel rise in mental illness rates among women. Just as Adderall and Ritalin help little boys conform to the requirements of modern classrooms, Prozac and Lithium help women cope with the stress of employment.

As we discussed yesterday, fever is not a disease, but part of your body’s system for re-asserting homeostasis by killing disease microbes and making it more difficult for them to reproduce. Extreme fevers are an over-reaction and can kill you, but a normal fever below 104 degrees or so is merely unpleasant and should be allowed to do its work of making you better. Treating a normal fever (trying to lower it) interferes with the body’s ability to fight the disease and results in longer sicknesses.

Likewise, these sorts of emotions, while definitely unpleasant, may serve some real purpose.

We humans are social beings (and political animals.) We do not exist on our own; historically, loneliness was not merely unpleasant, but a death sentence. Humans everywhere live in communities and depend on each other for survival. Without refrigeration or modern storage methods, saving food was difficult. (Unless you were an Eskimo.) If you managed to kill a deer while on your own, chances are you couldn’t eat it all before it began to rot, and then your chances of killing another deer before you started getting seriously hungry were low. But if you share your deer with your tribemates, none of the deer goes to waste, and if they share their deer with you, you are far less likely to go hungry.

If you end up alienated from the rest of your tribe, there’s a good chance you’ll die. It doesn’t matter if they were wrong and you were right; it doesn’t matter if they were jerks and you were the nicest person ever. If you can’t depend on them for food (and mates!) you’re dead. This is when your emotions kick in.

People complain a lot that emotions are irrational. Yes, they are. They’re probably supposed to be. There is nothing “logical” or “rational” about feeling bad because someone is mad at you over something they did wrong! And yet it happens. Not because it is logical, but because being part of the tribe is more important than who did what to whom. Your emotions exist to keep you alive, not to prove rightness or wrongness.

This is, of course, an oversimplification. Men and women have been subject to different evolutionary pressures, for example. But this is close enough for the purposes of the current conversation.

If modern people are coming down with mental illnesses at astonishing rates, then maybe there is something about modern life that is making people ill. If so, treating the symptoms may make life more bearable for people while they are subject to the disease, but still does not fundamentally address whatever it is that is making them sick in the first place.

It is my own opinion that modern life is pathological, not (in most cases,) people’s reactions to it. Modern life is pathological because it is new, and therefore you aren’t adapted to it. Your ancestors have probably only lived in cities of millions of people for a few generations at most (chances are good that at least one of your great-grandparents was a farmer, if not all of them.) Naturescapes are calming and peaceful; cities are noisy, crowded, and full of pollution. There is a reason why schizophrenia is more common in cities than on farms. This doesn’t mean that we should just throw out cities, but it does mean we should be thoughtful about them and their effects.

People seem to do best, emotionally, when they have the support of their kin, some degree of ethnic or national pride, and economic and physical security; attend religious services; and avoid crowded cities. (Here I am, an atheist, recommending church for people.) The knowledge that you are at peace with your tribe and that your tribe has your back seems almost entirely absent from most people’s modern lives; instead, people are increasingly pushed into environments where they have no tribe and most people they encounter in daily life have no connection to them. Indeed, tribalism and city living don’t seem to get along very well.

To return to healthy lives, we may need to re-think the details of modernity.

Politics

Philosophically and politically, I am a great believer in moderation and virtue as the ethical, conscious application of homeostatic systems to the self and to organizations that exist for the sake of humans. Please understand that this is not moderation in the conventional sense of “sometimes I like the Republicans and sometimes I like the Democrats,” but the self-moderation necessary for bodily homeostasis reflected at the social/organizational/national level.

For example, I have posted a bit on the dangers of mass immigration, but this is not a call to close the borders and allow no one in. Rather, I suspect that there is an optimal amount–and kind–of immigration that benefits a community (and this optimal quantity will depend on various features of the community itself, like size and resources.) Thus, each community should aim for its optimal level. But since virtually no one–certainly no one in a position of influence–advocates for zero immigration, I don’t devote much time to writing against it; it is only mass immigration that is getting pushed on us, and thus mass immigration that I respond to.

Similarly, there is probably an optimal level of communal genetic diversity. Too low, and inbreeding results. Too high, and fetuses miscarry due to incompatible genes. (Rh- mothers have difficulty carrying Rh+ fetuses, for example, because their immune systems identify the fetus’s blood as foreign and therefore attack it, killing the fetus.) As in agriculture, monocultures are at great risk of getting wiped out by disease; genetic heterogeneity helps ensure that some members of a population can survive a plague. Homogeneity helps people get along with their neighbors, but too much may lead to everyone thinking through problems in similar ways. New ideas and novel ways of attacking problems often come from people who are outliers in some way, including genetics.

There is a lot of talk ’round these parts that basically blames all the crimes of modern civilization on females. Obviously I have a certain bias against such arguments–I of course prefer to believe that women are superbly competent at all things, though I do not wish to stake the functioning of civilization on that assumption. If women are good at math, they will do math; if they are good at leading, they will lead. A society that tries to force women into professions they are not inclined to is out of kilter; likewise, so is a society where women are forced out of fields they are good at. Ultimately, I care about my doctor’s competence, not their gender.

In a properly balanced society, male and female personalities complement each other, contributing to the group’s long-term survival.

Women are not accidents of nature; they are as they are because their personalities succeeded where women with different personalities did not. Women have a strong urge to be compassionate and nurturing toward others, maintain social relations, and care for those in need of help. These instincts have, for thousands of years, helped keep their families alive.

When the masculine element becomes too strong, society becomes too aggressive: crime goes up; unwinnable wars are waged; people are left to die. When the feminine element becomes too strong, society becomes too passive: invasions go unresisted; welfare spending becomes unsustainable. Society can’t solve this problem by continuing to give both sides everything they want (this is likely to be economically disastrous), but must actually find a way to direct them and curb their excesses.

I remember an article on the now-defunct Neuropolitics (now that I think of it, the Wayback Machine probably has it somewhere) about an experiment in which groups with varying numbers of “liberals” and “conservatives” had to work together to accomplish tasks. The “conservatives” tended to solve their problems by creating hierarchies that organized their labor, with the leader/s giving everyone specific tasks. The “liberals” solved their problems by incorporating new members until they had enough people to solve specific tasks. The groups that performed best, overall, were those that had a mix of ideologies, allowing them to both make hierarchical structures to organize their labor and incorporate new members when needed. I don’t remember much else of the article, nor did I read the original study, so I don’t know what exactly the tasks were, or how reliable this study really was, but the basic idea of it is appealing: organize when necessary; form alliances when necessary. A good leader recognizes the skills of different people in their group and uses their authority to direct the best use of these skills.

Our current society greatly lacks in this kind of coherent, organizing direction. Most communities have very little in the way of leadership–moral, spiritual, philosophical, or material–and our society seems constantly intent on attacking and tearing down any kind of hierarchies, even those based on pure skill and competence. Likewise, much of what passes for “leadership” is people demanding that you do what they say, not demonstrating any kind of competence. But when we do find competent leaders, we would do well to let them lead.

Back to part one.

Thoughts on Frost’s The Adaptive Value of “Aw Shucks”

Peter Frost recently posted on female shyness among men–more specifically, on the observation that adolescent white females appear to become very shy among groups of males and suffer depression, but adolescent black females don’t.

Frost theorizes that women are instinctually deferential to men, especially when they are economically dependent on them, and that whites show more of this deference than blacks because traditional white marriage patterns (monogamy) brought women into more contact with men, and made them more economically dependent on men, than traditional African marriage patterns (polygyny); therefore white women have evolved to be shyer.

This explanation is decent, but feels incomplete.

Did anyone bother to ask the girls why they felt shy around the boys? Probably someone has, but that information wasn’t included in the post. But I can share my own experiences.

For starters, I’ve never felt–and this may just be me–particularly shyer around males than around females, nor do I recall ever talking less in highschool due to class composition. Rather, the amount I talked had entirely to do with how much I liked the subject matter vs. how tired I was. However, in non-school settings, I am less likely to talk when conversations are dominated by men, simply because men tend to talk about things I find boring, like cars, sports, or finance. (I suspect I have an unusually high tolerance for finance/economic discussions for a female, but there are limits to what even I can stand, and the other two topics drive me to tears of boredom. Sports, as far as I am concerned, are the Kardashians of men.) I am sure the same is true in reverse–when groups of women get together, they talk about stuff that men find horribly dull.

Even in classroom conversations that are ostensibly led by the teacher, male students may make responses that just aren’t interesting to the female students, leading to the females getting bored or having little to say in response.

So, do black adolescent girls and boys have more conversation topics in common than whites?

Second, related to Frost’s observations, men tend to be more aggressive while talking than women. They are louder, they interrupt more, they put less effort into assuaging people’s feelings, etc. I am sure women do things men find annoying, like rambling on forever without getting to the point or talking about their feelings in weirdly associative ways. Regardless, I suspect that women/adolescents (at least white ones) often find the male style overwhelming, and their response is to retreat.

When feminists say they need “safe spaces” away from men to discuss their feminism things, they aren’t entirely inaccurate. It’s just that society used to have these “safe spaces” for women back before the feminists themselves destroyed them! Even now, it is easy to join a Mommy Meetup group or find an all-female Bible study club. But, oh wait, these are regressive! What we need are all-female lawyers, or doctors, or mathematicians…

*Ahem* back on subject, if testosterone => aggression, it would be interesting to see if the difference in black vs white females is simply a result of different testosterone levels (though of course that is just kicking the ball back a bit, because we then must ask what causes different testosterone levels.)

I suspect that Frost is on the right track looking at polygyny vs. monogamy, but I think his mechanism (increased time around/dependence on men => increase shyness) is incomplete. He’s missed something from his own work: polygynous males have higher testosterone than monogamous ones (even within their own society.) (See: The Contradictions of Polygyny and Polygyny Makes Men Bigger, Tougher, and Meaner.) Even if women in polygynous societies were expected to behave exactly like women from monogamous societies, I’d expect some “spillover” effect from the higher testosterone in their men–that is, everyone in society ought to have higher testosterone levels than they would otherwise.

Additionally, let us consider that polygyny is not practiced the same everywhere. In the Middle East, sexual access to women is tightly controlled–to the point where women may be killed for extra-marital sexual activity. In this case, the women are effectively monogamous, while the men are not. By contrast, in the societies Frost describes from Sub-Saharan Africa, it sounds like both men and women have a great many sexual partners during adolescence and early adulthood (which explains the high STD rates.)

If polygyny increases male aggression and testosterone levels because the men have to invest more energy into finding mates, then it stands to reason that women who have lots of mates are also investing lots of energy into finding them, and so would also have increased levels of aggression and testosterone.

Speaking again from personal experience, I observed that my own desire to talk to men basically cratered after I got married (and then had kids.) Suddenly something about it seemed vaguely tawdry. Of course, this leaves me in a bit of a pickle, because there aren’t that many moms who want to discuss HBD or related topics. (Thankfully I have the internet, because talking to words on a screen is a very different dynamic.) Of course, if I were back on the dating market again (god forbid!) I’d have to talk to lots of men again.

So I think the equation here shouldn’t be +time with men => +shyness, -time with men => -shyness, but +pursuit of partners => +aggression, -pursuit of partners => -aggression.

None of this gets into the “depression” issue. What’s up with that?

Personally, while I felt plenty of annoying things during highschool, the only ones triggered by boys were of the wanting to fall in love variety and the feeling sad if someone didn’t like me variety. I did feel some distress over wanting the adults to treat me like an adult, but that has nothing to do with boys. But this may just be me being odd.

We know that whites, women, and the subset of white women suffer from depression, anxiety, and other forms of mental illness at higher rates than blacks, men, and pretty much everyone else. I speculate that anxiety, shyness, disgust, and possibly even depression are part of a suite of traits that help women avoid male aggression, perform otherwise dull tasks like writing English papers or washing dishes, keep out of trouble, and stay interested in their husbands and only their husbands.

In a society where monogamy is enforced, people (or their parents) may even preferentially choose partners who seem unlikely to stray–that is, women (or men) who display little interest in actively pursuing the opposite sex. So just as women in polygamous societies may be under selective pressure to become more aggressive, women in monogamous societies may be under selective pressure to have less interest in talking to men.

Eventually, you get Japan.

Amusingly, the studies Frost quotes view white female shyness as a bad thing to be corrected, and black female non-shyness as a good thing that mysteriously exists despite adverse conditions. But what are the effects of white female shyness? Do white women go to prison, become pregnant out of wedlock, or get killed by their partners at higher rates than black women? Do they get worse grades, graduate from school at lower rates, or end up in worse professions?

Or maybe shy girls are perfectly fine the way they are and don’t need fixing.


The Utility of Anxiety

Disclaimer time: I am not a doctor. I am not a psychologist/psychiatrist. If you have a mental illness/disorder/concerns, take them up with a trained professional who knows what they’re talking about. For the love of god, DO NOT make medical/mental health decisions based on my speculative babbling about what might have been useful to our ancestors.

Carrying on…

Americans are an anxious people.

According to the Kim Foundation (I don’t actually know who they are, but they are the first hit that comes up when you Google “Percent Americans with anxiety,”) about 18% of us have some form of anxiety disorder, such as, “panic disorder, obsessive-compulsive disorder, post-traumatic stress disorder, generalized anxiety disorder, and phobias.”

An additional 10% of us have mood disorders, e.g., “major depressive disorder, dysthymic disorder, and bipolar disorder.”

(The Anxiety and Depression Association of America gives the same stat, citing the National Institute of Mental Health as their source.) The NIMH made some lovely graphs:

[NIMH charts: prevalence of anxiety disorders among US adults]

Also from the NIMH:

[NIMH graph: any mental illness among US adults (NSDUH, 2012)]

There’s a lot of interesting data in this graph. For simplicity’s sake, from here on out, when I say, “Women,” I am referring primarily to “white women,” but remember that no group is entirely lacking in crazy.

Also, the graphs for mood disorders:

[NIMH charts: prevalence of mood disorders among US adults]

Now, you’re probably thinking, “Wait a minute, those numbers don’t add up!”

They don’t have to add up. You can get diagnosed with two things at once. Or five. It just depends on how often you go pester the shrinks.

It’s no secret that women are kind of crazy, but I still find the numbers a little shocking. According to the Huffington Post, 25% of women are on psychiatric drugs of some sort. The article also claims that, “One in four women is on antidepressants,” so I guess 100% of women taking psychiatric drugs are on anti-depressants, or the math got fucked up somewhere.

Why do 22-25% of women feel so bad that they need psychiatric medication just to deal with their lives? (Not to mention 15% of men.)

Some quick possibilities:

1. Shrinks are handing out pills like crazy, whether patients are actually mentally ill or not, because who wouldn’t like to be happier and better-adjusted?

2. Something about modern life makes people (especially white women) very anxious.

3. Highly anxious people are a side effect of low infant mortality + the baby boom expanding the class of parents.

4. Anxiety/depression are actually adaptive, and so we are supposed to feel this way.

5. Some combination of all of the above.

Personally, I lean toward #5.

Now, a quick aside: I don’t really like feelings. Oh, sure, I’m okay with the good ones. Happiness, love, joy, enthusiasm, sure, I like those. But the rest of the feelings I could generally do without. I especially dislike other people’s emotions. “I am having a sad,” translates all too quickly into, “I am yelling at you.” So, as I stated at the beginning, if you think you need help handling your emotions, or the people around you think you do, please consider getting help. You don’t have to live in pain.

That said, I think anxiety is supposed to serve some purpose that modern conditions have gotten out of whack.

I have already posted about how depression, in small quantities, may help keep us out of trouble and sleep through the long European winters. In general, there are a lot of traits where I think a little bit may be beneficial, even though a lot is damaging.

So what purpose could anxiety serve?

According to WebMD, the most common causes of anxiety include:

  • Stress at work
  • Stress from school
  • Stress in a personal relationship such as marriage
  • Financial stress
  • Stress from an emotional trauma such as the death of a loved one
  • Stress from a serious medical illness
  • Side effect of drugs, legal or otherwise
  • Medical symptom, e.g., low oxygen

The last three I consider perfectly rational biological responses–it’s very understandable that someone who can’t breathe feels anxious. But other than coffee, I doubt these are seriously affecting the overall anxiety rates.

That leaves us with “stress,” (which is basically a synonym for “anxiety”) from pretty much every part of life. Almost 20% of women cannot cope with work/school/relationships/finances without medication. It is tempting, therefore, to think that our entire modern lifestyle, from large, dense cities to two-income households could not exist without medicating women into not freaking out.

But why would they freak out in the first place?

Biochemically, “stress” is the feeling of your body responding to threatening or potentially threatening situations via your “fight or flight” response. In nature, fight or flight is very useful: it prepares you to run for your life or fight to the death. According to Wikipedia, Fight or Flight works like this:

The reaction begins in the amygdala, which triggers a neural response in the hypothalamus. The initial reaction is followed by activation of the pituitary gland and secretion of the hormone ACTH. The adrenal gland is activated almost simultaneously and releases the neurotransmitter epinephrine. The release of chemical messengers results in the production of the hormone cortisol, which increases blood pressure, blood sugar, and suppresses the immune system. The initial response and subsequent reactions are triggered in an effort to create a boost of energy. This boost of energy is activated by epinephrine binding to liver cells and the subsequent production of glucose. Additionally, the circulation of cortisol functions to turn fatty acids into available energy, which prepares muscles throughout the body for response. Catecholamine hormones, such as adrenaline (epinephrine) or noradrenaline (norepinephrine), facilitate immediate physical reactions associated with a preparation for violent muscular action.

Oh, look, it’s our old friend, the amygdala! (See also here, here and here.)

According to Neuropolitics,

The basolateral amygdala has been linked to conditioned fear and disgust learning, while the central amygdala has been linked to conditioned fear learning. … liberals had elevated amygdalar responses to the viewing of a political commercial about nuclear war.

Hart et al. (2000) selected an equal number of blacks and whites, repeatedly showing them pictures of white and black faces while performing fMRI. They noted: “across all subjects, we observed significantly greater…BOLD signal in the amygdala to outgroup vs ingroup faces, but only during later stimulus presentations. …

Further, Phelps found that activation in the left amygdala and right amygdala (all the way to the insular cortex) were correlated with a negative bias towards black faces on the Implicit Association Test.”

Last time I took an implicit association test, it told me that I prefer fat people over skinny and blacks over whites. I don’t know why everyone else fails these things.

the only region that was activated in both the Implicit Association and Startle Eyeblink tests was the left-superior amygdala. … Phelps noted: “the region in the amygdala most strongly correlated with negative evaluation [of black faces] was the left-superior amygdala”.

Richeson et al. (2003) performed an fMRI investigation of the impact of interracial contact on executive function, and uncovered a critical finding with regards to racial prejudice: it is inhibited by right hemispheric neural networks such as the dorsal lateral prefrontal cortex and anterior cingulate. Richeson’s findings of a right-hemispheric network that inhibits racial prejudice shows the push-pull mechanism of the amygdala and the dorsolateral prefrontal cortex, especially on the right side.

… Cunningham used two different exposure periods: a subconscious exposure of 30 milliseconds; and a conscious exposure of 525 milliseconds. During the subconscious exposure, which was not long enough for most of the subjects to even be aware of the black and white face photos, Cunningham found the right amygdala to be activated in the black minus white condition, … Longer presentations of racial stimuli favor activation in the left amygdala, at least according to Phelps.

But with the 525 millisecond presentation, the amygdala’s racial responsiveness was inhibited, meaning it didn’t take very long for another area in the brain to assume control. And that region was located predominately in the right hemisphere, confirming the work of Richeson. Cunningham noted: “the regions Richeson et al. identified as underlying the control of prejudice were nearly identical to the regions identified in this study as being associated with modulation of automatic evaluations”.

Here is where I get speculative:

When we meet another human, we automatically assess whether they are a threat or not. If we know them well or they look like someone we know (and like), they go into the “not a threat” category. If they don’t look familiar, they go into the “might be a threat” box, and your body begins preparing to run/fight for your life.

Your brain makes this assessment subconsciously and begins preparing your fight or flight response before your conscious networks have even kicked in. Your conscious networks appear to be trying to override your unconscious ones–perhaps by just rationally evaluating potential threat, or perhaps by yelling at your amygdala to stop being so racist.

I wouldn’t be surprised if this mental push-pull between the amygdala and the dorsolateral prefrontal cortex created more stress. 

Men seem to cope better than women with stress and aggression. They have a naturally higher aggression “set point” due to being descended from the men who killed all the other men. Aggression has historically been a winning strategy for men, but not women. Aggressive women, historically, were more likely to kill their own children or, if pregnant, get their children killed by someone else. Being the smallest, weakest person around makes aggression a losing strategy.

Personal anecdote time: In my younger, dumber days, I was a lot more aggressive than I am now. Not so much in real life, because men are bigger than me and I’m not dumb. But in the relative safety of the internet, certainly. Then I got pregnant. Suddenly, I couldn’t stand aggression. I remember watching a YouTube video of police aggression. My heart started racing. My palms were sweating. I was reacting as though the aggression were in the same room with me, not a recording on a little screen of something that happened hundreds of miles away. After that, I stopped watching TV News and stopped fighting with random strangers on the internet. I couldn’t take them anymore.

Aggression is useful for finding mates, because it gets people out of the house and helps them talk to each other. Sometimes it also results in punching.

Pregnant women have no need for aggression. They have already found a mate, and now they need to keep him. (Mates are very useful for bringing you food during that healing period after birth.) Further, pregnant women need to protect their fetuses (and later, babies.) The mother needs aggression only to save her own life or her child’s life.

School, work, corporations, and daily city life all involve being constantly around hundreds if not thousands of unrelated people. And as you probably already know, trust and diversity are negatively correlated. (Or just read the book.)

Corporations are stressful because they’re full of aggressive men, who interrupt more, take credit for other people’s accomplishments, are noisy, and use their physical size to intimidate each other. Women respond to this in a variety of ways you’re already familiar with, including the consumption of large quantities of Xanax to keep them from freaking out and having a meltdown every time a strange man gets into an elevator with them.


You know what? This… isn’t helping.

Neither are these:

Carmen Tarleton, white woman whose ex-husband doused her with lye and beat her with a baseball bat
Carmen Tarleton’s ex-husband, who will not be executed.
Still from Rihanna’s “empowering” music video about torturing a white woman for money

Anxiety exists because it helped our ancestors avoid dangerous situations, but modern life basically requires spending high amounts of time in anxiety-inducing situations. Some people eventually learn not to freak out and suppress their instincts, but for many people, repeated stimulus exposure only makes things worse.


But aside from preparing people to flee or fight, I suspect that anxiety serves another purpose: it forces women to do whatever it takes to remain part of the group, the tribe, because the tribe is survival, and outside the tribe is nothing but the howling wind and empty, barren waste. Female survival and evolutionary success has not historically depended on dominating the tribe, but on not getting kicked out.

Anxiety does not manifest itself as a rational response. Someone else does something wrong, you tell them not to, and afterward, you feel anxious. Objectively, you are in the right. The other person did something wrong. But your emotions tell a different story. Your emotions say that you are wrong. This is because you are not at peace with your tribe, with your friend or family member.

Or let us suppose that you say something innocently, even helpfully to another person, and they take it the wrong way and become angry and yell at you. Afterwards, do you feel mad at them? Or do you just feel unhappy that they are feeling so unhappy?

Okay, maybe not you, my faithful reader. You probably aren’t female.

Anxiety is one of those things that I suspect is good in moderation. A bit of concern for safety makes people pay attention as they go about their business. Double-checking that the locks are locked and the stove is off before going to bed could save your life. Being willing to put aside hurt feelings and make amends with others makes life more pleasant, and is probably crucial to living in large communities. Taken in excess, any of these behaviors becomes debilitating–the person develops agoraphobia, OCD, or pathological unwillingness to stick up for themselves.

A small amount of anxiety may also be useful in getting people to pay attention to little details. It’s making sure that all of the i’s are dotted and t’s are crossed that keeps airplanes in the air, after all.

Peter Frost has laid out a series of posts on guilt, and by contrast, shame. Now, here I must make a confession: I lack an intuitive sense of the distinction he is drawing between guilt and shame, or perhaps just lack sufficient exposure to “shame cultures” to really get it. Regardless, I don’t think it is too much of a stretch to suspect that “guilt” and “anxiety” may be deeply linked.

Frost proposes that, “Pervasive feelings of guilt are part of a behavioral package that enabled Northwest Europeans to adapt to complex social environments where kinship is less important and where rules of correct behavior must be obeyed with a minimum of surveillance.” 

While most commentators posit that the European guilt complex arose in response to specific events, e.g., the Holocaust, Frost traces it back to a much earlier time, citing, for example, Aelfric of Eynsham, an English abbot born in 955:

He who cannot because of shame confess his faults to one man, then it must shame him before the heaven-dwellers and the earth-dwellers and the hell-dwellers, and the shame for him will be endless. (Bedingfield, 2002, p. 80)

And The Song of Beowulf:

That was sorrow to the good man’s soul, greatest of griefs to the heart. The wise man thought that, breaking established law, he had bitterly angered God, the Lord everlasting. His breast was troubled within by dark thoughts, as was not his wont.

(Personally, I’ve always thought Grendel was a metaphor for plague, and Beowulf plunging into the lake represents a human sacrifice by drowning/throwing the sacrificed victim into the lake to appease the gods, but I am really not an Anglo-Saxon culture expert.)

Frost pushes back the potential beginnings of guilt culture even further, to the semi-sedentary Scandinavian/Baltic hunter-gatherer/fishing communities of 8,500 years ago. He suggests that in this environment, guilt made people cooperate, Prisoner’s Dilemma-style, and community sanctions against defectors kept them a low enough percent of the population that they couldn’t take advantage of the folks who felt a lot of guilt. Quoting Frost:

What is to stop some individuals from exploiting the guilt proneness of others while feeling no guilt themselves? This free-rider dilemma may have been resolved in part by identifying such individuals and ostracizing them. It may also be that these semi-sedentary communities were conducive to evolution of altruistic behavior, as described by Maynard Smith’s haystack model (Wikipedia, 2013). According to this model, guilt-prone individuals are at a disadvantage within any one community and will thus become fewer and fewer with each generation. If, however, a community has a high proportion of guilt-prone individuals, it will have an advantage over other communities and thus expand in numbers at their expense. And if these communities disperse and regroup on a regular basis, the overall proportion of guilt-prone individuals will increase over time. …

There is an obvious issue that arises if a guilt-ridden society suddenly obtains a large number of individuals who don’t buy into the whole guilt complex.

… it was the hunter-fisher-gatherers of the North Sea and the Baltic who led the way to behavioral modernity, i.e., individualism, reduced emphasis on kinship, and the market as the main organizing principle of social and economic life. Their mode of subsistence was not wiped out by agriculture, unless one sees fishing as a kind of farming. They not only survived, but also went on to create what we now call the Western World. Not bad for a bunch of losers.
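Frost’s haystack logic is easy to put into a toy model: defectors out-reproduce the guilt-prone within each group, but guilt-heavy groups out-produce other groups, and everyone periodically disperses and regroups. Here is a minimal deterministic sketch–all parameters invented for illustration, not taken from Maynard Smith or Frost:

```python
def haystack_step(p, b=0.5, c=0.05, seasons=5):
    """One disperse-regroup cycle. p = global frequency of guilt-prone types.
    b = group productivity bonus from cooperators; c = within-group cost of guilt."""
    # Groups founded by 2 random individuals: all-guilt, mixed, or no-guilt.
    groups = [(p * p, 1.0), (2 * p * (1 - p), 0.5), ((1 - p) ** 2, 0.0)]
    total, coop = 0.0, 0.0
    for weight, q in groups:
        # Within the group, guilt-free types out-reproduce each season:
        for _ in range(seasons):
            q = q * (1 - c) / (1 - c * q)
        output = weight * (1 + b * q)  # guilt-heavy groups produce more overall
        total += output
        coop += output * q
    return coop / total

p = 0.5
for _ in range(10):
    p = haystack_step(p)
print(f"guilt-prone share after 10 cycles: {p:.2f}")  # rises above 0.5
```

Within every mixed group the guilt-prone share falls each season, yet the global share rises–exactly the between-group effect the model relies on.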

The guilt complex is obviously deep in Christianity. My research so far has not revealed a similar guilt complex in other religions, though to be fair, Hinduism is vast and well beyond my understanding. IMO, some Christians take this guilt to an unhealthy level:

Self-flagellation, from the Wikipedia

The Wikipedia further claims:

Some members of strict monastic orders, and some members of the Catholic lay organization Opus Dei, practice mild self-flagellation using an instrument called a “discipline”, a cattail whip usually made of knotted cords, which is flung over the shoulders repeatedly during private prayer. Pope John Paul II took the discipline regularly.

The Wikipedia page on Flagellantism, a medieval religious movement, deserves reading in its own right, but I will try to quote a representative bit here:

The 11th-century zealot Dominicus Loricatus repeated the entire Psalter twenty times in one week, accompanying each psalm with a hundred lash-strokes to his back. … The movement did not have a central doctrine or overall leaders, but a popular passion for the movement occurred all over Europe in separate outbreaks. … The prime cause of the Perugia episode is unclear, but it followed an outbreak of an epidemic and chroniclers report how the mania spread throughout almost all the people of the city. Thousands of citizens gathered in great processions, singing and with crosses and banners, they marched throughout the city whipping themselves. … The movement spread across Northern Italy, up to 10,000 strong groups processing in Modena, Bologna, Reggio and Parma …

The German and Low Countries movement … established their camps in fields near towns and held their rituals twice a day. The ritual began with the reading of a letter, claimed to have been delivered by an angel and justifying the Flagellants’ activities. Next the followers would fall to their knees and scourge themselves, gesturing with their free hands to indicate their sin and striking themselves rhythmically to songs, known as Geisslerlieder, until blood flowed. Sometimes the blood was soaked up in rags and treated as a holy relic. … some towns began to notice that sometimes Flagellants brought plague to towns where it had not yet surfaced. Therefore later they were denied entry. They responded with increased physical penance.

The anchorites were early hermits/monks who were literally walled into tiny rooms they never left for the rest of their lives:

Medieval illustration of an anchorite cell (the original Tiny House)

Maybe if Xanax had existed in Medieval Europe, people would have been less prone to walling themselves up in churches.

Note that self-flagellation and anchoritism are not rational responses to life in Medieval Europe–not only do they not solve problems like the Black Death, they may have exacerbated them. They are extreme emotional responses to overwhelming feelings of guilt and anxiety.

Properly balanced, guilt and anxiety can prompt people to treat each other fairly and be attentive in their work. Unbalanced, the individual (or society) becomes unhinged. They start demanding that their own societies be destroyed because they must have done something wrong to have more advanced tech than other societies, or groveling for forgiveness for things they didn’t even do:

White woman begs forgiveness for slavery

Anxiety and guilt have their good sides. Society probably couldn’t exist without them. But they have to be in balance.

Whites like Goth and Metal because Whites are Depressives

On a global scale, poverty is probably a bigger predictor of suicide. But within the US there are some clear-looking racial differences in depression:

Actually, the interesting thing is just how non-suicidal blacks seem to be.

Yes, I know that suicide and depression aren’t the same thing. But I figure “depression” is kinda tricky to document accurately (Is he really depressed, or just kinda bummed?), whereas suicide statistics seem pretty reliable. And since whites and Asians probably have the best access to mental health care, the numbers probably aren’t being skewed by lack of Prozac among the poor.

I remember an article I read a year or two ago, but can’t find now, which found a correlation between depression and intelligence. More or less, the implication, as I interpreted it, was that “depression” is functionally a slowing-down of the brain, and during intellectual tasks, people who could slow down and concentrate performed better–thus, concentration and depression look rather similar.

There are other, additional possibilities: people from further north get depressed because it’s dark and cold all winter/as an adaptation to the winters, and so the Finns listen to a ton of Death Metal:


This came from Reddit, but I'm sure it's totally legit
Death Metal Bands Per Capita throughout the World

I don’t have a map for Goth music; does anyone listen to Goth anymore? Hot Topic seems to be doing fine at the mall.

Or maybe depression is an evolutionary adaptation to make people more peaceful and cooperative by internalizing their aggression instead of killing other people. Here the difference between whites and blacks seems like a point of evidence, since whites seem to kill themselves at higher rates than they kill others, while blacks kill others at higher rates than they kill themselves. Perhaps aggression/depression can be toggled on and off in some way, genetically or, in the case of folks with bi-polar, in a single individual.

Asians, I suspect, are also depressives, but have lower aggression than whites, so they don’t kill themselves very often. Also, I don’t know what kinds of music they like.


How Much anti-Psych Research is Funded by Guys who Think all Mental Illness is Caused by Dead Aliens?

And how much is just idiots?

“How the US Mental Health System Makes Natives Sick and Suicidal,” by David Walker.

Important backstory: once upon a time, I made some offhand comments about mental health/psychiatric drugs that accidentally influenced someone else to go off their medication, which began a downward spiral that ended with them in the hospital after attempting suicide. Several years later, you could still see the words “I suck” scarred into their skin.

There were obviously some other nasty things that had nothing to do with me before the attempt, but regardless, there’s an important lesson: don’t say stupid ass things about mental health shit you know nothing about.

Also, don’t take mental health advice from people who don’t know what they’re talking about.

In my entirely inadequate defense, I was young and very dumb. David Walker is neither–and he is being published by irresponsible people who ought to know better.

To be clear: I am not a psychiatrist. I’m a dumb person on the internet with opinions. I am going to do my very damn best to counteract even dumber ideas, but for god’s sakes, if you have mental health issues, consult with someone with actual expertise in the field.

Also, you know that few things bug me like watching science and logic be abused. So let’s get down to business:

This is one of those articles where SJW logic plus sketchy research (the sort I suspect originated with funding from guys trying to prove that all mental illness was caused by Galactic Overlord Xenu) combine to make a not-very-satisfying article. I suppose it is petty to complain that the piece didn’t flow well, but still, it irked.

Basically, to summarize: The Indian Health Service is evil because it uses standard psychiatric language and treatment–the exact same language and treatment everyone else in the country is getting–instead of filling its manuals with social-justice buzzwords like “colonization” and “historical trauma.” The article does not tell us how, exactly, inclusion of these buzzwords is supposed to actually change the practice of psychiatry–part of what made the piece frustrating on a technical level.

The author then makes a bunch of absolutist claims about standard depression treatment that range from the obviously false to matters of real debate in the field. Very few of his claims are based on what I’d call “settled science”–and if you’re going to make absolutist claims about medical matters, please, try to only say things that are actually settled.

The crux of Walker’s argument is a claim that anti-depressants actually kill people and decrease libido, so therefore the IHS is committing genocide by murdering Indians and preventing the births of new ones.

Ugh, when I put it like that, it sounds so obviously dumb.

Some actual quotes:

“In the last 40 years, certain English words and phrases have become more acceptable to indigenous scholars, thought leaders, and elders for describing shared Native experiences. They include genocide, cultural destruction, colonization, forced assimilation, loss of language, boarding school, termination, historical trauma and more general terms, such as racism, poverty, life expectancy, and educational barriers. There are many more.”

Historical trauma is horribly sad, of course, but as a cause for depression, I suspect it ranks pretty low. If historical trauma suffered by one’s ancestors results in continued difficulties several generations down the line, then the descendants of all traumatized groups ought to show similar effects. Most of Europe got pretty traumatized during WWII, but most of Europe seems to have recovered. Even the Jews, who practically invented modern psychiatry, use standard psychiatric models for talking about their depression without invoking the Holocaust. (Probably because depression rates are pretty low in Israel.)

But if you want to pursue this line of argument, you would need to show, first, that Indians are being diagnosed with depression (or other mental disorders) at a higher rate than the rest of the population; second, that a large share of the excess is actually due to long-term effects of historical trauma; and third, that some alternative method of treatment is more effective than the current one.

To be fair, I am sure there are many ways that psychiatry sucks or could be improved. I just prefer good arguments on the subject.

“…the agency’s behavioral health manual mentions psychiatrist and psychiatric 23 times, therapy 18 times, pharmacotherapy, medication, drugs, and prescription 16 times, and the word treatment, a whopping 89 times. But it only uses the word violence once, and you won’t find a single mention of genocide, cultural destruction, colonization, historical trauma, etc.—nor even racism, poverty, life expectancy or educational barriers.

It’s absolutely shocking that a government-issued psychiatry manual uses standard terms from the psychiatric field, like “medication” and “psychiatrist,” but doesn’t talk about particular left-wing political theories. It’s almost like the gov’t is trying to be responsible and follow accepted practice in the field or something. Of course, to SJWs, even medical care should be sacrificed on the altar of advancing the buzzword agenda.

“This federal agency doesn’t acknowledge the reality of oppression within the lives of Native people.”

and… so? I know it sucks to deal with people who don’t acknowledge what you’re going through. My own approach to such people is to avoid them. If you don’t like what the IHS has to offer, then offer something better. Start your own organization offering support to people suffering from historical trauma. If your system is superior, you’ll not only benefit thousands (perhaps millions!) of people, but probably become highly respected and well-off in the process. Even if you, personally, don’t have the resources to start such a project, surely someone does.

If you can’t do that, you can at least avoid the IHS if you don’t like them. No one is forcing you to go to them.

BTW, in case you are wondering what the IHS is, here’s what Wikipedia has to say about them:

“The Indian Health Service (IHS) is an operating division (OPDIV) within the U.S. Department of Health and Human Services (HHS). IHS is responsible for providing medical and public health services to members of federally recognized Tribes and Alaska Natives. … its goal is to raise their health status to the highest possible level. … IHS currently provides health services to approximately 1.8 million of the 3.3 million American Indians and Alaska Natives who belong to more than 557 federally recognized tribes in 35 states. The agency’s annual budget is about $4.3 billion (as of December 2011).”

Sounds nefarious. So who runs this evil agency of health?

“The IHS employs approximately 2,700 nurses, 900 physicians, 400 engineers, 500 pharmacists, and 300 dentists, as well as other health professionals totaling more than 15,000 in all. The Indian Health Service is one of two federal agencies mandated to use Indian Preference in hiring. This law requires the agency to give preference hiring to qualified Indian applicants before considering non-Indian candidates for positions. … The Indian Health Service is headed by Dr. Yvette Roubideaux, M.D., M.P.H., a member of the Rosebud Sioux in South Dakota.”

So… the IHS, run by Indians, is trying to genocide other Indians by giving them mental health care?

And maybe I’m missing something, but don’t you think Dr. Roubideaux has some idea about the historical oppression of her own people?

Then we get into some anti-Pfizer/Zoloft business:

“For about a decade, IHS has set as one of its goals the detection of Native depression. [How evil of them!] This has been done by seeking to widen use of the Patient Health Questionnaire-9 (PHQ-9), which asks patients to describe to what degree they feel discouraged, downhearted, tired, low appetite, unable to sleep, slow-moving, easily distracted or as though life is no longer worth living.

The PHQ-9 was developed in the 1990s for drug behemoth Pfizer Corporation by prominent psychiatrist and contract researcher Robert Spitzer and several others. Although it owns the copyright, Pfizer offers the PHQ-9 for free use by primary health care providers. Why so generous? Perhaps because Pfizer is a top manufacturer of psychiatric medications, including its flagship antidepressant Zoloft® which earned the company as much as $2.9 billion annually before it went generic in 2006.”

I agree that it is reasonable to be skeptical of companies trying to sell you things, but the mere fact that a company sells a product does not automatically render it evil. An umbrella company makes money if you buy umbrellas, but that doesn’t make it evil. Pfizer wants to promote its product, but it also wants to make sure that product gets prescribed properly.
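
For what it’s worth, the PHQ-9 itself is mechanically trivial: nine items, each self-rated 0 (“not at all”) to 3 (“nearly every day”), summed to a 0–27 total, with the conventional cutoffs at 5/10/15/20. A minimal sketch of the scoring logic (double-check the actual instrument before using it for anything real):

def phq9_score(answers):
    # answers: the patient's nine self-ratings, each 0 ("not at all") to 3 ("nearly every day")
    assert len(answers) == 9 and all(0 <= a <= 3 for a in answers)
    total = sum(answers)  # ranges 0-27
    if total >= 20:
        severity = "severe"
    elif total >= 15:
        severity = "moderately severe"
    elif total >= 10:
        severity = "moderate"  # 10+ is the usual "positive screen" threshold
    elif total >= 5:
        severity = "mild"
    else:
        severity = "minimal"
    return total, severity

# A patient answering "more than half the days" (2) to everything:
print(phq9_score([2] * 9))  # (18, 'moderately severe')

Note that there is no race-specific machinery anywhere in it–which matters for a claim we’ll get to below.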

” Even with the discovery that the drug can increase the risk of birth defects, 41 million prescriptions for Zoloft® were filled in 2013.”

Probably to people who weren’t pregnant.

“The DSM III-R created 110 new psychiatric labels, a number that had climbed by another 100 more by the time I started working at an IHS clinic in 2000.

Around that time, Pfizer, like many other big pharmaceutical corporations, was pouring millions of dollars into lavish marketing seminars disguised as “continuing education” on the uses of psychiatric medication for physicians and nurses with no mental health training.

… After this event, several primary care colleagues began touting their new expertise in mental health, and I was regularly advised that psychiatric medications were (obviously) the new “treatment of choice.” ”

Seriously, he’s claiming that psychiatric medications were the “new” “treatment of choice” in the year 2000? Zoloft was introduced in 1991. Prozac revolutionized the treatment of depression way back in 1987. Walker’s off by over a decade.

Now, as Scott Alexander says, beware the man of one study: you can visit Prozac and Zoloft’s Wikipedia pages yourself and read the debate about effectiveness.

Long story short, as I understand it: psychiatric medication is actually way cheaper than psychological therapy. If your primary care doctor can prescribe you Zoloft, then you can skip paying to see a psychiatrist altogether.

Back in the day, before we had much in the way of medication for anything, the preferred method for helping people cope with their problems was telling them that they secretly wanted to fuck their mothers. This sounds dumb, but it beats the shit out of locking mentally ill people up in asylums, where they tended to die hideously. Unfortunately, talking to people about their problems doesn’t seem to have worked all that well, though you could bill a ton for a half-hour session every week for forty years straight, or until the patient ran out of money.

Modern anti-depressant medications appear to actually work for people with moderate to severe depression, though last time I checked, medication combined with therapy/support had the best outcomes–if anything, I suspect a lot of people could use a lot more support in their lives.

I should clarify: when I say “work,” I don’t mean they cure the depression. Curing it has not been my observation of the depressed people I know, though maybe it happens for some people. What the medications do seem to do is lessen the severity of the depression, allowing the depressed person to function.

” Since those days, affixing the depression label to Native experience has become big business. IHS depends a great deal upon this activity—follow-up “medication management” encounters allow the agency to pull considerable extra revenue from Medicaid. One part of the federal government supplements funding for the other. That’s one reason it might be in the best interest of IHS to diagnose and treat depression, rather than acknowledge the emotional and behavioral difficulties resulting from chronic, intergenerational oppression.”

It’s totally awful of the US gov’t to give free medication and health care to people. Medically responsible follow-up to make sure patients are responding properly to their medication and not having awful side effects is especially evil. The government should totally cut that out. From now on, let’s cancel health services for the Native Peoples. That will totally end oppression.

Also, anyone who has ever paid an ounce of attention to anything the government does knows that expanding the IHS’s mandate to acknowledge the results of oppression would increase their funding, not decrease it.

Forgive me if it sounds a bit like Walker is actually trying to increase his pay.

“The most recent U.S. Public Health Service practice guidelines, which IHS primary care providers are required to use, states that “depression is a medical illness,” and in a nod to Big Pharma suppliers like Pfizer, serotonin-correcting medications (SSRIs) like Zoloft® “are frequently recommended as first-line antidepressant treatment options.” ”

My god, they use completely standard terminology and make factual statements about their field! Just like, IDK, all other mental healthcare providers in the country and throughout most of the developed world.

“This means IHS considers Native patients with a positive PHQ-9 screen to be mentally ill with depression.”

Dude, this means that patients of EVERY RACE with a positive PHQ-9 are mentally ill with depression. Seriously, it’s not like Pfizer issues a separate screening guide for different races. If I visit a shrink, I’m going to get the exact same questionnaires as you are.

Also, yes, depression is considered a mental illness, but Walker knows as well as I do that there’s a big difference between mentally ill with depression and, say, mentally ill with untreated schizophrenia.

“[For] instance, the biomedical theory IHS is still promoting is obsolete. After more than 50 years of research, there’s no valid Western science to back up this theory of depression (or any other psychiatric disorder besides dementia and intoxication). There’s no chemical imbalance to correct.”

Slate Star Codex did a very long and thorough takedown of this particular claim: simply put, Walker is full of shit and should be ashamed of himself. The “chemical imbalance” model of depression, while an oversimplification, is actually pretty darn accurate, mostly because your brain is full of chemicals. As Scott Alexander points out:

“And this starts to get into the next important point I want to bring up, which is chemical imbalance is a really broad idea.

Like, some of these articles seem to want to contrast the “discredited” chemical imbalance theory with up-and-coming “more sophisticated” theories based on hippocampal neurogenesis and neuroinflammation. Well, I have bad news for you. Hippocampal neurogenesis is heavily regulated by brain-derived neurotrophic factor, a chemical. Neuroinflammation is mediated by cytokines. Which are also chemicals. Do you think depression is caused by stress? The stress hormone cortisol is…a chemical. Do you think it’s entirely genetic? Genes code for proteins – chemicals again. Do you think it’s caused by poor diet? What exactly do you think food is made of?

One of the most important things about the “chemical imbalance model” is that it helps the patient (again quoting Scott):

” People come in with depression, and they think it means they’re lazy, or they don’t have enough willpower, or they’re bad people. Or else they don’t think it, but their families do: why can’t she just pull herself up with her own bootstraps, make a bit of an effort? Or: we were good parents, we did everything right, why is he still doing this? Doesn’t he love us?

And I could say: “Well, it’s complicated, but basically in people who are genetically predisposed, some sort of precipitating factor, which can be anything from a disruption in circadian rhythm to a stressful event that increases levels of cortisol to anything that activates the immune system into a pro-inflammatory mode, is going to trigger a bunch of different changes along metabolic pathways that shifts all of them into a different attractor state. This can involve the release of cytokines which cause neuroinflammation which shifts the balance between kynurenines and serotonin in the tryptophan pathway, or a decrease in secretion of brain-derived neurotrophic factor which inhibits hippocampal neurogenesis, and for some reason all of this also seems to elevate serotonin in the raphe nuclei but decrease it in the hippocampus, and probably other monoamines like dopamine and norepinephrine are involved as well, and of course we can’t forget the hypothalamopituitaryadrenocortical axis, although for all I know this is all total bunk and the real culprit is some other system that has downstream effects on all of these or just…”

Or I could say: “Fuck you, it’s a chemical imbalance.””

I’m going to quote Scott a little more:

“I’ve previously said we use talk of disease and biology to distinguish between things we can expect to respond to rational choice and social incentives and things that don’t. If I’m lying in bed because I’m sleepy, then yelling at me to get up will solve the problem, so we call sleepiness a natural state. If I’m lying in bed because I’m paralyzed, then yelling at me to get up won’t change anything, so we call paralysis a disease state. Talk of biology tells people to shut off their normal intuitive ways of modeling the world. Intuitively, if my son is refusing to go to work, it means I didn’t raise him very well and he doesn’t love me enough to help support the family. If I say “depression is a chemical imbalance”, well, that means that the problem is some sort of complicated science thing and I should stop using my “mirror neurons” and my social skills module to figure out where I went wrong or where he went wrong. …

“What “chemical imbalance” does for depression is try to force it down to this lower level, tell people to stop trying to use rational and emotional explanations for why their friend or family member is acting this way. It’s not a claim that nothing caused the chemical imbalance – maybe a recent breakup did – but if you try to use your normal social intuitions to determine why your friend or family member is behaving the way they are after the breakup, you’re going to get screwy results. …

“So this is my answer to the accusation that psychiatry erred in promoting the idea of a “chemical imbalance”. The idea that depression is a drop-dead simple serotonin deficiency was never taken seriously by mainstream psychiatry. The idea that depression was a complicated pattern of derangement in several different brain chemicals that may well be interacting with or downstream from other causes has always been taken seriously, and continues to be pretty plausible. Whatever depression is, it’s very likely it will involve chemicals in some way, and it’s useful to emphasize that fact in order to convince people to take depression seriously as something that is beyond the intuitively-modeled “free will” of the people suffering it. “Chemical imbalance” is probably no longer the best phrase for that because of the baggage it’s taken on, but the best phrase will probably be one that captures a lot of the same idea.”

Back to the article.

Walker states, ” Even psychiatrist Ronald Pies, editor-in-chief emeritus of Psychiatric Times, admitted “the ‘chemical imbalance’ notion was always a kind of urban legend.” ”

Oh, look, Dr. Pies was kind enough to actually comment on the article. You can scroll to the bottom to read his evisceration of Walker’s points–” …First, while I have indeed called the “chemical imbalance” explanation of mood disorders an “urban legend”—it was never a real theory propounded by well-informed psychiatrists—this in no way means that antidepressants are ineffective, harmful, or no better than “sugar pills.” The precise mechanism of action of antidepressants is not relevant to how effective they are, when the patient is properly diagnosed and carefully monitored. …

” Even Kirsch’s data (which have been roundly criticized if not discredited) found that antidepressants were more effective than the placebo condition for severe major depression. In a re-analysis of the United States Food and Drug Administration database studies previously analyzed by Kirsch et al, Vöhringer and Ghaemi concluded that antidepressant benefit is seen not only in severe depression but also in moderate (though not mild) depression. …

” While there is no clear evidence that antidepressants significantly reduce suicide rates, neither is there convincing evidence that they increase suicide rates.”

Here’s my own suspicion: depressed people on anti-depressants have highs and lows, just like everyone else, but because their medication can’t completely 100% cure them, sooner or later they end up feeling pretty damn shitty during a low point and start thinking about suicide or actually try it.

However, Pies notes that there are plenty of studies that have found that anti-depressants reduce a person’s overall risk of suicide.

In other words, Walker is, at best, completely misrepresenting the science to make his particular side sound like the established wisdom in the field when he is, in fact, on the minority side. That doesn’t guarantee that he’s wrong–it just means he is a liar.

And you know what I think about liars.

And you can probably imagine what I think about liars who lie in ways that might endanger the mental health of other people and cause them to commit suicide.

But wait, he keeps going:

“In an astonishing twist, researchers working with the World Health Organization (WHO) concluded that building more mental health services is a major factor in increasing the suicide rate. This finding may feel implausible, but it’s been repeated several times across large studies. WHO first studied suicide in relation to mental health systems in 100 countries in 2004, and then did so again in 2010, concluding that:

“[S]uicide rates… were increased in countries with mental health legislation, there was a significant positive correlation between suicide rates, and the percentage of the total health budget spent on mental health; and… suicide rates… were higher in countries with greater provision of mental health services, including the number of psychiatric beds, psychiatrists and psychiatric nurses, and the availability of training in mental health for primary care professionals.””

Do you know why I’ve been referring to Walker as “Walker” and not “Dr. Walker,” despite his apparent PhD? It’s because anyone who does not understand the difference between correlation and causation does not deserve a doctorate–or even a high school diploma–of any sort. Maybe people spend more on mental health because of suicides?
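
If the point needs spelling out, here’s a toy simulation (all numbers invented) in which mental healthcare spending actually reduces suicides, yet the cross-country correlation between spending and suicide rates still comes out strongly positive, simply because countries with worse underlying problems spend more:

import random
random.seed(0)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

spending, observed = [], []
for _ in range(100):  # 100 made-up countries
    baseline = random.uniform(5, 40)           # underlying suicide rate per 100k
    s = 0.1 * baseline + random.uniform(0, 1)  # worse problem -> more spending
    spending.append(s)
    observed.append(baseline - 3 * s)          # here, spending *reduces* suicides

print(pearson(spending, observed))  # comes out around +0.9 anyway

Correlating the two columns and declaring that the spending causes the suicides is exactly the mistake Walker is making.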

Oh, look, here’s the map he uses to support his claim:

This map has been confounding my attempt to claim that Finno-Scandians like death metal because they're depressives
Look at all those high-mental healthcare spending African countries!

I don’t know about you, but it looks to me like the former USSR, India/Bhutan/Nepal, Sub-Saharan Africa, Guyana, and Japan and the Koreas have the highest suicide rates in the world. Among these, all but Japan and S. Korea are either extremely poor (and so probably have little to no public spending on mental healthcare) or are former Soviet states–both less developed than their lower-suicide neighbors to the West and likely shaped by whatever it is about being a former Soviet country, rather than by their fabulous mental healthcare funding.

In other words, this map shows the opposite of what Walker claims it does.

Again, this doesn’t mean he’s necessarily wrong. It just means that the data on the subject is mixed and does not clearly support his case in the manner he claims.

” Despite what’s known about their significant limitations and scientific groundlessness, antidepressants are still valued by some people for creating “emotional numbness,” according to psychiatric researcher David Healy.”

So they don’t have any effects, but people keep using them for their… effects? Which is it? Do they work or not work?

And emotional numbness is a damn sight better than wanting to kill yourself. That Walker does not recognize this shows just how disconnected he is from the realities of life for many people struggling with depression.

“The side effect of antidepressants, however, in decreasing sexual energy (libido) is much stronger than this numbing effect—sexual disinterest or difficulty becoming aroused or achieving orgasm occurs in as many as 60 percent of consumers.”

Which, again, is still better than wanting to kill yourself. I hear death really puts a dent in your sex life.

However, I will note that this is a real side effect, and if you are taking anti-depressants and really can’t stand the mood kill (pardon the pun), talk to your doctor, because there’s always the possibility that a different medication will treat your depression without affecting your libido.

“A formal report on IHS internal “Suicide Surveillance” data issued by Great Lakes Inter-Tribal Epidemiology Center states the suicide rate for all U.S. adults currently hovers at 10 for every 100,000 people, while for the Native patients IHS tracked, the rate was 17 per 100,000. This rate varied widely across the regions IHS serves—in California it was 5.5, while in Alaska, 38.5.”

Interesting statistics. I’m guessing the difference between Alaska and California holds true for whites, too–I suspect it’s the long, cold, dark winters.
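
Just to put the quoted numbers side by side:

rates = {"US adults overall": 10.0, "IHS Native patients": 17.0,
         "IHS California": 5.5, "IHS Alaska": 38.5}  # suicides per 100,000
overall = rates["US adults overall"]
for name, r in rates.items():
    print(f"{name}: {r} per 100k ({r / overall:.1f}x the overall rate)")

Alaska’s Native rate is 7x California’s (38.5/5.5), while the Native-vs-everyone gap is only 1.7x; the regional spread dwarfs the ethnic one, which is why the long-dark-winters explanation tempts me.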

According to the American Foundation for Suicide Prevention,

“In 2013, the highest U.S. suicide rate (14.2) was among Whites and the second highest rate (11.7) was among American Indians and Alaska Natives (Figure 5). Much lower and roughly similar rates were found among Asians and Pacific Islanders (5.8), Blacks (5.4) and Hispanics (5.7).”

Their graph:

So much for that claim

Hey, do you know which American ethnic group also has a history of trauma and oppression? Besides the Jews. Black people.

If trauma and oppression leads to depression and suicide, then the black suicide rate ought to be closer to the Indian suicide rate, and the white rate ought to be down at the bottom.

I guess this is a point in favor of my “whites are depressive” theory, though.

Also, “In 2013, nine U.S. states, all in the West, had age-adjusted suicide rates in excess of 18: Montana (23.7), Alaska (23.1), Utah (21.4), Wyoming (21.4), New Mexico (20.3), Idaho (19.2), Nevada (18.2), Colorado (18.5), and South Dakota (18.2). Five locales had age-adjusted suicide rates lower than 9 per 100,000: District of Columbia (5.8), New Jersey (8.0), New York (8.1), Massachusetts (8.2), and Connecticut (8.7).”

I'd like to see this map compared to a map of white violence rates
States by suicide rate

Hrm, looks like there’s also a guns and impulsivity/violence correlation–I think the West was generally settled by more violent, impulsive whites who liked the rough-and-tumble lifestyle, and where there are guns, people kill themselves with them.

I bet CA has some restrictive gun laws and some extensive mental health services.

You know what the dark blue doesn’t look like it correlates with?

Healthcare funding.

Back to Walker. “Nearly one in four of these suicidal medication overdoses used psychiatric medications. The majority of these medications originated through the Indian Health Service itself and included amphetamine and stimulants, tricyclic and other antidepressants, sedatives, benzodiazepines, and barbiturates.”

Shockingly, people diagnosed with depression sometimes try to commit suicide.

Wait, aren’t amphetamines and “stimulants” used primarily for treating conditions like ADHD or to help people stay awake, not depression? And aren’t sedatives, benzos, and barbiturates used primarily for things like anxiety and pain relief? I don’t think these were the drugs Walker is looking for.

” What’s truly remarkable is that this is not the first time the mental health movement in Indian Country has helped to destroy Native people. Today’s making of a Mentally Ill Indian to “treat” is just a variation on an old idea, … The Native mental health system has been a tool of cultural genocide for over 175 years—seven generations. Long before there was this Mentally Ill Indian to treat, this movement was busy creating and perpetuating the Crazy Indian, the Dumb Indian, and the Drunken Indian.”

Walker’s depiction of the past may be accurate. His depiction of the present sounds like total nonsense.

” We must make peace with the fabled Firewater Myth, a false tale of heightened susceptibility to alcoholism and substances that even Native people sometimes tell themselves.”

The fuck? Of course Indians are more susceptible to alcoholism than non-Indians–everyone on earth whose ancestors haven’t had a long exposure to wheat (and thus to alcohol) tends to handle alcohol badly. Hell, the Scottish are more susceptible to alcoholism than, say, the Greeks:

Alcohol-related death rates by country

Some people just have trouble with alcohol. Like the Russians.


Look, I don’t know if the IHS does a good job. Maybe its employees are poorly-trained, abrasive pharmaceutical shills who diagnose everyone who comes through their doors with depression and then prescribe them massive quantities of barbiturates.

And it could well be that the American psychiatric establishment is doing all sorts of things wrong.

But the things Walker cites in the article don’t indicate anything of the sort.

And for goodness sakes, if you’re depressed or have any other mental health problem, get advice from someone who actually knows what they’re talking about.

Hey, DNA: What is it good for?

So why do we still have bits of Neanderthal DNA hanging around after so many years? Of course it could just be random junk, but it’s more fun to think that it might be useful.

And the obvious useful thing for it to do is climate adaptation, since Neanderthals had been living in dark, cold, ice-age Europe for much longer than the newly-arrived H. sapiens, and so might have had some adaptations to help deal with it.

Okay, so here is something related I was reading the other day, which I consider pretty interesting. It looks like the people who live up on the Tibetan Plateau (like the Tibetans) are really well-adapted to the altitude. No mean feat, considering that other populations who live at similar altitudes don’t seem to be as well-adapted, despite living up there for similar lengths of time.

Well, now it appears that the Tibetans have actually been living in Tibet for waaaay longer than expected, because the original H. sapiens who moved into Tibet intermarried with archaic hominids who had already lived there for hundreds of thousands of years, and so probably picked up their altitude adaptations from those guys.

BTW, “species” is a social construct and you probably shouldn’t bother with it here.

So what kind of useful stuff might we have picked up from Neanderthals?

First I’d like to interject that I still find declarations of “aha, we got this gene from Neanderthals and it does this!” to be speculative and prone to change. All of the articles I’ve read tend to report the same list of stuff in a similar fashion, so I suspect they’re all working off one or two sources, which makes everything doubly sketchy. So we’re going in here with a big “if this is true”…

Some of the results are fairly boring, like Neanderthal DNA affecting hair and skin. We already speculate that skin tone helps us deal with sunlight levels, so that’s sensible.

More interesting is the claim that Neanderthal DNA may predispose people to Type-2 Diabetes and depression.

Now why the hell would it do that? It’s probably not *just* random–after all, large stretches of DNA have little to no Neanderthal admixture at all, suggesting that genes in those spots just weren’t useful, so why would we have retained such apparently negative traits?

Maybe, like sickle cell anemia, these things actually have a positive function–at least in the right environments.

I read a fascinating theory a few years ago that Type 2 Diabetes and Seasonal Affective Disorder are actually just part of our bodies’ natural mechanisms for dealing with winter. Basically, you’re supposed to eat plants and get fat all summer long, while plants are available, and then by winter, your ability to absorb more glucose shuts down (there’s no point since the plants are all dead) and you switch over to burning ketones instead and eating an all-mammoth diet.

(Some groups, like the Inuit and Masai, historically survived–and may still survive–on diets that included virtually no plants, and so ran all of their cellular energy needs through the ketogenic instead of the glucose system.)

During this winter time, humans, like other animals, slowed down and semi-hibernated to save energy and because why the fuck not, it’s dark and no one has invented lightbulbs, yet.

By spring, you’ve lost a lot of weight, the plants come back, and so does your ability to use glucose.

This theory is laid out in the book Lights Out by T. S. Wiley, if you’re curious. I thought it was a really interesting book, but you might just think it’s all crankery, I dunno.

Anyway, a big hole in Wiley’s plot is how we actually got this adaptation in the first place, since it’s a pretty complicated one and H. sapiens hasn’t actually been living in places with winter for all that long. Wiley just claims that it’s a deep internal mechanism that animals have, which always struck me as kinda bunk: why would a species that evolved in Africa, from other animals in Africa, going back millions upon millions of years, still have some complicated system like this functional in its genome? A trait that is not undergoing positive selective pressure is probably going to become non-functional pretty quickly. But the theory was cool enough otherwise to ignore this bit, so I’ve kept it around.

Right, so here’s the (potential) answer: H. sapiens didn’t have this adaptation hiding deep inside of them–Neanderthals had it. Neanderthals had been living in cold places for way, way longer than H. sapiens, and by inter-breeding with them, we got to take advantage of a bunch of cold-weather adaptations they’d developed over that time frame–thus getting a jump-start on evolving to cope with the weather.

At any rate, if Wiley is correct, and SAD and Type-2 Diabetes are actually part of a dealing-with-winter complex that benefited our cold-weather ancestors, then that would explain why these genes have persisted over the years instead of being bred out.

An easy way to test this would be to compare rates of Type-2 Diabetes and SAD among African immigrants to Europe and other wintery latitudes, African Americans (who have a small amount of Euro admixture), and Europeans. (Watching out, of course, for Vitamin D issues.) If the Euros have more SAD and Type-2 Diabetes than Africans living at the same latitude, then those would appear to be adaptations to the latitude. If the Africans have more, then my theory fails.
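
In sketch form, that comparison is just a two-proportion test. The counts below are entirely hypothetical–they exist only to show the shape of the test, not to claim any actual prevalences:

from math import sqrt, erf

def two_prop_z(k1, n1, k2, n2):
    # Two-proportion z-test: does group 1's prevalence differ from group 2's?
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)  # pooled prevalence under the null
    z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    p_two_sided = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_two_sided

# Hypothetical: SAD diagnoses among 2,000 Europeans vs. 2,000 recent African
# immigrants at the same latitude. My theory predicts z > 0 (Euros higher);
# if the immigrants' rate comes out higher instead, the theory fails.
print(two_prop_z(180, 2000, 120, 2000))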

Genetic Aristotelian Moderation

I suspect a lot of genetic traits (given that many involve the complex interaction of many different genes) are such that having a little bit of the trait is advantageous, but having too much (or conversely, too little) is negative.

A few obvious examples (a toy model of the general pattern follows them):

Aggression: too much, and you go to jail. Historically, prison conditions were awful enough in the West that this likely exerted an upper bound on the population’s criminality by eliminating violent people from the gene pool.

But too little, and you get taken advantage of. You can’t compete in job interviews, get promoted, make friends, or ask people out on dates. Aggressive people take your stuff, and you can’t protect against them.

From getting jobs to getting mates to not being mugged, a little bit of aggression is clearly a good thing.

Intelligence: High IQ is tremendously maladaptive in modern society. (This may always have been true.) The upper end of the IQ curve basically does not have children. (On both micro and macro levels.) I’m not prepared to say whether this is a bug or a feature.

But low IQ appears to be maladaptive, too. This was certainly true historically in the West, where extremely high death rates and intense resource competition left the dumber members of society with few surviving offspring. Dumb people just have trouble accomplishing the things necessary for raising lots of children.

Somewhat above average IQ appears to be the most adaptive, at least in the present and possibly historically.

Height: Really tall men have health problems and die young. Really short men are considered undateable and so don’t have children. So the pressure is to be tall, but not too tall.

(Speculatively) Depression: Too much depression, and you commit suicide. Not enough, and you’re a happy-go-lucky person who drops out of school and punches people. Just enough, and you work hard, stay true to your spouse, don’t get into fights, and manage to do all of the boring stuff required by Western society. (Note: this could have changed in the past hundred years.)

Sickle Cell Anemia: I don’t think I need to explain this one.

(Also speculative) Tay-Sachs: Tay-Sachs is a horrible neurological disease that shows up in populations with evidence of very high recent pressure to increase IQ, such as the Ashkenazim (one of the world’s highest-IQ groups) and the Quebecois. There is therefore speculation that in its heterozygous form, Tay-Sachs may enhance neural development, instead of killing you hideously.
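
And here is the toy model promised above: a trait under an inverted-U fitness curve, where selection pulls the population mean toward the intermediate optimum rather than toward either extreme. All of the numbers are invented for illustration:

import random
random.seed(1)

OPTIMUM = 0.0  # fitness peaks at an intermediate trait value

def fitness(trait):
    # inverted U: best at the optimum, zero once you are far enough out
    return max(0.0, 1.0 - (trait - OPTIMUM) ** 2 / 9.0)

pop = [random.gauss(2.5, 1.0) for _ in range(5000)]  # start well off-optimum
for _ in range(30):  # 30 generations
    # fitter individuals leave proportionally more offspring
    parents = random.choices(pop, weights=[fitness(t) for t in pop], k=len(pop))
    pop = [t + random.gauss(0, 0.1) for t in parents]  # a little mutation noise

print(round(sum(pop) / len(pop), 2))  # the mean ends up near the optimum

Put the optimum anywhere you like; the population settles there, not at “more is always better.”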

Sickle Cell Anemia Metaphor for Depression

Depression and suicide have counterintuitive distributions–countries with things like low crime rates, social equality, and plenty of food tend to have really high rates, while poor, violent countries seem to be quite happy.

Latin American countries, for example, score quite high on happiness surveys, despite being some of the world’s most violent places.

By contrast, the Japanese and Scandinavians have some of the world’s highest rates of suicide.

When something doesn’t make sense, try inverting it: Why might it be useful to be depressed?

I posit that in societies where delaying gratification, working hard, and tolerating high densities of people without getting into fights are prerequisites to reproducing (which has historically been true of China, Japan, and the West), mild to sub-clinical levels of depression helped people succeed.

(Remember, the phenomenon of most orphans and illegitimate children surviving infancy is only about a hundred years old. Historically, these children almost all died.)

This is where I draw an analogy to Sickle Cell Anemia. With SCA: no sickle-cell allele => you get malaria. One sickle-cell allele => you’re not as healthy, but you’re protected against malaria. Two sickle-cell alleles => you die.

With depression: no depression trait => fun, risky behaviors => you never get a farm and die without any surviving offspring. One depression trait => you’re not quite sure about this “fun” business => you work hard, get a farm, and have children. Two depression traits => suicide.

(Obviously depression need not be caused by a mere one or two genes for the idea to hold.)
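
The analogy can be made exact with the standard one-locus selection recursion from population genetics. The fitness numbers below are invented for illustration, not measured:

# A = normal allele, S = sickle-cell allele; fitnesses of the three genotypes:
w_AA, w_AS, w_SS = 0.85, 1.00, 0.20  # AA risks malaria; AS is protected; SS is the disease

p = 0.01  # starting frequency of S
for _ in range(200):  # 200 generations
    q = 1 - p
    w_bar = q * q * w_AA + 2 * p * q * w_AS + p * p * w_SS  # mean fitness
    p = (p * p * w_SS + p * q * w_AS) / w_bar               # S frequency after selection

# Selection holds S at a stable intermediate frequency instead of purging it:
print(round(p, 3))  # matches the textbook equilibrium (w_AS - w_AA)/(2*w_AS - w_AA - w_SS) = 0.158

If mild depressiveness ever worked like the AS genotype–costly in double dose, useful in single dose–the same math would keep it in the population indefinitely.
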
Seems like the question for Utilitarians becomes, “Is there a way to make people productive, non-violent, and happy, all at the same time?”