Time Preference: the most under-appreciated mental trait

Time Preference isn’t sexy and exciting, like anything related to, well, sex. It isn’t controversial like IQ and gender. In fact, most of the ink spilled on the subject isn’t even found in evolutionary biology or evolutionary psychology texts, but over in economics papers about things like interest rates that no one but economists would want to read.

So why do I think Time Preference is so important?

Because I think Low Time Preference is the true root of high intelligence.

First, what is Time Preference?

Time Preference (aka future time orientation, time discounting, delay discounting, temporal discounting) is the degree to which you value having a particular item today versus having it tomorrow. “High time preference” means you want things right now, whereas “low time preference” means you’re willing to wait.

A relatively famous test of Time Preference is to offer a child a cookie right now, but tell them they can have two cookies if they wait 10 minutes. Some children take the cookie right now, some wait ten minutes, and some try to wait ten minutes but succumb to the cookie right now about halfway through.
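Behavioral economists usually model this trade-off with hyperbolic discounting: the subjective value of a reward A delivered after delay D is V = A/(1 + kD), where k is the individual’s discount rate. A minimal sketch–the k values below are invented for illustration, not estimates from any study:

```python
# Hyperbolic delay discounting: V = A / (1 + k*D).
# Higher k = higher time preference (stronger pull toward "right now").

def discounted_value(amount, delay, k):
    """Subjective present value of `amount` received after `delay` minutes."""
    return amount / (1 + k * delay)

def chooses_to_wait(now_amount, later_amount, delay, k):
    """True if the delayed reward is subjectively worth more than the immediate one."""
    return discounted_value(later_amount, delay, k) > now_amount

# One cookie now vs. two cookies in 10 minutes:
patient_child = chooses_to_wait(1, 2, 10, k=0.05)   # low time preference
impulsive_child = chooses_to_wait(1, 2, 10, k=0.5)  # high time preference
print(patient_child, impulsive_child)
```

A child with a low k waits out the ten minutes for the second cookie; a child with a high enough k grabs the one in front of them.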

Obviously, many factors can influence your Time Preference–if you haven’t eaten in several days, for example, you’ll probably not only eat the cookie right away, but also start punching me until I give you the second cookie. If you don’t like cookies, you’ll have no trouble waiting for another, but you won’t have much use for it, either. Etc. But all of these things held equal, your basic inclination toward high or low time preference is probably biological–and by “biological,” I mean “mostly genetic.”

Luckily for us, scientists have actually discovered where to break your brain to destroy your Time Preference, which means we can figure out how it works.

The scientists train rats to touch pictures with their noses in return for sugar cubes. Picture A gives them one cube right away, while picture B gives them more cubes after a delay. If the delay is too long or the reward too small, the rats just take the one cube right away. But there’s a sweet spot–apparently 4 cubes after a short wait—where the rats will figure it’s worth their while to tap picture B instead of picture A.

But if you snip the connection between the rats’ hippocampi and nucleus accumbenses, suddenly they lose all ability to wait for sugar cubes and just eat their sugar cubes right now, like a pack of golden retrievers in a room full of squeaky toys. They become completely unable to wait for the better payout of four sugar cubes, no matter how much they might want to.

So we know that this connection between the hippocampus and the nucleus accumbens is vitally important to your Time Orientation, though I don’t know what other modifications, such as low hippocampal volume or low nucleus accumbens volume, would do.

So what do the hippocampus and nucleus accumbens do?

According to the Wikipedia, the hippocampus plays an important part in inhibition, memory, and spatial orientation. People with damaged hippocampi become amnesiacs, unable to form new memories. There is a pretty direct relationship between hippocampus size and memory, as documented primarily in old people:

“There is, however, a reliable relationship between the size of the hippocampus and memory performance — meaning that not all elderly people show hippocampal shrinkage, but those who do tend to perform less well on some memory tasks.[71] There are also reports that memory tasks tend to produce less hippocampal activation in elderly than in young subjects.[71] Furthermore, a randomized-control study published in 2011 found that aerobic exercise could increase the size of the hippocampus in adults aged 55 to 80 and also improve spatial memory.” (Wikipedia)

Amnesiacs (and Alzheimer’s patients) also get lost a lot, which seems like a perfectly natural side effect of not being able to remember where you are, except that rat experiments show something even more interesting: specific cells that light up as the rats move around, encoding data about where they are.

“Neural activity sampled from 30 to 40 randomly chosen place cells carries enough information to allow a rat’s location to be reconstructed with high confidence.” (Wikipedia)

"Spatial firing patterns of 8 place cells recorded from the CA1 layer of a rat. The rat ran back and forth along an elevated track, stopping at each end to eat a small food reward. Dots indicate positions where action potentials were recorded, with color indicating which neuron emitted that action potential." (from Wikipedia)
“Spatial firing patterns of 8 place cells recorded from the CA1 layer of a rat. The rat ran back and forth along an elevated track, stopping at each end to eat a small food reward. Dots indicate positions where action potentials were recorded, with color indicating which neuron emitted that action potential.” (from Wikipedia)

According to Wikipedia, the Inhibition function theory is a little older, but seems like a perfectly reasonable theory to me.

“[Inhibition function theory] derived much of its justification from two observations: first, that animals with hippocampal damage tend to be hyperactive; second, that animals with hippocampal damage often have difficulty learning to inhibit responses that they have previously been taught, especially if the response requires remaining quiet as in a passive avoidance test.”

This is, of course, exactly what the scientists found when they separated the rats’ hippocampi from their nucleus accumbenses–they lost all ability to inhibit their impulses in order to delay gratification, even for a better payout.

In other words, the hippocampus lets you learn, process the movement of objects through space (spatial reasoning), and helps you inhibit your impulses–that is, it is directly involved in both IQ and Time Preference.

 

So what is the Nucleus Accumbens?

According to Wikipedia:

“As a whole, the nucleus accumbens has a significant role in the cognitive processing of aversion, motivation, pleasure, reward and reinforcement learning;[5][6][7] hence, it has a significant role in addiction.[6][7] It plays a lesser role in processing fear (a form of aversion), impulsivity, and the placebo effect.[8][9][10] It is involved in the encoding of new motor programs as well.[6]

Dopaminergic input from the VTA modulate the activity of neurons within the nucleus accumbens. These neurons are activated directly or indirectly by euphoriant drugs (e.g., amphetamine, opiates, etc.) and by participating in rewarding experiences (e.g., sex, music, exercise, etc.).[11][12] …

The shell of the nucleus accumbens is involved in the cognitive processing of motivational salience (wanting) as well as reward perception and positive reinforcement effects.[6] Particularly important are the effects of drug and naturally rewarding stimuli on the NAc shell because these effects are related to addiction.[6] Addictive drugs have a larger effect on dopamine release in the shell than in the core.[6] The specific subset of ventral tegmental area projection neurons that synapse onto the D1-type medium spiny neurons in the shell are responsible for the immediate perception of the rewarding property of a stimulus (e.g., drug reward).[3][4] …

The nucleus accumbens core is involved in the cognitive processing of motor function related to reward and reinforcement.[6] Specifically, the core encodes new motor programs which facilitate the acquisition of a given reward in the future.[6]”

So it sounds to me like the point of the nucleus accumbens is to learn “That was awesome! Let’s do it again!” or “That was bad! Let’s not do it again!”

Together, the nucleus accumbens + hippocampus can learn “4 sugar cubes in a few seconds is way better than 1 sugar cube right now.” Apart, the nucleus accumbens just says, “Sugar cubes! Sugar cubes! Sugar cubes!” and jams the lever that says “Sugar cube right now!” and there is nothing the hippocampus can do about it.

 

What distinguishes humans from all other animals? Our big brains, intellects, or impressive vocabularies?

It is our ability to acquire new knowledge and use it to plan and build complex, multi-generational societies.

Ants and bees live in complex societies, but they do not plan them. Monkeys, dolphins, squirrels, and even rats can plan for the future, but only humans plan and build cities.

Even the hunter-gatherer must plan for the future; a small tendril only a few inches high is noted during the wet season, then returned to in the dry, when it is little more than a withered stem, and the water-storing root beneath it harvested. The farmer facing winter stores up grain and wood; the city engineer plans a water and sewer system large enough to handle the next hundred years’ projected growth.

All of these activities require the interaction between the hippocampus and nucleus accumbens. The nucleus accumbens tells us that water is good, grain is tasty, fire is warm, and that clean drinking water and flushable toilets are awesome. The hippocampus reminds us that the dry season is coming, and so we should save–and remember–that root until we need it. It reminds us that we will be cold and hungry in winter if we don’t save our grain and spend hours and hours chopping wood right now. It reminds us that not only is it good to organize the city so that everyone can have clean drinking water and flushable toilets right now, but that we should also make sure the system will keep working even as new people enter the city over time.

Disconnect these two, and your ability to plan goes down the drain. You eat all of your roots now, devour your seed corn, refuse to chop wood, and say, well, yes, running water would be nice, but that would require so much planning.

 

As I have mentioned before, I think European IQ (and probably that of a few other groups whose history I’m just not as familiar with, so I cannot comment on them) increased quite a bit in the past thousand years or so, and not just because the Catholic Church banned cousin marriage. During this time, manorialism became a big deal throughout Western Europe, and the people who exhibited good impulse control, worked hard, delayed gratification, and were able to accurately calculate the long-term effects of their actions tended to succeed (that is, have lots of children) and pass on their clever traits to their children. I suspect that selective pressure to “be a good manorial employee” was particularly strong in Germany (and possibly Japan, now that I think about it), resulting in the Germanic rigidity that makes them such good engineers.

Nothing in the manorial environment directly selected for engineering ability, higher math, large vocabularies, or really anything that we mean when we normally talk about IQ. But I do expect manorial life to select for those who could control their impulses and plan for the future, resulting in a run-away effect of increasingly clever people constructing increasingly complex societies in which people had to be increasingly good at dealing with complexity and planning to survive.

Ultimately, I see pure mathematical ability as a side effect of being able to accurately predict the effects of one’s actions and plan for the future (eg, “It will be an extra long winter, so I will need extra bushels of corn,”) and the ability to plan for the future as a side effect of being able to accurately represent the path of objects through space and remember lessons one has learned. All of these things, ultimately, are the same operations, just oriented differently through the space-time continuum.

Since your brain is, of course, built from the same DNA code as the rest of you, we would expect brain functions to have some amount of genetic heritability, which is exactly what we find:

Source: The Heritability of Impulse Control; Genetic and environmental influences on impulsivity: a meta-analysis of twin, family and adoption studies

“A meta-analysis of twin, family and adoption studies was conducted to estimate the magnitude of genetic and environmental influences on impulsivity. The best fitting model for 41 key studies (58 independent samples from 14 month old infants to adults; N=27,147) included equal proportions of variance due to genetic (0.50) and non-shared environmental (0.50) influences, with genetic effects being both additive (0.38) and non-additive (0.12). Shared environmental effects were unimportant in explaining individual differences in impulsivity. Age, sex, and study design (twin vs. adoption) were all significant moderators of the magnitude of genetic and environmental influences on impulsivity. The relative contribution of genetic effects (broad sense heritability) and unique environmental effects were also found to be important throughout development from childhood to adulthood. Total genetic effects were found to be important for all ages, but appeared to be strongest in children. Analyses also demonstrated that genetic effects appeared to be stronger in males than in females.”
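The abstract’s variance components can be tallied directly; this sketch just restates the paper’s point estimates (0.38 additive + 0.12 non-additive genetic, 0.50 non-shared environment, ~0 shared environment) to show how they sum:

```python
# Variance decomposition of impulsivity from the meta-analysis quoted above.
# Broad-sense heritability = additive + non-additive genetic variance.
additive_genetic = 0.38        # A: additive genetic effects
non_additive_genetic = 0.12    # D: dominance/epistatic effects
shared_environment = 0.00      # C: family/household effects (negligible)
non_shared_environment = 0.50  # E: unique environment + measurement error

broad_heritability = additive_genetic + non_additive_genetic
total_variance = broad_heritability + shared_environment + non_shared_environment
print(round(broad_heritability, 2), round(total_variance, 2))
```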

 

“Shared environmental effects” in a study like this means “the environment you and your siblings grew up in, like your household and school.” In this case, shared effects were unimportant–that means that parenting had no effect on the impulsivity of adopted children raised together in the same household. Non-shared environmental influences are basically random–you bumped your head as a kid, your mom drank during pregnancy, you were really hungry or pissed off during the test, etc., and maybe even cultural norms.

So your ability to plan for the future appears to be part genetic, and part random luck.

Somali Autism

Approximately one in 88 children has been diagnosed with autism, but in Minnesota, one in 32 Somali children and one in 36 white children have the condition.

A recent study–the Minneapolis Somali Autism Spectrum Disorder Project–reviewed the diagnosis paperwork to make sure the autism diagnoses were accurate, and concluded that they are. They did not go interviewing kids in search of symptoms, just looked at the records of people who’d already been diagnosed.

According to the NY Times, “But the Somali children were less likely than the whites to be “high-functioning” and more likely to have I.Q.s below 70. (The average I.Q. score is 100.) The study offered no explanation of the statistics.”

Well that one seems obvious: average Somali IQ is probably below 70.

[Map: Average IQ by Country, from Memolition–supply your own map if you feel like it]

Also, “While some children back home had the same problems children everywhere do, parents said, autism was so unfamiliar that there was no Somali word for it until “otismo” was coined in Minnesota.”

You might think it’s just something in the Minneapolis water supply, but another study, this one from Sweden, found something similar:

Children of migrant parents were at increased risk of low-functioning autism (odds ratio (OR) = 1.5, 95% CI 1.3-1.7); this risk was highest when parents migrated from regions with a low human development index, and peaked when migration occurred around pregnancy (OR = 2.3, 95% CI 1.7-3.0). A decreased risk of high-functioning autism was observed in children of migrant parents, regardless of area of origin or timing of migration. … Environmental factors associated with migration may contribute to the development of autism presenting with comorbid intellectual disability, especially when acting in utero. High- and low-functioning autism may have partly different aetiologies, and should be studied separately.

So what’s up with white kids with autism? Did they get screwed by migration, too?

There is one thing that Minneapolis and Sweden do have in common: lack of sunlight. Somalis may be particularly at risk of Vitamin D deficiency, or some other disorder caused by differences in the night-day cycle at different latitudes.

But again, whites have similar rates of autism despite having had thousands of years to adjust to high-latitude winters, while African Americans, who ought to be more similar to the Somalis in their winter/light adaptations, have much lower rates.

In fact, I can’t really think of anything that whites in Minnesota and Somalis in Minnesota might have in common that they wouldn’t also have in common with African Americans in Minnesota. Or Sweden.

The obvious solution is that Somali autism might just be caused by totally different stuff than white autism. Perhaps migration itself caused the high Somali autism rates, or the stress and trauma of war and dislocation. Or it could have something to do with the Somali preference for cousin marriage, but perhaps the autistic kids never got noticed back in Somalia because of high infant mortality rates.

The Insidious Approach of Death

A friend recently attended their 20th highschool reunion, the sort of event that makes one feel very old. Worse, three of their classmates have already died.

I thought that sounded way statistically unlikely, especially given the group’s demographics, but I ran the numbers, and it turns out that it’s only a little unlikely. Given the small N, we’re probably talking about random chance making the class unlucky rather than a particular propensity for death, but it’s unfortunate either way.
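For the curious, the calculation is a simple binomial tail. This sketch uses invented inputs–a class of 150 and an average annual mortality rate of 0.1% over ages 18 to 38–not the actual demographics of the class:

```python
from math import comb

def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of k or more deaths."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_classmates = 150                 # assumed class size
p_death = 1 - (1 - 0.001)**20      # cumulative 20-year risk at 0.1%/year
print(round(prob_at_least(3, n_classmates, p_death), 2))
```

With these made-up inputs, three or more deaths turns out to be unremarkable; shrink the class size or the mortality rate and it quickly becomes unlikely.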

Highschool reunions are also a great (by which I mean depressing) opportunity to see who has aged the most. Some classmates look hardly older than the last time you saw them, while others look like they got hit by a semi full of old. Hopefully not you, of course.

In “Quantification of biological aging in young adults,” Belsky et al. confirm what I’ve long suspected: that different people age at radically different rates, not just emotionally/mentally, but also biologically.

From the abstract: “We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their “biological aging” (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older.” (bold mine.)

“We scaled the Pace of Aging so that the central tendency in the cohort indicates 1 y of physiological change for every one chronological year. On this scale, cohort members ranged in their Pace of Aging from near 0 y of physiological change per chronological year to nearly 3 y of physiological change per chronological year.”

“Study members with advanced Biological Age performed less well on objective tests of physical functioning at age 38 than biologically younger peers (Fig. 5). They had more difficulty with balance and motor tests (for unipedal stance test of balance, r = −0.22, P < 0.001; for grooved pegboard test of fine motor coordination, r = −0.13, P < 0.001), and they were not as strong (grip strength test, r = −0.19, P < 0.001).”

“Study members with older Biological Ages had poorer cognitive functioning at midlife (r = −0.17, P < 0.001). Moreover, this difference in cognitive functioning reflected actual cognitive decline over the years. When we compared age-38 IQ test scores to baseline test scores from childhood, study members with older Biological Age showed a decline in cognitive performance net of their baseline level (r = −0.09, P = 0.010).”

“Neurologists have also begun to use high-resolution 2D photographs of the retina to evaluate age-related loss of integrity of blood vessels within the brain. Retinal and cerebral small vessels share embryological origin and physiological features, making retinal vasculature a noninvasive indicator of the state of the brain’s microvasculature (32). Retinal microvascular abnormalities are associated with age-related brain pathology, including stroke and dementia (33–35) … study members with advanced Biological Age had older retinal vessels (narrower arterioles, r = −0.20, P < 0.001; wider venules, r = 0.17, P < 0.001).”

“… these biologically older study members were perceived to be older by independent observers.”

“Based on Pace of Aging analysis, we estimate that roughly 1/2 of the difference in Biological Age observed at chronological age 38 had accumulated over the past 12 y.”

“… our analysis was limited to a single cohort, and one that lacked ethnic minority populations. Replication in other cohorts is needed, in particular in samples including sufficient numbers of ethnic minority individuals to test the “weathering hypothesis” that the stresses of ethnic minority status accelerate aging.”

“Three Dunedin Study members had Pace of Aging less than zero, appearing to grow physiologically younger during their thirties.”

While I suspect measurement error is at play, I’d still like to know what those guys did.

America and the Long Term

[Map: CIA World map, 2004]

If there is some general effect of latitude on IQ, then I would not expect America to look, long-term, like Britain or France. Indeed, I’d expect the lower half of the US to eventually look more like North Africa, and the upper half to look more like Spain, Italy, and Turkey.

The US has historically been a land of great abundance–a land where a small founding population like the Amish might grow from 5,000 people in 1920 to over 290,000 people today.

One of the side effects of abundance has been lower infant mortality; indeed, one of the side effects of modernity has been low infant mortality.

In the Middle Ages, a foundling’s chances of surviving their first year were down around 10%. What did orphanages do without formula? (Goat’s milk, I suspect.) Disease was rampant. Land was dear. Even for the well-off, child mortality was high.

My great-great grandparents lost 6 or 7 children within their first week of life.

Things were pretty harsh. An infant mortality rate of 50% was not uncommon.

American abundance, warm climate, industrialization, and modern medicine/hygiene have all worked together to ensure that far more children survive–even those abandoned by one or more parents. (As someone who would have died 3 or 4 times over in infancy without modern medicine, I am not without some personal appreciation for this fact.)

I recently read an interesting post that I can’t find now that basically posited the theory that all of these extra surviving people running around are depressing the average IQ because they have little sub-optimal bits of genetic code that previously would have gotten them weeded out. There’s a decently strong correlation between intelligence and athleticism–not necessarily at the high end of intelligence, but it does appear at the high end of athleticism. Good athletes are smarter than bad athletes. Smart people, Hawking aside, are generally pretty healthy. For that matter, there are strikingly few fat people at the nation’s top universities. So it is not unreasonable to suspect that a few deleterious mutations that result in some wonky side effects in your kidneys or intestines might also cause some wonky side effects in your brain, which could make you dumber or just really fond of stuffed animals or something.

Okay, but this post is not actually about the theory that low infant mortality is turning us all into furries.

My theory is that America + Modernity => more children of single mothers surviving => long term changes in marriage/divorce rates => significant long term changes in the structure of society.

Historically, if we go a little further south to Sub-Saharan Africa, monogamy has not been a big thing. Why? Because the climate is generous enough that people don’t have to store up a ton of food for the winter, and women can do most of the food production to feed their children by themselves, or with the help of their extended kin networks. In these places, polygyny is far more common, since men do not need to bear the burden of providing for their own children.

As we head north, the winters get colder and the agricultural labor more intensive, and so the theory goes that women in the north could not provide for their children by themselves. And so Fantine, unwed, dies attempting to provide for her little Cosette, who would have died as well were it not for the ways of novels. The survivors were the men and women who managed to eke out a living together–married, basically monogamous.

But take away the dead Cosettes and Olivers–let them survive in more than just books–and what do we have? Children who, sooner or later, take after their parents. And even if one parent was faithful ’till death, the other certainly wasn’t.

Without any selective pressure on monogamy, monogamy evaporates. So now you can get a guy who has 34 children by 17 different women, and all of the children survive.

Meanwhile, neurotic types who want to make sure they have all of their career and personal ducks all lined up in a row “just can’t afford” a kid until they’re 38, have one if they’re lucky, and then call it quits.

Guess who inherits the future?

Those who show up, that’s who.

I suspect that the effects of low infant mortality have been accumulating for quite a while. Evolution can happen quite quickly if you radically change your selective parameters. For example, if you suddenly start killing white moths instead of grey ones, the moth population will get noticeably darker within a few generations, and future generations of moths will have far fewer white moths. If you then stop killing the white moths, white moths will again begin to proliferate. And if white moths start having even more babies than grey moths, soon you will have an awful lot of white moths.
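The moth story is a textbook selection model; here is a toy sketch with invented survival rates:

```python
# One generation of differential survival: white moths survive at rate
# s_white, grey moths at s_grey; survivors then reproduce equally.
def next_gen_white_freq(p_white, s_white, s_grey):
    white_survivors = p_white * s_white
    grey_survivors = (1 - p_white) * s_grey
    return white_survivors / (white_survivors + grey_survivors)

p = 0.5                            # start at half white, half grey
for _ in range(10):                # ten generations of killing white moths
    p = next_gen_white_freq(p, s_white=0.5, s_grey=0.9)
print(round(p, 3))                 # white moths are now rare
```

Flip the survival rates (or give white moths a fecundity edge) and the frequency climbs right back–the same relaxation of selection described here for monogamy.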

Long term, I expect one of the effects of abandoned children surviving is that the gene pool ends up with a lot more people who lack a genetic inclination toward monogamy. At first, these people will just be publicly shamed and life will continue looking relatively normal. But eventually, we should get to a tipping point where we have enough non-monogamous people that they begin advocating as a block and demanding divorce, public acceptance of non-marital sex, etc.

Another effect I would expect is a general “masculinization” of the women. Women who have to fend for themselves and raise their own children without help from their husband have no practical use for femininity, and the more masculine among them will be more likely to thrive. Wilting, feminine flowers will fade away, replaced by tough dames who “need a man like a fish needs a bicycle.”

Only time will tell if the future will belong to the Amish and the Duggars, or to Jay Williams’ progeny.

HBD and The Continuum Concept

A few years ago, I read a mystifying discussion on the subject of Sub-Saharan African development. Side A claimed that SSA was inferior because it had no significant development; Side B claimed that “development” was a cultural value that SSA cultures simply didn’t share. It is true enough that SSA has never had much in the way of “development”–cities were few and far between, and even today, some parts are virtually impassable. (This is a fantastic, wild story, btw, about a couple attempting to cross the DRC by truck. I strongly recommend it.) But how could valuing “development” be culturally relative? Didn’t everyone want development?

A couple of weeks later, I happened (by total coincidence) upon Liedloff’s The Continuum Concept. This is the kind of book that only tends to appeal to hippie parents, but if you’re interested in parenting from an evolutionary perspective, I recommend it. In the book, Liedloff goes to live in a “stone age” village in the Amazon Rainforest. At first she is annoyed by the difficulties of life in the village–for example, there’s no running water. Why don’t the people rig up some sort of system to bring running water to the village so they don’t have to trek down to the river every day?

Then Liedloff has a revelation: the villagers like walking down to the stream every day. It’s a pleasant walk, the stream is nice, and they enjoy having a swim together while they’re there. Is it any better to have running water if you’re less happy as a result?

This is what Side B meant. Not everyone wants to live in skyscrapers. Some people are perfectly happy in huts.

Genetics provides one explanation for why cultures are as they are; gene-culture co-evolution a more refined one. But you don’t have to believe in genetics to understand that cultures are a result of the people who make them.

People like to pretend that culture is nothing more than different clothes and fancy foods. This is Culture for Children, the sort of thing you see at an elementary school Culture Fair.

Food is nice, but that’s not what culture is. Culture is the sum of the personalities, values, even neuroses of the people involved. Some people are incredibly driven, super-hard workers. Some people are relaxed and easy going. Some are shy. Some are warm. Japan is Japan because the Japanese made it that way; the DRC is the DRC because the Congolese made it that way. No, the Japanese aren’t perfectly happy with their culture, and neither are the Congolese, and neither are we, but each is still the result of the people in it, and people generally want to keep the parts of their culture that are important to them.

We tend to assume that everyone out there secretly wants to be like us. If we just give them democracy, they’ll start acting like us, we think. If we let them immigrate, they’ll act like us. If we just send them to more school, they’ll start acting like us.

Then we are confused when they don’t.

To this day, the Indians are still pissed off that white people sent them to school to try to impart white culture to them. “Cultural genocide” they call it. And they have every right to be pissed–they didn’t want to be white. They had their own culture. They were perfectly happy with it.

So let them be them and you be you.

 

 

Women, Math, and the Y Chromosome

I was working on this post about how Les Mis is totally communist, but then I remembered this is a blog about evolution, not pop culture ramblings.

Women, math, and genetics.

Many people have wondered why mathematicians are disproportionately male. Some have wondered if Larry Summers got nudged out of being president of Harvard for saying it might just be biological.

Of course it’s biological.

Sex differences in math performance are probably just a side effect of the Y Chromosome.

Let’s back up a speck. First, let’s be clear what we’re talking about.

Last time I checked, women and men performed, on average, about equally well on highschool math. Little girls seem to do slightly better on elementary school math, but elementary school is largely a test of how long you can sit still, so that’s no mystery. But by highschool, the boys have gotten a little better at sitting, and the testing is probably a little more reliable. (See the Wikipedia for way more details.)

And yet, more men than women end up in lucrative, high-status math professorships.

I’m being sarcastic. Math is nerdy and low-status, so women avoid it like the plague except to complain that there aren’t enough women in it.

Anyway, you might be wondering how, if men and women have the same average ability, more men than women end up as math professors. The answer, of course, is that while there are more men than women at the extreme tail of high math ability, there are also more men than women at the extreme tail of low math ability.

After all, more men than women are retarded. Boys dominate special ed classes 2 to 1–that is, they are two-thirds of special ed students, and not just because they’re more aggressive.

Anyone who thinks there’s a vast male conspiracy to keep women out of those sexy, lucrative math jobs needs to explain why those same conspirators think so many little boys are retarded. If society is somehow magically convincing little girls that they suck at math, then it is doing an even better job of convincing little boys that they’re even worse. And which should we be most concerned about, society causing a slight dearth of women at the very top end of a profession that doesn’t pay very well, or a massive over-representation of boys among the retarded?

If society’s not to blame, then what else could cause men to both under and over-perform at math?

Their Y chromosomes.

You see, for women, every chromosome comes as part of a matched set. In the slightly simplified view, you have one eye-color gene from your mom, and one from your dad. Together, they determine your eye color. If one is wonky, the other at least is still there, functioning properly. This has a moderating effect on gene expression–you get fewer extremes.

But males only have one Y chromosome (and, for that matter, only one X). If something goes wrong with it, well, there’s not a lot the lone X chromosome is likely to do about it.

The result is that men show greater spread on a lot of traits that involve the Y chromosome. Height is an obvious example: while most men are taller than most women, men have a wider range of heights. Women are more narrowly clustered around their average, while men are more spread out:

We can figure out something else from this graph: men lie about their heights
source

Even allowing that some of these people are probably lying (some of those 5’7″ guys are probably actually 5’6″, and probably one of the 6’s is actually 5’11”,) there are far more women at 5’6″ than men at 5’10”. The men are more spread out, with more of them, therefore, at the tails of their distribution.

The Y chromosome contains the code that makes men taller than women, but since they only have one copy of this code, there’s nothing to moderate it. If they happen to get one gene for short, well, then they’re short. If they get one for tall, then they’re tall.

It’s the same with math. The Y chromosome has an effect on brain development (it must, otherwise male brains couldn’t create the sex hormones they need for proper genital development and function.) A woman who is lucky enough to get a good math gene from one of her parents has decent odds of getting a mediocre math gene from her other parent, bringing her back toward average. A woman who gets a particularly bad math gene is likely, again by chance, to get a better one from her other parent, again bringing her back toward average.

By contrast, a man is stuck, for better or worse, with one gene. If it’s a good gene, he’s good at math. If it’s a bad gene, he ends up in special ed.

(Note: in reality, there are a lot of genes involved, not just one or two. This is a simplified model to highlight how having a single, unmoderated copy increases individual genetic variation among men.)
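The variance argument above is easy to check numerically. Here is a toy simulation (my own illustration, not from any study), under the assumption that each “gene copy” contributes a normally distributed effect, with “women” averaging two independent copies and “men” stuck with one:

```python
import random
import statistics

random.seed(0)
N = 200_000

# Toy model: each "math gene" copy contributes a value drawn from a
# standard normal distribution (mean 0, sd 1). "Women" average two
# independent copies; "men" get a single, unmoderated copy.
women = [(random.gauss(0, 1) + random.gauss(0, 1)) / 2 for _ in range(N)]
men = [random.gauss(0, 1) for _ in range(N)]

print("sd women:", round(statistics.stdev(women), 3))  # ~0.707 (= 1/sqrt 2)
print("sd men:  ", round(statistics.stdev(men), 3))    # ~1.0

# Same average, but far more of the single-copy group lands in the
# extreme tails:
def tail(xs, cut=2.0):
    return sum(abs(x) > cut for x in xs) / len(xs)

print("fraction beyond +/-2 sd:", tail(women), "(women) vs", tail(men), "(men)")
```

Both groups have the same mean, but averaging two draws shrinks the spread by a factor of √2, so the single-draw group ends up roughly ten times more common beyond two standard deviations–at both ends.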

 

Why does any of this matter? It doesn’t, except that humans think it matters. There’s been a huge push, socially and legally, to force more women into fields where they aren’t yet 50% or greater. To the extent that math departments have been partially protected, it’s just because math, unlike medicine, is low-status and so not all that attractive to women; they just feel insulted by the claim that they’re bad at math.

Of course, women aren’t “bad at math.” For all of the normal sorts of math people do in everyday life, women and men are equally competent. And plenty of women are math professors–I know some personally. They are just less than 50% of math professors.

No one should be picking math professors based on gender. Male or female, pick ’em based on their math skills.

The beauty of math, the thing I love about it, is its objectivity. You can’t bullshit your way through math; culture doesn’t matter. An answer is correct or it is not. The other thing I love about math is that it is cheap. Of all the subjects, math requires the least $$$ to teach–as my relatives who lived through the Great Depression have impressed on me, reading requires heavy, expensive books (heavy is a concern when your penniless family is fleeing the Dust Bowl,) but you can do math with a stick and some dirt.

This is (among other things) why Asian immigrants do so well in math–it’s cheap, culture-independent, and objective. There are no environmental factors other than brain damage that can be reasonably argued to interfere with math performance.

Frankly, I think arguing about whether people are bad at something or inundating them with messages that essentially say, “Everyone thinks you’re bad at this, but don’t worry, it’s totally not true!” causes way more insecurity than just not saying anything and letting people just be.

Much of American advertising works like this: take something people weren’t thinking about at all, then go out of your way to tell them that of course they shouldn’t be concerned about it until they’re so concerned that they go buy your products.

Maybe we’d be better off not stressing out and just letting kids do their homework without imposing political ideas either way on them.

 

 

Whites like Goth and Metal because Whites are Depressives

On a global scale, poverty is probably a bigger predictor of suicide. But within the US there are some clear-looking racial differences in depression:

Actually, the interesting thing is just how non-suicidal blacks seem to be.

Yes, I know that suicide and depression aren’t the same thing. But I figure “depression” is kinda tricky to accurately document, (Is he really depressed, or just kinda bummed?), whereas suicide seems pretty reliable. And since whites and Asians probably have the best access to mental health care, the numbers probably aren’t being skewed by lack of Prozac among the poor.

I remember an article I read a year or two ago, but can’t find now, which found a correlation between depression and intelligence. More or less, the implication, as I interpreted it, is that “depression” is functionally a slowing down of the brain, and during intellectual tasks, people who could slow down and concentrate performed better–thus, concentrating and depression look rather similar.

There are other, additional possibilities: people from further north get depressed because it’s dark and cold all winter/as an adaptation to the winters, and so the Finns listen to a ton of Death Metal:

 

This came from Reddit, but I'm sure it's totally legit
Death Metal Bands Per Capita throughout the World

I don’t have a map for Goth music; does anyone listen to Goth anymore? Hot Topic seems to be doing fine at the mall.

Or maybe depression is an evolutionary adaptation to make people more peaceful and cooperative by internalizing their aggression instead of killing other people. Here the difference between whites and blacks seems like a point of evidence, since whites seem to kill themselves at higher rates than they kill others, while blacks kill others at higher rates than they kill themselves. Perhaps aggression/depression can be toggled on and off in some way, genetically or, in the case of folks with bi-polar, in a single individual.

Asians, I suspect, are also depressives, but have lower aggression than whites, so they don’t kill themselves very often. Also, I don’t know what kinds of music they like.

 

The Recent Development of High European IQ

You know what’s kind of awesome? Understanding the economic development level of virtually every country on earth becomes much easier as soon as you realize the massive correlation between per capita GDP and IQ–and it gets even better if you focus on verbal IQ or “smart fraction” verbal IQs:

Oh, there you are, correlation
Lifted gratefully from La Griffe du Lion‘s Smart Fraction II article
I do wonder why he made the graph so much bigger than the relevant part

La Griffe du Lion has a lot of great articles explaining phenomena via math, so if you haven’t read them already, I strongly recommend that you do.

One wonders what this data would look like if we looked backwards, at per capita GDP in, say, the 1500s to 1800s.

I really hope I can find a better graph (this one’s from Wikimedia)

 

Well, that's slightly better
Also from Wikimedia

According to the Guardian article about the paper British Economic Growth 1270-1870, the study “estimates that per capita income in England in the late middle ages was about $1,000 or £634 a year when compared with currency values in 1990.

“According to the World Bank, countries which had a per capita income of less than $1,000 last year included Ghana ($700), Cambodia ($650), Tanzania ($500), Ethiopia ($300) and Burundi ($150), while in India – one of the BRIC emerging economies – the gross income per capita stands only just above medieval levels at $1,180.”

Ah, here’s a decent graph:

I am so not digging the scale on this graph
From the Wikipedia page on India-EU relations

From the description of the graph:

“The %GDP of Western Europe in the chart is the region in Europe that includes the following modern countries – UK, France, Germany, Italy, Belgium, Switzerland, Denmark, Finland, Sweden, Norway, Netherlands, Portugal, Spain and other smaller states in the Western part of Europe.

The %GDP of Middle East in the chart is the region in West Asia and Northeast Africa that includes the following modern countries – Egypt, Israel, Palestinian Territories, Lebanon, Syria, Turkey, Jordan, Saudi Arabia, Qatar, Bahrain, Kuwait, UAE, Oman, Yemen, Iran, Iraq and other regions in the Arabian region.”

The problem with doing the graph this way is that it doesn’t control for population growth. Obviously the US expanded greatly in population between 1700 and 1950, crushing the rest of the world’s GDP by comparison, without anyone else necessarily getting any poorer. It would be nice if the graph included Africa, because I wonder how things like Mansa Musa’s gold mines would show up.

At any rate, here is my impression, which this graph basically seems to back up:

Around the time of the Romans, “Europe” and the Middle East had similar levels of development, integration into global economy, etc. The fall of the Roman Empire coincided with the Middle East pulling ahead in math, science, and nice-looking buildings.

Meanwhile, India and China were doing quite well for themselves, though it’s not clear from the graph how much of that is population. I would not be surprised to find similar numbers for per capita GDP at that time, though.

Then around 1000, Europe starts to improve while the Middle East falls behind and stays there. I suspect this is partly because cousin marriage became more common in the Middle East between 0 and 1000 while simultaneously becoming less common in Europe, and partly because the Middle East probably didn’t have much arable land left to expand into, so its population couldn’t increase very much. The Germans, meanwhile, started their big eastward migration about then–the Ostsiedlung (goodness, it took me a while to figure out how that’s spelled)–increasing the number of Europeans in our cohort and spurring growth.

(BTW…

One of my earlier theories was "I suspect Eastern Germany was settled after western Germany, due to personalities," which turns out to be true
Click for the bigger version )

India, meanwhile, went downhill for a long time, for I-have-no-idea-why reasons. China was doing great until quite recently, when it apparently went kaput. Why? I don’t know, but I think part of the effect is just Europe (and the US) suddenly pulling ahead, making China look less significant by comparison.

So. Extrapolating backwards from what we know about the correlation between GDP and verbal IQ, I suspect Western Europe experienced a massive increase in IQ between 1000 and 1900.

A large chunk of this increase was probably driven by the German eastward expansion, a rather major migration you’ve probably never heard of. (As HBD Chick says, “from a sociobiological point-of-view, probably the most underappreciated event in recent western european history. that and the reconquest of spain.”) Another large chunk was probably driven by various cultural factors unique to manorialism and Christianity.

Windmills began popping up in Western Europe in the late 1100s. (Given that they seem to have started in France, England, and Flanders, rather than in areas geographically closer to the Middle East, it seems unlikely that the European windmills were inspired by earlier Middle Eastern windmills; they were instead probably a fairly independent invention.)

Watermills were an earlier invention–the Classical Romans and Greeks had them. The Chinese and Middle Easterners had them, too, at that time. I don’t know how many mills they all had, but Europeans really took to them:

“At the time of the compilation of the Domesday Book (1086), there were 5,624 watermills in England alone, only 2% of which have not been located by modern archeological surveys. Later research estimates a less conservative number of 6,082, and it has been pointed out that this should be considered a minimum as the northern reaches of England were never properly recorded. In 1300, this number had risen to between 10,000 and 15,000. [Bold mine.] By the early 7th century, watermills were well established in Ireland, and began to spread from the former territory of the empire into the non-romanized parts of Germany a century later. Ship mills and tide mills were introduced in the 6th century.” (Wikipedia page on Watermills.)

In short, by the 1300s, Europe was well on its way toward industrialization.

IMO, these things combined to produce a land where the clever could get ahead and have more children than the non-clever, where those who could figure out a new use or more efficient milling design could profit.

Oh, look, here’s something relevant from HBD Chick, quoting Daniel Hannan’s article in the Telegraph:

“‘By 1200 Western Europe has a GDP per capita higher than most parts of the world, but (with two exceptions) by 1500 this number stops increasing. In both data sets the two exceptions are Netherlands and Great Britain. These North Sea economies experienced sustained GDP per capita growth for six straight centuries. The North Sea begins to diverge from the rest of Europe long before the “West” begins its more famous split from “the rest”. [W]e can pinpoint the beginning of this “little divergence” with greater detail. In 1348 Holland’s GDP per capita was $876. England’s was $777. In less than 60 years time Holland’s jumps to $1,245 and England’s to $1,090. The North Sea’s revolutionary divergence started at this time.’”

The result, I suspect, was an increase in average IQs of about 10 to 15 points–perhaps 20 points in specific sub-groups, eg Ashkenazi Jews–with an overall widening of the spread toward the top end.

Epigenetics

I remember when I first heard about epigenetics–the concept sounded awesome.

Now I cringe at the word.

To oversimplify, “epigenetics” refers to biological processes that help turn specific parts of the DNA on and off. For example, while nearly every cell in your body (except gametes, which carry only half, and mature red blood cells, which have none) has identical DNA, they obviously do different stuff. Eyeball cells and brain cells and muscle cells are all coded from the exact same DNA, but epigenetic factors make sure you don’t end up with muscles wiggling around in your eye sockets–or as an undifferentiated mass of slime.

If external environmental things can have epigenetic effects, I’d expect cancer to be a biggie, due to cell division and differentiation being epigenetic.

What epigenetics probably doesn’t do is everything people want it to do.

There’s a history, here, of people really wanting genetics to do things it doesn’t–to impose free will onto it.* Lamarck can be forgiven–we didn’t know about DNA back then. His theory was that an organism can pass on characteristics that it acquired during its lifetime to its offspring, thus driving evolution. The classic example given is that if a giraffe stretches its neck to reach leaves high up in the trees, its descendants will be born with long necks. It’s not a bad theory for a guy born in the mid 1700s, but science has advanced a bit since then.

The USSR put substantial resources into trying to make environmental effects show up in one’s descendants–including shooting anyone who disagreed.

Trofim Lysenko, a Soviet agronomist, claimed to be able to make wheat that would grow in winter–and pass on the trait to its offspring–by exposing the wheat seeds to cold. Of course, if that actually worked, Europeans would have developed cold-weather wheat thousands of years ago.

Lysenko was essentially the USSR’s version of an Affirmative Action hire:

“By the late 1920s, the Soviet political leaders had given their support to Lysenko. This support was a consequence, in part, of policies put in place by the Communist Party to rapidly promote members of the proletariat into leadership positions in agriculture, science and industry. Party officials were looking for promising candidates with backgrounds similar to Lysenko’s: born of a peasant family, without formal academic training or affiliations to the academic community.” (From the Wikipedia page on Lysenko)

In 1940, Lysenko became director of the USSR’s Academy of Science’s Institute of Genetics–a position he would hold until 1964. In 1948, scientific dissent from Lysenkoism was formally outlawed.

“From 1934 to 1940, under Lysenko’s admonitions and with Stalin’s approval, many geneticists were executed (including Isaak Agol, Solomon Levit, Grigorii Levitskii, Georgii Karpechenko and Georgii Nadson) or sent to labor camps. The famous Soviet geneticist Nikolai Vavilov was arrested in 1940 and died in prison in 1943. Hermann Joseph Muller (and his teachings about genetics) was criticized as a bourgeois, capitalist, imperialist, and promoting fascism so he left the USSR, to return to the USA via Republican Spain.

In 1948, genetics was officially declared “a bourgeois pseudoscience”; all geneticists were fired from their jobs (some were also arrested), and all genetic research was discontinued.”  (From the Wikipedia page on Lysenkoism.)

Alas, the Wikipedia does not tell me if anyone died from Lysenkoism itself, say, after their crops failed, but I hear the USSR didn’t have a great agricultural record.

Lysenko got kicked out in the 60s, but his theories have returned in the form of SJW-inspired claims of the magic of epigenetics to explain how any differences in average group performance or behavior are actually the fault of long-dead white people. Eg:

Trauma May be Woven into DNA of Native Americans, by Mary Pember

” The science of epigenetics, literally “above the gene,” proposes that we pass along more than DNA in our genes; it suggests that our genes can carry memories of trauma experienced by our ancestors and can influence how we react to trauma and stress.”

That’s a bold statement. At least Pember is making Walker’s argument for him.

Of course, that’s not actually what epigenetics says, but I’ll get to that in a bit.

“The Academy of Pediatrics reports that the way genes work in our bodies determines neuroendocrine structure and is strongly influenced by experience.”

That’s an interesting source. While I am sure the A of P knows its stuff, their specialty is medical care for small children, not genetics. Why did Pember not use an authority on genetics?

Note: when thinking about whether or not to trust an article’s science claims, consider the sources they use. If they don’t cite a source or cite an unusual, obscure, or less-than-authoritative source, then there’s a good chance they are lying or cherry-picking data to make a claim that is not actually backed up by the bulk of findings in the field. Notice that Pember does not provide a link to the A of P’s report on the subject, nor provide any other information so that an interested reader can go read the full report.

Wikipedia is actually a decent source on most subjects. Not perfect, of course, but it is usually decent. If I were writing science articles for pay, I would have subscriptions to major science journals and devote part of my day to reading them, as that would be my job. Since I’m just a dude with a blog who doesn’t get paid and so can’t afford a lot of journal memberships and has to do a real job for most of the day, I use a lot of Wikipedia. Sorry.

Also, I just want to note that the structure of this sentence is really wonky. “The way genes work in our bodies”? As opposed to how they work outside of our bodies? Do I have a bunch of DNA running around building neurotransmitters in the carpet or something? Written properly, this sentence would read, “According to the A of P, genes determine neuroendocrine structures, in a process strongly influenced by experience.”

Pember continues:

“Trauma experienced by earlier generations can influence the structure of our genes, making them more likely to “switch on” negative responses to stress and trauma.”

Pember does not clarify whether she is continuing to cite from the A of P, or just giving her own opinions. The structure of the paragraph implies that this statement comes from the A of P, but again, no link to the original source is given, so I am hard pressed to figure out which it is.

At any rate, this doesn’t sound like something the A of P would say, because it is obviously and blatantly incorrect. Trauma *may* affect the structure of one’s epigenetics, but not the structure of one’s genes. The difference is rather large. Viruses and ionizing radiation can change the structure of your DNA, but “trauma” won’t.

” The now famous 1998 ACES study conducted by the Centers for Disease Control (CDC) and Kaiser Permanente showed that such adverse experiences could contribute to mental and physical illness.”

Um, no shit? Is this one of those cases of paying smart people tons of money to tell us grass is green and sky is blue? Also, that’s a really funny definition of “famous.” Looks like the author is trying to claim her sources have more authority than they actually do.

“Folks in Indian country wonder what took science so long to catch up with traditional Native knowledge.”

I’m pretty sure practically everyone already knew this.

“According to Bitsoi, epigenetics is beginning to uncover scientific proof that intergenerational trauma is real. Historical trauma, therefore, can be seen as a contributing cause in the development of illnesses such as PTSD, depression and type 2 diabetes.”

Okay, do you know what epigenetics actually shows?

The experiment Wikipedia cites is of male mice who were trained to fear a certain smell by giving them small electric shocks when they smelled the smell. The children of these mice, conceived after the foot-shocking was finished, startled in response to the smell–they had inherited their father’s epigenetic markers that enhanced their response to that specific smell.

It’s a big jump from “mice startle at smells” to “causes PTSD.” This is a big jump in particular because of two things:

1. Your epigenetics change all the time. It’s like learning. You don’t just learn one thing and then have this one thing you’ve learned stuck in your head for the entire rest of your life, unable to learn anything new. Your epigenetics change in response to life circumstances throughout your entire life.

Eg, (from the Wikipedia):

“One of the first high-throughput studies of epigenetic differences between monozygotic twins focused in comparing global and locus-specific changes in DNA methylation and histone modifications in a sample of 40 monozygotic twin pairs. In this case, only healthy twin pairs were studied, but a wide range of ages was represented, between 3 and 74 years. One of the major conclusions from this study was that there is an age-dependent accumulation of epigenetic differences between the two siblings of twin pairs. This accumulation suggests the existence of epigenetic “drift”.

In other words, when identical twins are babies, they have very similar epigenetics. As they get older, their epigenetics get more and more different because they have had different experiences out in the world, and their experiences have changed their epigenetics. Your epigenetics change as you age.

Which means that the chances of the exact same epigenetics being passed down from father to child over many generations are essentially zilch.
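That “essentially zilch” follows from simple compounding. A back-of-the-envelope sketch (the retention probability here is an assumed, purely illustrative number, not a measured rate): if each epigenetic mark independently survives transmission to the next generation with probability p, the chance it is still intact after g generations is p to the power g, which decays exponentially.

```python
# Toy model with an assumed, illustrative retention probability --
# not a measured biological rate.
def survival(p: float, g: int) -> float:
    """Chance a mark survives g generations if each transmission
    independently preserves it with probability p."""
    return p ** g

for g in (1, 2, 5, 10):
    print(g, "generations:", survival(0.5, g))
```

With p = 0.5, barely one mark in a thousand would survive ten generations; even a generous p = 0.9 leaves only about a third intact after ten. Persistent multi-generation inheritance requires near-perfect copying, which is exactly what the twin-drift data argue against.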

2. Tons of populations have experienced trauma. If you go back far enough in anyone’s family tree, you can probably find someone who has experienced trauma. My grandparents went through trauma during the Great Depression and WWII. My biological parents were both traumatized as children. So have millions, perhaps billions of other people on this earth. If trauma gets encoded in people’s DNA (or their epigenetics,) then it’s encoded in virtually every person on the face of this planet.

Type 2 Diabetes, Depression, and PTSD are not evenly distributed across the planet. Hell, they aren’t even common in all peoples who have had recent, large oppression events. African Americans have low levels of depression and commit suicide at much lower rates than whites–have white Americans suffered more oppression than black Americans? Whites commit suicide at a higher rate than Indians–have the whites suffered more historical trauma? On a global scale, Israel has a relatively low suicide rate–lower than India’s. Did India recently experience some tragedy worse than the Holocaust? (See yesterday’s post for all stats.)

Type 2 Diabetes reaches its global maximum in Saudi Arabia, Oman, and the UAE, which as far as I know have not been particularly traumatized lately, and is much lower among Holocaust descendants in nearby Israel:

From a BBC article on obesity

It’s also very low in Sub-Saharan Africa, even though all of the stuff that causes “intergenerational trauma” probably happened there in spades. Have Americans been traumatized more than the Congolese?

This map doesn’t make any sense from the POV of historical trauma. It makes perfect sense if you know who’s eating fatty Western diets they aren’t adapted to. Saudi Arabia and the UAE are fucking rich (I bet Oman is, too,) and their population of nomadic goat herders has settled down to eat all the cake they want. The former nomadic lifestyle did not equip them to digest lots of refined grains, which are hard to grow in the desert. Most of Africa (and Yemen) is too poor to gorge on enough food to get Type-2 Diabetes; China and Mongolia have stuck to their traditional diets, to which they are well adapted. Mexicans are probably not adapted to wheat. The former Soviet countries have probably adopted Western diets. Etc., etc.

Why bring up Type-2 Diabetes at all? Well, it appears Indians get Type-2 Diabetes at about the same rate as Mexicans, [Note: PDF] probably for the exact same reasons: their ancestors didn’t eat a lot of wheat, refined sugar, and refined fats, and so they aren’t adapted to the Western diet. (FWIW, White Americans aren’t all that well adapted to the Western Diet, either.)

Everybody who isn’t adapted to the Western Diet gets high rates of diabetes and obesity if they start eating it, whether they had historical trauma or not. We don’t need epigenetic trauma to explain this.

“The researchers found that Native peoples have high rates of ACE’s and health problems such as posttraumatic stress, depression and substance abuse, diabetes all linked with methylation of genes regulating the body’s response to stress. “The persistence of stress associated with discrimination and historical trauma converges to add immeasurably to these challenges,” the researchers wrote.

Since there is a dearth of studies examining these findings, the researchers stated they were unable to conclude a direct cause between epigenetics and high rates of certain diseases among Native Americans.”

There’s a dearth of studies due to it being really immoral to purposefully traumatize humans and then breed them to see if their kids come out fucked up. Luckily for us, (or not luckily, depending on how you look at it,) however, humans have been traumatizing each other for ages, so we can just look at actually traumatized populations. There does seem to be an effect down the road for people whose parents or grandparents went through famines, but, “the effects could last for two generations.”

As horrible as the treatment of the Indians has been, I am pretty sure they didn’t go through a famine two generations ago on the order of what happened when the Nazis occupied the Netherlands and 18,000–22,000 people starved.

In other words, there’s no evidence of any long-term epigenetic effects large enough to create the effects they’re claiming. As I’ve said, if epigenetics actually acted like that, virtually everyone on earth would show the effects.

The reason they don’t is because epigenetic effects are relatively short-lived. Your epigenetics get re-written throughout your lifetime.

” Researchers such as Shannon Sullivan, professor of philosophy at UNC Charlotte, suggests in her article “Inheriting Racist Disparities in Health: Epigenetics and the Transgenerational Effects of White Racism,” that the science has faint echoes of eugenics, the social movement claiming to improve genetic features of humans through selective breeding and sterilization.”

I’m glad the philosophers are weighing in on science. I am sure philosophers know all about genetics. Hey, remember what I said about citing sources that are actual authorities on the subject at hand? My cousin Bob has all sorts of things to say about epigenetics, but that doesn’t mean his opinions are worth sharing.

The article ends:

“Isolating and nurturing a resilience gene may well be on the horizon.”

How do you nurture a gene?

 

There are things that epigenetics does. Just not the things people want it to do.