Live Fast, Die Young: The amazing correlation between self-control and not dying

Impulsive people die younger than non-impulsive people, so much so that how your teacher rated you as a student back when you were a kid is actually a decent predictor of your later mortality.

The first two probable reasons for this are obvious:

1. They do risky things like drive too fast, hold up conbinis, or take drugs, all of which can actually kill you.

2. They engage in behaviors with potentially negative long-term consequences, like eating too many donuts or failing out of school and having to do a crappy job with bad safety precautions.

But the third reason is less obvious, unless you’re Jayman:

3. There is no point to planning for the future if you’re going to die young anyway.

Some people come from long-lived people. They have genes that will help keep them alive for a very long time. Some people don’t. These people live fast. They walk earlier, they mature earlier, they get pregnant earlier, and they die earlier. Everything they do is on a shorter timeframe than the long-lived people. To them, they aren’t impulsive–everyone else is just agonizingly slow.

Why save for retirement if you’re not going to live that long?

Impulsive people are like normal people, just sped up.

Just about the best thing I could find today (light and BMI):

“The results of this study demonstrate that the timing of even moderate intensity light exposure is independently associated with BMI. Specifically, having a majority of the average daily light exposure above 500 lux (MLiT500) earlier in the day was associated with a lower BMI. In practical terms, for every hour later of MLiT500 in the day, there was a 1.28 unit increase in BMI. The complete regression model (MLiT500, age, gender, season, activity level, sleep duration and sleep midpoint) accounted for 34.7% of the variance in BMI. Of the variables we explored, MLiT500 contributed the largest portion of the variance (20%).”

From “Timing and Intensity of Light Correlate with Body Weight in Adults” by Kathryn J. Reid, Giovanni Santostasi, Kelly G. Baron, John Wilson, Joseph Kang, and Phyllis C. Zee.
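Just to make the coefficient concrete, here’s a quick toy sketch (Python, entirely made-up data–the 1.28 slope is plugged in to mimic the paper’s number, nothing here is their actual dataset):

```python
import numpy as np

# Synthetic illustration only: BMI vs. MLiT500 (the hour of day around which
# most of your >500-lux light exposure falls). The 1.28 slope mimics the
# paper's reported effect; the data points themselves are invented.
rng = np.random.default_rng(0)
mlit500 = rng.uniform(7, 15, size=200)              # hours after midnight
bmi = 12 + 1.28 * mlit500 + rng.normal(0, 2, 200)   # later light -> higher BMI

slope, intercept = np.polyfit(mlit500, bmi, 1)
print(f"Estimated slope: {slope:.2f} BMI units per hour of later light")
```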

Hey, DNA: What is it good for?

So why do we still have bits of Neanderthal DNA hanging around after so many years? Of course it could just be random junk, but it’s more fun to think that it might be useful.

And the obvious useful thing for it to do is climate adaptation, since Neanderthals had been living in dark, cold, ice-age Europe for much longer than the newly-arrived H. sapiens, and so might have had some adaptations to help deal with it.

Okay, so here is something related I was reading the other day, that I consider pretty interesting. So it looks like the people who live up on the Tibetan Plateau (like the Tibetans,) are really well-adapted to the altitude. No mean feat, considering that other populations who live at similar altitudes don’t seem to be as well-adapted, despite living up there for similar lengths of time.

Well, now it appears that the Tibetans have actually been living in Tibet for waaaay longer than expected, because the original H. sapiens who moved into Tibet intermarried with archaic hominids who had already lived there for hundreds of thousands of years, and so probably picked up their altitude adaptations from those guys.

BTW, “species” is a social construct and you probably shouldn’t bother with it here.

So what kind of useful stuff might we have picked up from Neanderthals?

First I’d like to interject that I still find declarations of “aha, we got this gene from Neanderthals and it does this!” to be speculative and prone to change. All of the articles I’ve read tend to report the same list of stuff in a similar fashion, so I suspect they’re all working off one or two sources, which makes everything doubly sketchy. So we’re going in here with a big “if” this is true…

Some of the results are fairly boring, like Neanderthal DNA affecting hair and skin. We already speculate that skin tone helps us deal with sunlight levels, so that’s sensible.

More interesting is the claim that Neanderthal DNA may predispose people to Type-2 Diabetes and depression.

Now why the hell would it do that? It’s probably not *just* random–after all, large stretches of DNA have little to no Neanderthal admixture at all, suggesting that genes in those spots just weren’t useful, so why would we have retained such apparently negative traits?

Maybe, like sickle cell anemia, these things actually have a positive function–at least in the right environments.

I read a fascinating theory a few years ago that Type 2 Diabetes and Seasonal Affective Disorder are actually just part of our bodies’ natural mechanisms for dealing with winter. Basically, you’re supposed to eat plants and get fat all summer long, while plants are available, and then by winter, your ability to absorb more glucose shuts down (there’s no point since the plants are all dead) and you switch over to burning ketones instead and eating an all-mammoth diet.

(Some groups, like the Inuit and Masai, historically [and may today still] survived on diets that included virtually no plants and so ran all of their cellular energy needs through the ketogenic instead of the glucose system.)

During this winter time, humans, like other animals, slowed down and semi-hibernated to save energy and because why the fuck not, it’s dark and no one has invented lightbulbs, yet.

By spring, you’ve lost a lot of weight, the plants come back, and so does your ability to use glucose.

This theory is laid out in the book Lights Out by T. S. Wiley, if you’re curious. I thought it was a really interesting book, but you might just think it’s all crank, I dunno.

Anyway, a big hole in Wiley’s plot is how we actually got this adaptation in the first place, since it’s a pretty complicated one and H. sapiens hasn’t actually been living in places with winter for all that long. Wiley just claims that it’s a deep internal mechanism that animals have, which always struck me as kinda bunk because why would a species that evolved in Africa, from other animals in Africa, etc., probably going back for millions upon millions of years, have some sort of complicated system like this still functional in its genome? A trait that is not undergoing positive selective pressure is probably going to become non-functional pretty quickly. But the theory was cool enough otherwise to ignore this bit, so I’ve kept it around.

Right, so here’s the (potential) answer: H. sapiens didn’t have this adaptation hiding deep inside of them, Neanderthals had it. Neanderthals had been living in cold places for way, way longer than H. sapiens, and by inter-breeding with them, we got to take advantage of a bunch of cold-weather adaptations they’d developed over that time frame–thus getting a jump-start on evolving to cope with the weather.

At any rate, if Wiley is correct, and SAD and Type-2 Diabetes are actually part of a dealing-with-winter complex that benefited our cold-weather ancestors, then that would explain why these genes would have persisted over the years instead of being bred out.

An easy way to test this would be to compare rates of Type-2 Diabetes and SAD among African immigrants to Europe/other wintery latitudes, African Americans (who have a small amount of Euro admixture,) and Europeans. (Watching out, of course, for Vit D issues.) If the Euros have more SAD and Type-2 Diabetes than Africans living at the same latitude, then those would appear to be adaptations to the latitude. If the Africans have more, then my theory fails.
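In case anyone wants to picture the actual test, here’s a rough sketch of the comparison–the counts below are placeholders I made up, not real epidemiological data:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts, NOT real data: people with and without SAD in each
# group, all living at the same northern latitude.
#                      [has SAD, no SAD]
groups = {
    "African immigrants": [30, 970],
    "African Americans":  [45, 955],
    "Europeans":          [80, 920],
}

table = list(groups.values())
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
# If Europeans show higher SAD/T2D prevalence at the same latitude, that's
# consistent with the winter-adaptation idea; if the African groups show
# more, the theory fails (as noted above).
```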

Species is a Social Construct: Or my Grandfather’s Totally Badass Dog

[Image: Coydogs, Wyoming]

My grandfather was a badass kind of guy, so of course his dogs were awesome, too.

He lived in a part of the country where coyotes were still a problem for livestock producers (it’s always a bummer when your favorite chicken gets eaten,) so he got this German Shepherd.

The German Shepherd proceeded to kill all the male coyotes in the area.

The next spring, we kept spotting half-German Shepherd, half-coyote pups.


Unlike mules, coydogs are fertile, and can continue making more generations of coydogs–or whatever they happen to mate with. In fact, it appears that most species of the Canis genus–various wolves, domestic dogs, dingoes, coyotes, and some jackals–can interbreed. Foxes and other less closely related members of the family Canidae, however, cannot interbreed with Canis species–they have different numbers of chromosomes, which makes the genetics not really work.

(This is what is up with mules, btw. Horses and donkeys have different numbers of chromosomes.)

The history of different canid species actually gets kinda complicated when you look at the inter-species mixing. According to the Wikipedia:

“…melanistic coyotes have been shown to have inherited their black pelts from dogs likely brought to North America through the Bering Land Bridge 12,000 to 14,000 years ago by the ancestors of the Americas’ indigenous people.”

“Northern Canada’s Aboriginal populations were mating coyotes and wolves to their sled dogs in order to produce more resilient animals as late as the early 20th century.”

(Well that explains the wolf admixture in National Geographic’s article on dog genetics! I’ve been wondering about that.)

“Some 15% of 10,000 coyotes taken annually in Illinois for their fur during the early 1980s may have been coydogs based on cranial measurements… Of 379 wild canid skulls taken in Ohio from 1982 to 1988, 10 (2.6%) were found to be coydogs.”

From the article on coyotes:

“Coyotes have hybridized with wolves to varying degrees, particularly in the Eastern United States and Canada. The so-called “eastern coyote” of northeastern North America has been confirmed to be of mixed wolf-coyote parentage, and probably originated in the aftermath of the extermination of wolves in the northeast, thus allowing coyotes to colonize former wolf ranges and mix with remnant wolf populations.”

“In 2011, an analysis of 48,000 SNP chips in the genomes of various wolf and coyote populations revealed that the eastern wolf …and the red wolf… both previously labeled as species distinct from the gray wolf, are in fact products of varying degrees of wolf-coyote hybridization. The wolf-coyote admixture resulting in the development of the eastern wolf may have occurred on the order of 600–900 years ago between gray wolves and a now extinct pre-Columbian coyote population. The eastern wolf has since backcrossed extensively with parent gray wolf populations. The red wolf may have originated later, approximately 287–430 years ago, when much of the southeastern U.S. was being converted to agriculture and predators were targeted for extermination. During this period, declining local wolf populations would have been forced to mate with coyotes, with the resulting hybrids backcrossing to coyotes as the wolves disappeared, to the extent that ~75–80% of the modern red wolf’s genome is of coyote derivation.”

And jackals:

“…since 1975, Russian scientists have bred quarter jackal hybrids, initially from jackals and Lapponian Herder reindeer herding dogs, called Sulimov dogs in order to take advantage of the jackal’s superior olfactory abilities combined with the Lapponian Herder’s resistance to cold. They are owned by Aeroflot – Russian Airlines and trained as sniffer dogs for use in airports. According to the breed’s creator, first-generation hybrid pups could only be produced by male dogs and female jackals, as male jackals refused to mate with female dogs.”

Has Christianity Selected for an Atheistic Upper Class?

I’ve been trying for a while to figure out when atheism became mainstream in the West. Sometimes I answer, “Around the end of the English Civil War,” (1650) and sometimes I answer, “Late 1980s/early 1990s.”

Medieval Europeans seem to have been pretty solidly Christian–probably about as Christian as modern Muslims are Muslim.

Modern Westerners are highly atheistic–even many of the “Christians”. So what happened?

I speculate that the upper classes in France, Britain, and the Colonies (and probably that-which-would-become-Germany and a few other places I’m less familiar with, like the Netherlands,) were largely atheistic by the 1700s. Look at the writings of the Enlightenment philosophers, the behavior of the French nobility, the English distrust of any kind of religious “enthusiasm,” German bishops actively encouraging Jewish settlement in their cities and attempting to protect the Jews from angry peasant mobs, various laws outlawing or greatly limiting religious power passed during the French Revolution, the deism of the Founding Fathers, etc.

By contrast, the lower classes in NW Europe and especially America retained their belief for far longer–a few isolated pockets of belief surviving even into the present. For example, see the Pilgrims, the Counter-Revolution in the Vendee, maybe German peasants, televangelists in the 80s, blue laws, and Appalachian snake handlers in the ’50s, etc.

So how did that happen? I propose that the upper class and lower class followed different evolutionary trajectories (due to different conditions), with strong religiosity basically already selected out by the 1700s, meaning the relevant selection period is roughly 500-1700, not post-1700s.

During this time, the dominant religion was Catholicism, and Catholicism has generally forbidden its priests, monks, nuns, etc., from getting married and having children since somewhere around the 300s or 400s. (With varying levels of success.)

Who got to be an official member of the Church hierarchy? Members of the upper class. Peasants couldn’t afford to do jobs that didn’t involve growing food, and upper class people weren’t going to accept peasants as religious authorities with power over their eternal souls, anyway. Many (perhaps most) of the people who joined the church were compelled at least in part by economic necessity–lack of arable land and strict inheritance laws meant that a family’s younger sons and daughters would not have the resources for marriage and family formation, anyway, and so these excess children were shunted off to monasteries.

There was another option for younger sons: the army. (Not such a good option for younger daughters.) Folks in the army probably did have children; you can imagine the details.

So we can imagine that, given the option between the army and the Church, those among the upper class with more devout inclinations probably chose the Church. And given a few hundred years of your most devout people leaving no children (and little genetic inflow from the lower classes,) the net result would be a general decrease in the % of genes in your population that contribute to a highly religious outlook.

(This assumes, of course, that religiosity can be selected for. I assume it can.)
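For a rough sense of how fast this could work, here is a toy simulation I threw together–my own back-of-the-envelope assumptions, not anything from a source: a heritable religiosity score, the most devout ~5% celibate each generation, and ~40 generations between 500 and 1700 at roughly 30 years per generation.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(generations=40, pop=10_000, celibate_frac=0.05, h2=0.4):
    """Toy breeder's-equation model: each generation the most 'religious'
    celibate_frac of the population joins the Church and leaves no children;
    everyone else mates at random. Offspring deviate from the population mean
    by h2 times their midparent deviation, plus environmental noise."""
    trait = rng.normal(0.0, 1.0, pop)          # standardized religiosity score
    for _ in range(generations):
        mu = trait.mean()
        cutoff = np.quantile(trait, 1.0 - celibate_frac)
        parents = trait[trait < cutoff]        # the devout top joins the Church
        midparent = rng.choice(parents, size=(pop, 2)).mean(axis=1)
        env_sd = np.sqrt(1.0 - h2**2 / 2.0)    # keeps total variance roughly 1
        trait = mu + h2 * (midparent - mu) + rng.normal(0.0, env_sd, pop)
    return trait.mean()

# ~40 generations is roughly the 500-1700 window at ~30 years per generation.
print(f"Mean religiosity shift: {simulate():.2f} standard deviations")
```

With these made-up parameters the mean drifts downward on the order of a standard deviation or two over forty generations, which is the general shape of the argument; whether the real parameters looked anything like this is, of course, the whole question.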

Since the lower classes cannot join the Church, we should see much more religiosity among them. (Other factors affected the lower classes, just not this one.) If anything, one might speculate that religiosity may have increased reproductive success for the lower classes, where it could have inspired family-friendly values like honesty, hard work, fidelity, not being a drunkard, etc. A hard-working, moderately devout young man or woman may have been seen as a better potential spouse by the folks arranging marriages than a non-devout person.

Religiosity probably persisted in the US for longer than in Europe because:
1. More religious people tended to move from Europe to America, leaving Europe less religious and America more;

2. The beneficial effects of being a devout person who could raise lots of children were enhanced by the availability of abundant natural resources, allowing these people to raise even more children. NW Europe has had very little new land opened up in the past thousand years, limiting everybody’s expansion. The European lower classes historically did not reproduce themselves (horrific levels of disease and malnutrition will do that to you), being gradually replaced by downwardly-mobile upper classes. (There are probably regions in which the lower classes did survive, of course;)

3. By the time we’re talking about America, we’re talking about Protestant denominations rather than Catholicism, and Protestants generally allow their clergy to marry.

Genetic Aristotelian Moderation

I suspect a lot of genetic traits (being that many involve the complex interaction of many different genes) are such that having a little bit of the trait is advantageous, but having too much (or conversely, too little) is negative.
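(This is just stabilizing selection. If you want to picture it before the concrete examples below, here’s a tiny sketch of a fitness curve that peaks at an intermediate trait value–the numbers are arbitrary, only the shape matters.)

```python
import numpy as np

def fitness(trait, optimum=0.0, width=1.0):
    """Gaussian stabilizing-selection curve: fitness peaks at an intermediate
    trait value and falls off toward either extreme."""
    return np.exp(-((trait - optimum) ** 2) / (2 * width ** 2))

# Arbitrary trait values in standard-deviation units:
for t in (-3, -1, 0, 1, 3):
    print(f"trait = {t:+d} SD  ->  relative fitness = {fitness(t):.2f}")
```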

A few obvious examples:

Aggression: too much, and you go to jail. Historically, prison conditions were awful enough in the West that this likely exerted an upper bound on the population’s criminality by eliminating violent people from the gene pool.

But too little, and you get taken advantage of. You can’t compete in job interviews, get promoted, make friends, or ask people out on dates. Aggressive people take your stuff, and you can’t protect against them.

From getting jobs to getting mates to not being mugged, a little bit of aggression is clearly a good thing.

Intelligence: High IQ is tremendously maladaptive in modern society. (This may always have been true.) The upper end of the IQ curve basically does not have children. (On both micro and macro levels.) I’m not prepared to say whether this is a bug or a feature.

But low IQ also appears to be maladaptive. This was certainly true historically in the West, where extremely high death rates and intense resource competition left the dumber members of society with few surviving offspring. Dumb people just have trouble accomplishing the things necessary for raising lots of children.

Somewhat above average IQ appears to be the most adaptive, at least in the present and possibly historically.

Height: Really tall men have health problems and die young. Really short men are considered undateable and so don’t have children. So the pressure is to be tall, but not too tall.

(Speculatively) Depression: Too much depression, and you commit suicide. Not enough, and you’re a happy-go-lucky person who drops out of school and punches people. Just enough, and you work hard, stay true to your spouse, don’t get into fights, and manage to do all of the boring stuff required by Western society. (Note: this could have changed in the past hundred years.)

Sickle Cell Anemia: I don’t think I need to explain this one.

(Also speculative) Tay-Sachs: Tay-Sachs is a horrible neurological disease that shows up in populations with evidence of very high recent pressure to increase IQ, such as Ashkenazim (one of the world’s highest-IQ groups) and Quebecois. There is therefore speculation that in its heterozygous form, Tay-Sachs may enhance neural development, instead of killing you hideously.

Cats are Cuckoos

“Cats win at Evolution” (from Cave People and Stuff) is a great article about the evolution of cats from primate-predator to primate-parasite.

A quote:

“The domestic cat – aka Felis catus or Felis silvestris catus – is the same size as a human infant.  Its mew sounds very similar to the cry of a newborn human baby.  It has a round face with big eyes, like a human infant.  The domestic cat has gone down the same evolutionary route as the cuckoo, only much further.  Its entire life-cycle involves imitating baby humans and being coddled, fed and protected as though it were a human infant.”

Fabulous post. Go read it all.

Increased gender dimorphism = lower IQ?

The short version of this is: if you could measure the relative gender dimorphism of people–say, by comparing siblings–and compare that to their IQ, I wager the more androgynous people would come out smarter.
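If someone actually had that data, the analysis itself would be trivial–here’s a purely hypothetical sketch, with the variable names, sample, and effect size all invented just to illustrate the predicted direction:

```python
import numpy as np
from scipy.stats import pearsonr

# Entirely hypothetical data: one brother-sister pair per family.
# 'dimorphism' = some standardized difference between the siblings on
# sexually dimorphic measures (height, face shape, etc.);
# 'iq' = the pair's mean IQ. The negative relationship is baked in here
# only to show what the prediction looks like, not to demonstrate it.
rng = np.random.default_rng(2)
n = 300
dimorphism = rng.normal(0, 1, n)
iq = 100 - 3 * dimorphism + rng.normal(0, 12, n)

r, p = pearsonr(dimorphism, iq)
print(f"r = {r:.2f}, p = {p:.3g}")  # prediction: r < 0 (more androgynous -> smarter)
```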

This began with Jayman’s Pioneer Hypothesis, which basically posits that frontier or pioneer environments will select for a certain suite of traits like aggressiveness and early menarche–that is, traits that let people quickly take over and fill up the land.

Based on this initial theory, I hypothesized that East Germany was settled later than West Germany–which turns out to be actually true. I was pretty stoked about that.

Anyway, earlier menarche => lower IQ (I’m pretty sure this is well documented), since shortening childhood = shortening the period of time your brain has to develop.

Raising the age of menarche gives your brain more time to develop. Environments where family formation has historically been difficult–i.e., very densely populated areas with little free land, where people might have to wait for their relatives to die before they can get their own farm–have likely evolved people who hit menarche later. (After all, there’s no need for early menarche in such an environment. See also: cave fish losing pigment because it’s not useful.) The opposite side of this coin is later menopause, but since these folks have lower fertility overall, I don’t think that’s a big factor.

Anyway, later menarche => more time for brains to develop => higher IQ.

I suppose the speculative part here is that late menarche populations are more androgynous and early menarche populations are less androgynous. This probably wouldn’t hold for all populations, but anecdotal experience with Americans seems consistent–eg, MIT students seem highly androgynous, while dumb people from elsewhere seem much more dimorphic. Actually, many of the extremely high-IQ people I’ve known have been trans*, as opposed to none of the dumb ones. Among dumb people, it seems perfectly normal for women to socialize in all-female groups, especially for activities like shopping and discussing celebrity gossip, while men find it normal to socialize in all-male groups, especially for activities like watching other grown men play keep-away (sports), drinking beer, and playing poker. (To be fair, though, I don’t have a lot of first-hand experience in the world of the dumb.)

Historically pioneer and historically densely settled populations probably end up with different notions of what is “normal” dimorphism, leading to lots of disputes as each side claims that its experiences are normal, without realizing that the other side’s experiences are normal for them, too.