Testosterone metabolization, autism, male brain, and female identity

I began this post intending to write about testosterone metabolization in autism and possible connections with transgender identity, but realized halfway through that I didn’t actually know whether the autist-trans connection was primarily male-to-female or female-to-male. I had assumed that the relevant population is primarily MtF because both autists and trans people are primarily male, but both groups do have female populations that are large enough to contribute significantly. Here’s a sample of the data I’ve found so far:

A study conducted by a team of British scientists in 2012 found that, in a pool of individuals not diagnosed on the autism spectrum, female-to-male (FTM) transgender people have higher rates of autistic features than do male-to-female (MTF) transgender people or cisgender males and females. Another study, which looked at children and adolescents admitted to a gender identity clinic in the Netherlands, found that almost 8 percent of subjects were also diagnosed with ASD.

Note that both of these studies are looking at trans people and assessing whether or not they have autism symptoms, not looking at autists and asking if they have trans symptoms. Given the characterization of autism as “extreme male brain” and that autism is diagnosed in males at about 4x the rate of females, the fact that there is some overlap between “women who think they think like men” and “traits associated with male thought patterns” is not surprising.

If the reported connection between autism and trans identity is just “autistic women feel like men,” that’s pretty non-mysterious and I just wasted an afternoon.

Though the data I have found so far still does not look directly at autists and ask how many of them have trans symptoms, the wikipedia page devoted to transgender and transsexual computer programmers lists only MtFs and no FtMs. Whether or not this pattern holds throughout the wider autism community, it definitely seems to be a thing among programmers. (Relevant discussion.)

So, returning to the original post:

Autism contains an amusing contradiction: on the one hand, autism is sometimes characterized as “extreme male brain,” and on the other hand, (some) autists (may be) more likely than neurotypicals to self-identify as transwomen–that is, biological men who see themselves as women. This seems contradictory: if autists are more masculine, mentally, than the average male, why don’t they identify as football players, army rangers, or something else equally masculine? For that matter, why isn’t a group with “extreme male brains” regarded as more, well, masculine?

(And if autists have extreme male brains, does that mean football players don’t? Do football players have more feminine brains than autists? Do colorless green ideas sleep furiously? DO WORDS MEAN?)


In favor of the “extreme male brain” hypothesis, we have evidence that testosterone is important for certain brain functions, like spatial recognition, and we have articles like this one, Testosterone and the brain:

Gender differences in spatial recognition, and age-related declines in cognition and mood, point towards testosterone as an important modulator of cerebral functions. Testosterone appears to activate a distributed cortical network, the ventral processing stream, during spatial cognition tasks, and addition of testosterone improves spatial cognition in younger and older hypogonadal men. In addition, reduced testosterone is associated with depressive disorders.

(Note that women also suffer depression at higher rates than men.)

So people with more testosterone are better at spatial cognition and other tasks that “autistic” brains typically excel at, and brains with less testosterone tend to be moody and depressed.

But hormones are tricky things. Where do they come from? Where do they go? How do we use them?

According to Wikipedia:

During the second trimester [of pregnancy], androgen level is associated with gender formation.[13] This period affects the femininization or masculinization of the fetus and can be a better predictor of feminine or masculine behaviours such as sex typed behaviour than an adult’s own levels. A mother’s testosterone level during pregnancy is correlated with her daughter’s sex-typical behavior as an adult, and the correlation is even stronger than with the daughter’s own adult testosterone level.[14]

… Early infancy androgen effects are the least understood. In the first weeks of life for male infants, testosterone levels rise. The levels remain in a pubertal range for a few months, but usually reach the barely detectable levels of childhood by 4–6 months of age.[15][16] The function of this rise in humans is unknown. It has been theorized that brain masculinization is occurring since no significant changes have been identified in other parts of the body.[17] The male brain is masculinized by the aromatization of testosterone into estrogen, which crosses the blood–brain barrier and enters the male brain, whereas female fetuses have α-fetoprotein, which binds the estrogen so that female brains are not affected.[18]

(Bold mine.)

Let’s re-read that: the male brain is masculinized by the aromatization of testosterone into estrogen.

If that’s not a weird sentence, I don’t know what is.

Let’s hop over to the scientific literature, e.g., Estrogen Actions in the Brain and the Basis for Differential Action in Men and Women: A Case for Sex-Specific Medicines:

Burgeoning evidence now documents profound effects of estrogens on learning, memory, and mood as well as neurodevelopmental and neurodegenerative processes. Most data derive from studies in females, but there is mounting recognition that estrogens play important roles in the male brain, where they can be generated from circulating testosterone by local aromatase enzymes or synthesized de novo by neurons and glia. Estrogen-based therapy therefore holds considerable promise for brain disorders that affect both men and women. However, as investigations are beginning to consider the role of estrogens in the male brain more carefully, it emerges that they have different, even opposite, effects as well as similar effects in male and female brains. This review focuses on these differences, including sex dimorphisms in the ability of estradiol to influence synaptic plasticity, neurotransmission, neurodegeneration, and cognition, which, we argue, are due in a large part to sex differences in the organization of the underlying circuitry.

Hypothesis: the way testosterone works in the brain (where we both do math and “feel” male or female) and the way it works in the muscles might be very different.

Do autists actually differ from other people in testosterone (or other hormone) levels?

In Elevated rates of testosterone-related disorders in women with autism spectrum conditions, researchers surveyed autistic women and mothers of autistic children about various testosterone-related medical conditions:

Compared to controls, significantly more women with ASC [Autism Spectrum Conditions] reported (a) hirsutism, (b) bisexuality or asexuality, (c) irregular menstrual cycle, (d) dysmenorrhea, (e) polycystic ovary syndrome, (f) severe acne, (g) epilepsy, (h) tomboyism, and (i) family history of ovarian, uterine, and prostate cancers, tumors, or growths. Compared to controls, significantly more mothers of ASC children reported (a) severe acne, (b) breast and uterine cancers, tumors, or growths, and (c) family history of ovarian and uterine cancers, tumors, or growths.

Androgenic Activity in Autism has an unfortunately low number of subjects (N=9) but their results are nonetheless intriguing:

Three of the children had exhibited explosive aggression against others (anger, broken objects, violence toward others). Three engaged in self-mutilations, and three demonstrated no aggression and were in a severe state of autistic withdrawal. The appearance of aggression against others was associated with having fewer of the main symptoms of autism (autistic withdrawal, stereotypies, language dysfunctions).

Three of their subjects (they don’t say which, but presumably from the first group,) had abnormally high testosterone levels (including one of the girls in the study.) The other six subjects had normal androgen levels.

This is the first report of an association between abnormally high androgenic activity and aggression in subjects with autism. Although a previously reported study did not find group mean elevations in plasma testosterone in prepubertal autistic subjects (4), it appears here that in certain autistic individuals, especially those in puberty, hyperandrogeny may play a role in aggressive behaviors. Also, there appear to be distinct clinical forms of autism that are based on aggressive behaviors and are not classified in DSM-IV. Our preliminary findings suggest that abnormally high plasma testosterone concentration is associated with aggression against others and having fewer of the main autistic symptoms.

So, some autists do have abnormally high testosterone levels, but those same autists are less autistic, overall, than other autists. More autistic behavior, aggression aside, is associated with normal hormone levels. Probably.

But of course that’s not fetal or early infancy testosterone levels. Unfortunately, it’s rather difficult to study fetal testosterone levels in autists, as few autists were diagnosed as fetuses. However, Foetal testosterone and autistic traits in 18 to 24-month-old children comes close:

Levels of FT [Fetal Testosterone] were analysed in amniotic fluid and compared with autistic traits, measured using the Quantitative Checklist for Autism in Toddlers (Q-CHAT) in 129 typically developing toddlers aged between 18 and 24 months (mean ± SD 19.25 ± 1.52 months). …

Sex differences were observed in Q-CHAT scores, with boys scoring significantly higher (indicating more autistic traits) than girls. In addition, we confirmed a significant positive relationship between FT levels and autistic traits.

I feel like this is veering into “we found that boys score higher on a test of male traits than girls did” territory, though.

In Polymorphisms in Genes Involved in Testosterone Metabolism in Slovak Autistic Boys, researchers found:

The present study evaluates androgen and estrogen levels in saliva as well as polymorphisms in genes for androgen receptor (AR), 5-alpha reductase (SRD5A2), and estrogen receptor alpha (ESR1) in the Slovak population of prepubertal (under 10 years) and pubertal (over 10 years) children with autism spectrum disorders. The examined prepubertal patients with autism, pubertal patients with autism, and prepubertal patients with Asperger syndrome had significantly increased levels of salivary testosterone (P < 0.05, P < 0.01, and P < 0.05, respectively) in comparison with control subjects. We found a lower number of (CAG)n repeats in the AR gene in boys with Asperger syndrome (P < 0.001). Autistic boys had an increased frequency of the T allele in the SRD5A2 gene in comparison with the control group. The frequencies of T and C alleles in ESR1 gene were comparable in all assessed groups.

What’s the significance of CAG repeats in the AR gene? Apparently they vary inversely with sensitivity to androgens:

Individuals with a lower number of CAG repeats exhibit higher AR gene expression levels and generate more functional AR receptors increasing their sensitivity to testosterone…

Fewer repeats, more sensitivity to androgens. The SRD5A2 gene is also involved in testosterone metabolization, though I’m not sure exactly what the T allele does relative to the other variants.

But just because there’s a lot of something in the blood (or saliva) doesn’t mean the body is using it. Diabetics can have high blood sugar because their bodies lack the insulin necessary to move the sugar from the blood into their cells. Fewer androgen receptors could mean the body is metabolizing testosterone less effectively, which in turn leaves more of it floating in the blood… Biology is complicated.

What about estrogen and the autistic brain? That gets really complicated. According to Sex Hormones in Autism: Androgens and Estrogens Differentially and Reciprocally Regulate RORA, a Novel Candidate Gene for Autism:

Here, we show that male and female hormones differentially regulate the expression of a novel autism candidate gene, retinoic acid-related orphan receptor-alpha (RORA) in a neuronal cell line, SH-SY5Y. In addition, we demonstrate that RORA transcriptionally regulates aromatase, an enzyme that converts testosterone to estrogen. We further show that aromatase protein is significantly reduced in the frontal cortex of autistic subjects relative to sex- and age-matched controls, and is strongly correlated with RORA protein levels in the brain.

If autists are bad at converting testosterone to estrogen, this could leave extra testosterone floating around in their blood… but doesn’t explain their supposed “extreme male brain.” Here’s another study on the same subject, since it’s confusing:

Comparing the brains of 13 children with and 13 children without autism spectrum disorder, the researchers found a 35 percent decrease in estrogen receptor beta expression as well as a 38 percent reduction in the amount of aromatase, the enzyme that converts testosterone to estrogen.

Levels of estrogen receptor beta proteins, the active molecules that result from gene expression and enable functions like brain protection, were similarly low. There was no discernable change in expression levels of estrogen receptor alpha, which mediates sexual behavior.

I don’t know if anyone has tried injecting RORA-deficient mice with estrogen, but here is a study about the effects of injecting reelin-deficient mice with estrogen:

The animals in the new studies, called ‘reeler’ mice, have one defective copy of the reelin gene and make about half the amount of reelin compared with controls. …

Reeler mice with one faulty copy serve as a model of one of the most well-established neuro-anatomical abnormalities in autism. Since the mid-1980s, scientists have known that people with autism have fewer Purkinje cells in the cerebellum than normal. These cells integrate information from throughout the cerebellum and relay it to other parts of the brain, particularly the cerebral cortex.

But there’s a twist: both male and female reeler mice have less reelin than control mice, but only the males lose Purkinje cells. …

In one of the studies, the researchers found that five days after birth, reeler mice have higher levels of testosterone in the cerebellum compared with genetically normal males [3].

Keller’s team then injected estradiol — a form of the female sex hormone estrogen — into the brains of 5-day-old mice. In the male reeler mice, this treatment increases reelin levels in the cerebellum and partially blocks Purkinje cell loss. Giving more estrogen to female reeler mice has no effect — but females injected with tamoxifen, an estrogen blocker, lose Purkinje cells. …

In another study, the researchers investigated the effects of reelin deficiency and estrogen treatment on cognitive flexibility — the ability to switch strategies to solve a problem [4]. …

“And we saw indeed that the reeler mice are slower to switch. They tend to persevere in the old strategy,” Keller says. However, male reeler mice treated with estrogen at 5 days old show improved cognitive flexibility as adults, suggesting that the estrogen has a long-term effect.

This still doesn’t explain why autists would self-identify as transgender women (mtf) at higher rates than average, but it does suggest that any who do start hormone therapy might receive benefits completely independent of gender identity.

Let’s stop and step back a moment.

Autism is, unfortunately, badly defined. As the saying goes, if you’ve met one autist, you’ve met one autist. There are probably a variety of different, complicated things going on in the brains of different autists simply because a variety of different, complicated conditions are all being lumped together under a single label. Any mental disability that can include both non-verbal people who can barely dress and feed themselves and require lifetime care and billionaires like Bill Gates is a very badly defined condition.

(Unfortunately, people diagnose autism with questionnaires that include questions like “Is the child pedantic?” which could be equally true of both an autistic child and a child who is merely very smart and has learned more about a particular subject than their peers and so is responding in more detail than the adult is used to.)

The average autistic person is not a programmer. Autism is a disability, and the average diagnosed autist is pretty darn disabled. Among the people who have jobs and friends but nonetheless share some symptoms with formally diagnosed autists, though, programmer and the like appear to be pretty popular professions.

Back in my day, we just called these folks nerds.

Here’s a theory from a completely different direction: People feel the differences between themselves and a group they are supposed to fit into and associate with a lot more strongly than the differences between themselves and a distant group. Growing up, you probably got into more conflicts with your siblings and parents than with random strangers, even though–or perhaps because–your family is nearly identical to you genetically, culturally, and environmentally. “I am nothing like my brother!” a man declares, while simultaneously affirming that there is a great deal in common between himself and members of a race and culture from the other side of the planet. Your coworker, someone specifically selected for the fact that they have similar mental and technical aptitudes and training as yourself, has a distinct list of traits that drive you nuts, from the way he staples papers to the way he pronounces his Ts, while the women of an obscure Afghan tribe of goat herders simply don’t enter your consciousness.

Nerds, somewhat by definition, don’t fit in. You don’t worry much about fitting into a group you’re not part of in the first place–you probably don’t worry much about whether or not you fit in with Melanesian fishermen–but most people work hard at fitting in with their own group.

So if you’re male, but you don’t fit in with other males (say, because you’re a nerd,) and you’re down at the bottom of the high school totem pole and feel like all of the women you’d like to date are judging you negatively next to the football players, then you might feel, rather strongly, the differences between you and other males. Other males are aggressive, they call you a faggot, they push you out of their spaces and threaten you with violence, and there’s very little you can do to respond besides retreat into your “nerd games.”

By contrast, women are polite to you, not aggressive, and don’t aggressively push you out of their spaces. Your differences with them are much less problematic, so you feel like you “fit in” with them.

(There is probably a similar dynamic at play with American men who are obsessed with anime. It’s not so much that they are truly into Japanese culture–which is mostly about quietly working hard–as that they don’t fit in very well with their own culture.) (Note: not intended as a knock on anime, which certainly has some good works.)

And here’s another theory: autists have some interesting difficulties with constructing categories and making inferences from data. They also have trouble going along with the crowd, and may have fewer “mirror neurons” than normal people. So maybe autists just process the categories of “male” and “female” a little differently than everyone else, and in a small subset of autists, this results in trans identity.*

And another: maybe there are certain intersex disorders which result in differences in brain wiring/organization. (Yes, there are real intersex disorders, like Klinefelter’s, in which people have XXY chromosomes instead of XX or XY.) In a small set of cases, these unusually wired brains may be extremely good at doing certain tasks (like programming), resulting in people who are both “autism spectrum” and “trans”. This is actually the theory I’ve been running with for years, though it is not incompatible with the hormonal theories discussed above.

But we are talking small: trans people of any sort are extremely rare, probably on the order of <1/1000. Even if autists were trans at 8 times the rates of non-autists, that’s still only 8/1000 or 1/125. Autists themselves are pretty rare (estimates vary, but the vast majority of people are not autistic at all,) so we are talking about a very small subset of a very small population in the first place. We only notice these correlations at all because the total population has gotten so huge.
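For a sense of scale, here’s a quick back-of-the-envelope sketch in Python. Every number in it (population size, autism rate, trans base rate, the 8x risk ratio) is an assumption for illustration, not a measured value:

```python
# Back-of-the-envelope sketch: how many trans autists would we expect?
# Every number here is an illustrative assumption, not a measured value.

population  = 320_000_000   # roughly the US population
autism_rate = 1 / 100       # assumed rate of autism diagnosis
trans_rate  = 1 / 1000      # assumed base rate of trans identity (<1/1000 above)
risk_ratio  = 8             # hypothetical: autists trans at 8x the base rate

autists       = population * autism_rate
trans_autists = autists * trans_rate * risk_ratio

print(f"autists:       ~{autists:,.0f}")        # ~3,200,000
print(f"trans autists: ~{trans_autists:,.0f}")  # ~25,600
```

Even with a generous risk ratio, the overlap group is a rounding error next to the total population, which is the point: correlations in groups this small only become visible because the total population is so huge.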

Sometimes, extremely rare things are random chance.


Book on a Friday: Squid Empire: The Rise and Fall of the Cephalopods by Danna Staaf

Danna Staaf’s Squid Empire: The Rise and Fall of the Cephalopods is about the evolution of squids and their relatives–nautiluses, cuttlefish, octopuses, ammonoids, etc. If you are really into squids or would like to learn more about squids, this is the book for you. If you aren’t big on reading about squids but want something that looks nice on your coffee table and matches your Cthulhu, Flying Spaghetti Monster, and 20,000 Leagues Under the Sea decor, this is the book for you. If you aren’t really into squids, you probably won’t enjoy this book.

Squids, octopuses, etc. are members of the class of cephalopods, just as you are a member of the class of mammals. Mammals are in the phylum of chordates; cephalopods are mollusks. It’s a surprising lineage for one of Earth’s smartest creatures–80% of mollusk species are slugs and snails. If you think you’re surrounded by idiots, imagine how squids must feel.

The short story of cephalopodic evolution is that millions upon millions of years ago, most life was still stuck at the bottom of the ocean. There were some giant microbial mats, some slugs, some snails, some worms, and not a whole lot else. One of those snails figured out how to float by removing some of the salt from the water inside its shell, making itself a bit buoyant. Soon after, its foot (all mollusks have a “foot”) split into multiple parts. The now-floating snail drifted over the seafloor, using its new tentacles to catch and eat the less-mobile creatures below it.

From here, cephalopods diversified dramatically, creating the famous ammonoids of fossil-dating lore.

Cross-section of a fossilized ammonite shell, revealing internal chambers and septa

Ammonoids are known primarily from their shells (which fossilize well) rather than their fleshy tentacle parts (which fossilize badly). But shells we have in such abundance that they can easily be used for dating other nearby fossils.

Ammonoids are obviously similar to their cousins, the lovely chambered nautiluses. (Please don’t buy nautilus shells; taking them out of their shells kills them and no one farms nautiluses so the shell trade is having a real impact on their numbers. We don’t need their shells, but they do.)

Ammonoids succeeded for millions of years, until the Cretaceous extinction event that also took out the dinosaurs. The nautiluses survived–perhaps, the author speculates, because their large, yolk-rich eggs develop very slowly: infant nautiluses were able to wait out the event, while ammonoids, whose tiny, fast-growing eggs depended on feeding immediately after hatching, simply starved in the upheaval.

In the aftermath, modern squids and octopuses proliferated.

How did we get from floating, shelled snails to today’s squishy squids?

The first step was internalization–cephalopods began growing their fleshy mantles over their shells instead of inside of them–in essence, these invertebrates became vertebrates. Perhaps this was some horrible genetic accident, but it worked out. These internalized shells gradually became smaller and thinner, until they were reduced to a flexible rod called a “pen” that runs the length of a squid’s mantle. (Cuttlefish still retain a more substantial bone, which is frequently collected on beaches and sold for birds to peck at for its calcium.)

With the loss of the buoyant shell, squids had to find another way to float. This they apparently achieved by filling themselves with ammonia salts, which makes them less dense than water but also makes their decomposition disgusting and renders them unfossilizable because they turn to mush too quickly. Octopuses, by contrast, aren’t full of ammonia and so can fossilize.

Since the book is devoted primarily to cephalopod evolution rather than modern cephalopods, it doesn’t go into much depth on the subject of their intelligence. Out of all the invertebrates, cephalopods are easily the most intelligent (perhaps really the only intelligent invertebrates). Why? If cephalopods didn’t exist, we might easily conclude that invertebrates can’t be intelligent–invertebrateness is somehow inimical to intelligence. After all, most invertebrates are about as intelligent as slugs. But cephalopods do exist, and they’re pretty smart.

The obvious answer is that cephalopods can move and are predatory, which requires bigger brains. But why are they the only invertebrates–apparently–who’ve accomplished the task?

But enough jabber–let’s let Mrs. Staaf speak:

I find myself obliged to address the perennial question: “octopuses” or “octopi”? Or, heaven help us, “octopodes”?

Whichever you like best. Seriously. Despite what you may have heard, “octopus” is neither ancient Greek nor Latin. Aristotle called the animal polypous for its “many feet.” The ancient Romans borrowed this word and latinized the spelling to polypus. It was much later that a Renaissance scientist coined and popularized the word “octopus,” using Greek roots for “eight” and “foot” but Latin spelling.

If the word had actually been Greek, it would be spelled octopous and pluralized octopodes. If translated into Latin, it might have become octopes and pluralized octopedes, but more likely the ancient Romans would have simply borrowed the Greek word–as they did with polypus. Those who perhaps wished to appear erudite used the Greek plural polypodes, while others favored a Latin ending and pluralized it polypi.

The latter is a tactic we English speakers emulate when we welcome “octopus” into our own language and pluralize it “octopuses” as I’ve chosen to do.

There. That settles it.

Dinosaurs are the poster children for evolution and extinction writ large…

Of course, not all of them did die. We know now that birds are simply modern dinosaurs, but out of habit we tend to reserve the word “dinosaur” for the huge ancient creatures that went extinct at the end of the Cretaceous. After all, even if they had feathers, they seem so different from today’s finches and robins. For one thing, the first flying feathered dinosaurs all seem to have had four wings. There aren’t any modern birds with four wings.

Well… actually, domestic pigeons can be bred to grow feathers on their legs. Not fuzzy down, but long flight feathers, and along with these feathers their leg bones grow more winglike. The legs are still legs; they can’t be used to fly like wings. They do, however, suggest a clear step along the road from four-winged dinosaurs to two-winged birds. The difference between pigeons with ordinary legs and pigeons with wing-legs is created by control switches in their DNA that alter the expression of two particular genes. These genes are found in all birds, indeed in all vertebrates, and so were most likely present in dinosaurs as well.

…and I’ve just discovered that almost all of my other bookmarks fell out of the book. Um.

So squid brains are shaped like donuts because their eating/jet propulsion tube runs through the middle of their bodies and thus through the middle of their brains. It seems like this could be a problem if the squid eats too much or eats something with sharp bits in it, but squids seem to manage.

Squids can also leap out of the water and fly through the air for some ways. Octopuses can carry water around in their mantles, allowing them to move on dry land for a few minutes without suffocating.

Since cephalopods are somewhat unique among mollusks for their ability to move quickly, they have a lot in common, genetically, with vertebrates. In essence, they are the most vertebrate-behaving of the mollusks. Convergent evolution.

The vampire squid, despite its name, is actually more of an octopus.

Let me quote from the chapter on sex and babies:

This is one arena in which cephalopods, both ancient and modern, are actually less alien than many aliens–even other mollusks. Slugs, for instance, are hermaphroditic, and in the course of impregnating each other their penises sometimes get tangled, so they chew them off. Nothing in the rest of this chapter will make you nearly that uncomfortable. …

The lovely argonaut octopus

In one living coleoid species, however, sex is blindingly obvious. Females of the octopus known as an argonaut are five times larger than males. (A killer whale is about five times larger than an average adult human, which in turn is about five times larger than an opossum.)

This enormous size differential caught the attention of paleontologists who had noticed that many ammonoid species also came in two distinct sizes, which they had dubbed microconch (little shell) and macroconch (big shell). Both were clearly mature, as they had completed the juvenile part of the shell and constructed the final adult living chamber. After an initial flurry of debate, most researchers agreed to model ammonoid sex on modern argonauts, and began to call macroconchs females and microconchs males.

Some fossil nautiloids also come in macroconch and microconch flavors, though it’s more difficult to be certain that both are adults…

However, the shells of modern nautiluses show the opposite pattern–males are somewhat larger than females… Like the nautiloid shift from ten arms to many tens of arms, the pattern could certainly have evolved from a different ancestral condition. If we’re going to make that argument, though, we have to wonder when nautiloids switched from females to males as the larger sex, and why.

In modern species that have larger females, we usually assume the size difference has to do with making or brooding a lot of eggs. Female argonauts take it up a notch and actually secrete a shell-like brood chamber from their arms, using it to cradle numerous batches of eggs over their lifetime. Meanwhile, each tiny male argonaut gets to mate only once. His hectocotylus is disposable–after being loaded with sperm and inserted into the female, it breaks off. …

By contrast, when males are the bigger sex, we often guess that the purpose is competition. Certainly many species of squid and cuttlefish have large males that battle for female attention on the mating grounds. They display outrageous skin patterns as they push, shove, and bite each other. Females do appear impressed; at least, they mate with the winning males and consent to be guarded by them. Even in these species, though, there are some small males who exhibit a totally different mating strategy. While the big males strut their stuff, these small males quietly sidle up to the females, sometimes disguising themselves with female color patterns. This doesn’t put off the real females, who readily mate with these aptly named “sneaker males.” By their very nature, such obfuscating tactics are virtually impossible to glean from the fossil record…

More on octopus mating habits.

This, of course, reminded me of this graph:

In the majority of countries, women are more likely to be overweight than men (suggesting that our measure of “overweight” is probably flawed.) In some countries women are much more likely to be overweight, while in some countries men and women are almost equally likely to be overweight, and in just a few–the Czech Republic, Germany, Hungary, Japan, and (barely) France–men are more likely to be overweight.

Is there any rhyme or reason to this pattern? Surely affluence is related, but Japan, for all of its affluence, has very few overweight people at all, while Egypt, which is pretty poor, has far more overweight people. (A greater % of Egyptian women are overweight than American women, but American men are more likely to be overweight than Egyptian men.)

Of course, male humans are still–in every country–larger than females. Even an overweight female doesn’t necessarily weigh more than a regular male. But could the variation in male and female obesity rates have anything to do with historic mating strategies? Or is it completely irrelevant?

Back to the book:

Coleoid eyes are as complex as our own, with a lens for focusing light, a retina to detect it, and an iris to sharpen the image. … Despite their common complexity, though, there are some striking differences [between our and squid eyes]. For example, our retina has a blind spot where a bundle of nerves enters the eyeball before spreading out to connect to the front of every light receptor. By contrast, light receptors in the coleoid retina are innervated from behind, so there’s no “hole” or blind spot. Structural differences like this show that the two groups converged on similar solutions through distinct evolutionary pathways.

Another significant difference is that fish went on to evolve color vision by increasing the variety of light-sensitive proteins in their eyes; coleoids never did and are probably color blind. I say “probably” because the idea of color blindness in such colorful animals has flummoxed generations of scientists…

“I’m really more of a cuddlefish”

Color-blind or not, coleoids can definitely see something we humans are blind to: the polarization of light.

Sunlight normally consists of waves vibrating in all directions, but when these waves are reflected off certain surfaces, like water, they get organized and arrive at the retina vibrating in only one direction. We call this “glare” and we don’t like it, so we invented polarized sunglasses. … That’s pretty much all polarized sunglasses can do–block polarized light. Sadly, they can’t help you decode the secret messages of cuttlefish, which have the ability to perform a sort of double-talk with their skin, making color camouflage for the benefit of polarization-blind predators while flashing polarized displays to their fellow cuttlefish.

That’s amazing. Here’s an article with more on cuttlefish vision and polarization.

Overall, I enjoyed this book. The writing isn’t the most thrilling, but the author has a sense of humor and a deep love for her subject. I recommend it to anyone with a serious hankering to know more about the evolution of squids, or who’d like to learn more about an ancient animal besides dinosaurs.

The Facsimile of Meaning

Most of the activities our ancestors spent the majority of their time on have been automated or largely replaced by technology. Chances are good that the majority of your great-great grandparents were farmers, but few of us today hunt, gather, plant, harvest, or otherwise spend our days physically producing food; few of us will ever build our own houses or even sew our own clothes.

Evolution has (probably) equipped us with neurofeedback loops that reward us for doing the sorts of things we need to do to survive, like hunt down prey or build shelters (even chimps build nests to sleep in,) but these are precisely the activities that we have largely automated and replaced. The closest analogues to these activities are now shopping, cooking, exercising, working on cars, and arts and crafts. (Even warfare has been largely replaced with professional sports fandom.)

Society has invented vicarious thrills: Books, movies, video games, even roller coasters. Our ability to administer vicarious emotions appears to be getting better and better.

And yet, it’s all kind of fake.

Exercising, for example, is in many ways a pointless activity–people literally buy machines so they can run in place. But if you have a job that requires you to be sedentary for most of the day and don’t fancy jogging around your neighborhood after dark, running in place inside your own home may be the best option you have for getting the post-running-down prey endorphin hit that evolution designed you to crave.

A sedentary lifestyle with supermarkets and restaurants deprives us of that successful-hunting endorphin hit and offers us no logical reason to go out and get it. But without that exercise, not only our physical health, but our mental health appears to suffer. According to the Mayo Clinic, exercise effectively decreases depression and anxiety–in other words, depression and anxiety may be caused in part by lack of exercise.

So what do we do? We have to make up some excuse and substitute faux exercise for the active farming/gardening/hunting/gathering lifestyles our ancestors lived.

By the way, about 20% of Americans are on psychiatric medications of some sort, [warning PDF] of which anti-depressants are one of the most commonly prescribed:

Overall, the number of Americans on medications used to treat psychological and behavioral disorders has substantially increased since 2001; more than one‐in‐five adults was on at least one of these medications in 2010, up 22 percent from ten years earlier. Women are far more likely to take a drug to treat a mental health condition than men, with more than a quarter of the adult female population on these drugs in 2010 as compared to 15 percent of men.

Women ages 45 and older showed the highest use of these drugs overall. …

The trends among children are opposite those of adults: boys are the higher utilizers of these medications overall but girls’ use has been increasing at a faster rate.

This is mind-boggling. 1 in 5 of us is mentally ill, (supposedly,) and the percent for young women in the “prime of their life” years is even higher. (The rates for Native Americans are astronomical.)

Lack of exercise isn’t the only problem, but I wager a decent chunk of it is that our lives have changed so radically over the past 100 years that we are critically lacking various activities that used to make us happy and provide meaning.

Take the rise of atheism. Irrespective of whether God exists or not, many functions–community events, socializing, charity, morality lessons, etc–have historically been done by religious groups. Atheists are working on replacements, but developing a full system that works without the compulsion of religious belief may take a long while.

Sports and video games replace war and personal competition. TV sitcoms replace friendship. Twitter replaces real life conversation. Politics replace friendship, conversation, and religion.

There’s something silly about most of these activities, and yet they seem to make us happy. I don’t think there’s anything wrong with enjoying knitting, even if you’re making toy octopuses instead of sweaters. Nor does there seem to be anything wrong with enjoying a movie or a game. The problem comes when people get addicted to these activities, which may be increasingly likely as our ability to make fake activities–like hyper-realistic special effects in movies–increases.

Given modernity, should we indulge? Or can we develop something better?


Angola and Atomization

Quick excerpt from God of the Rodeo: The Quest for Redemption in Louisiana’s Angola Prison:

Before the rodeo [Terry Hawkins] had graduated out of the fields to the position of fry cook. It was better than being A.D.H.D. (A Dude with a Hoe and a Ditch)–after stirring fried rice or flipping hotcakes on a stove ten feet long, he could grill hamburgers, bag them, and stuff them down his pants to sell in the dorm. Sometimes he snuck out with fried chicken under his shirt and cuts of cheese in his socks. Payment came in cigarettes, the prison’s currency. Later he would stand outside the canteen, and trade a few packs for shampoo or soap or deodorant, or “zoo-zos”–snacks of candy bars or sardines. He knew which guards would allow the stealing, the selling. He made sure to send them plates of fried chicken.

While reading this I thought, “This man has, at least, something to offer his neighbors. He can sell them food, something they’re grateful for. The guy with cheese in his socks and hamburgers in his pants is probably a respected member of his community.”

What do I have to offer my neighbors? I have skills, but they’re only of interest to a corporate employer, my boss. I don’t make anything for sale. I can’t raise a barn or train a horse, and even if I could, my neighbors don’t need these services. Even if I had milk for sale from my personal cow, my neighbors would still prefer to buy their milk at the grocery store.

All of these needs that we used to fill by interacting with our neighbors are now routed through multinational corporations that build their products in immense sweatshops in foreign countries.

I don’t even have to go to the store to buy things if I don’t want to–I can order things online, even groceries.

Beyond the economic, modern prosperity has also eliminated many of the ways (and places) people used to interact. As Lewis Mumford recounts (H/T Wrath of Gnon):

The Bible would have been different without public wells

To sum up the medieval dwelling house, one may say that it was characterized by lack of differentiated space and differentiated function. In the cities, however, this lack of internal differentiation was offset by a completer development of domestic functions in public institutions. Though the house might lack a private bake-oven, there was a public one at the baker’s or the cook-shop. Though it might lack a private bathroom, there was a municipal bath-house. Though it might lack facilities for isolating and nursing a diseased member, there were numerous public hospitals. … As long as the conditions were rude–when people lived in the open, pissed freely in the garden or the street, bought and sold outdoors, opened their shutters and let in full sunlight–the defects of the house were far less serious than they were under a more refined regime.

Without all of the little, daily things that naturally brought people into contact with each other and knit them into communities, we simply have far fewer reasons to talk. We might think that people could simply make up for these changes by inventing new, leisure-oriented reasons to interact with each other, but so far, they’re struggling:

Americans’ circle of confidants has shrunk dramatically in the past two decades and the number of people who say they have no one with whom to discuss important matters has more than doubled, according to a new study by sociologists at Duke University and the University of Arizona.

“The evidence shows that Americans have fewer confidants and those ties are also more family-based than they used to be,” said Lynn Smith-Lovin, Robert L. Wilson Professor of Sociology at Duke University and one of the authors of “Social Isolation in America: Changes in Core Discussion Networks Over Two Decades.” …

It compared data from 1985 and 2004 and found that the mean number of people with whom Americans can discuss matters important to them dropped by nearly one-third, from 2.94 people in 1985 to 2.08 in 2004.

Researchers also found that the number of people who said they had no one with whom to discuss such matters more than doubled, to nearly 25 percent. The survey found that both family and non-family confidants dropped, with the loss greatest in non-family connections.

I don’t know about you, but I just don’t trust most people, and most people have given me no reason to trust them.


Existential Caprine


You were



There were predators

The lions could be confusing

But you were free

goat painting, Herculaneum

Then came men

Faster, smarter than lions

They killed the wolves

Brought you food

(The bread of slavery, they say, is far sweeter than the bread of freedom.)

And shelter

Children were born, safe from wolves, hunger, or cold

and you grew used to man.

Centuries passed

And it seemed you outnumbered the stars

Perhaps your sons disappeared

But was it worse than wolves?

You could almost forget you were once wild

Could you return to the mountains, even if you wanted to?

And as they lead you away

You ask

Did I ever have a choice?


To explain: The process of domestication is fascinating. Some animals, like wolves, began associating with humans because they could pick up our scraps. Others, like cats, began living in our cities because they liked eating the vermin we attracted. (You might say the mice, too, are domesticated.) These relationships are obviously mutually beneficial (aside from the mice.)

The animals we eat, though, have a different–more existential–story.

Humans increased the number of wild goats and sheep available for them to eat by eliminating competing predators, like wolves and lions. We brought them food in the winter, built them shelters to keep them warm, and led them to the best pastures. As a result, their numbers increased.

But, of course, we eat them.

From the goat’s perspective, is it worth it?

There’s a wonderful metaphor in the Bible, enacted every Passover: matzoh.

If you’ve never had it, matzoh tastes like saltines, only worse. It’s the bread of freedom, hastily thrown on the fire and carried away.

The bread of slavery tastes delicious. The bread of freedom tastes awful.

1 And they took their journey from Elim, and all the congregation of the children of Israel came unto the wilderness of Sin, which is between Elim and Sinai, on the fifteenth day of the second month after their departing out of the land of Egypt. 2 And the whole congregation of the children of Israel murmured against Moses and Aaron in the wilderness: 3 And the children of Israel said unto them, Would to God we had died by the hand of the LORD in the land of Egypt, when we sat by the flesh pots, and when we did eat bread to the full… Exodus 16

Even if the goats didn’t want to be domesticated, hated it and fought against it, did they have any choice? If the domesticated goats have more surviving children than wild ones, then goats will become domesticated. It’s a simple matter of numbers:

Total Fertility Rate by Country: Purple = 7 children per woman; Blue = 1 child per woman

The future belongs to those who show up.

Which future do you choose?


Recent Discoveries in Human Evolution: H. Sapiens 300,000 years old?

Welcome back to our discussion of recent exciting advances in our knowledge of human evolution:

  • Ancient hominins in the US?
  • Homo naledi
  • Homo flores
  • Humans evolved in Europe?
  • In two days, first H Sap was pushed back to 260,000 years,
  • then to 300,000 years!
  • Bell beaker paper

As we’ve been discussing for the past couple of weeks, the exact dividing line between “human” and “non-human” isn’t always hard and fast. The very first Homo species, such as Homo habilis, undoubtedly had more in common with its immediate Australopithecine ancestors than with today’s modern humans, 3 million years later, but that doesn’t mean these dividing lines are meaningless. Homo sapiens and Homo neanderthalensis, while considered different species, interbred and produced fertile offspring (most non-Africans have 3-5% Neanderthal DNA as a result of these pairings;) by contrast, humans and chimps cannot produce fertile offspring, because humans and chimps have a different number of chromosomes. The genetic distance between the two groups is just too far.

Oldowan tool

The grouping of ancient individuals into Homo or not-Homo, Erectus or Habilis, Sapiens or not, is partly based on physical morphology–what they looked like, how they moved–and partly based on culture, such as the ability to make tools or control fire. While australopithecines made some stone tools (and chimps can make tools out of twigs to retrieve tasty termites from nests,) Homo habilis (“handy man”) was the first to master the art and produce large numbers of more sophisticated tools for different purposes, such as this Oldowan chopper.

But we also group species based on moral or political beliefs–scientists generally believe it would be immoral to say that different modern human groups belong to different species, and so the date when Homo ergaster transforms into Homo sapiens is dependent on the date when the most divergent human groups alive today split apart–no one wants to come up with a finding that will get trumpeted in media as “Scientists Prove Pygmies aren’t Human!” (Pygmies already have enough problems, what with their immediate neighbors actually thinking they aren’t human and using their organs for magic rituals.)

(Of course they would still be human even if they were part of an ancient lineage.)

But if an ecologically-minded space alien arrived on earth back in 1490 and was charged with documenting terrestrial species, it might easily decide–based on morphology, culture, and physical distribution–that there were several different Homo “species” which all deserve to be preserved.

But we are not space aliens, and we have the concerns of our own day.

So when a paper was published last year on archaic admixture in Pygmies and the Pygmy/Bushmen/everyone else split, West Hunter noted the authors used a fast–but discredited–estimate of mutation rate to avoid the claim that Pygmies split off 300,000 years ago, 100,000 years before the emergence of Homo sapiens:

There are a couple of recent papers on introgression from some quite divergent archaic population into Pygmies ( this also looks to be the case with Bushmen). Among other things, one of those papers discussed the time of the split between African farmers (Bantu) and Pygmies, as determined from whole-genome analysis and the mutation rate. They preferred to use the once-fashionable rate of 2.5 × 10⁻⁸ per-site per-generation (based on nothing), instead of the new pedigree-based estimate of about 1.2 × 10⁻⁸ (based on sequencing parents and child: new stuff in the kid is mutation). The old fast rate indicates that the split between Neanderthals and modern humans is much more recent than the age of early Neanderthal-looking skeletons, while the new slow rate fits the fossil record – so what’s to like about the fast rate? Thing is, using the slow rate, the split time between Pygmies and Bantu is ~300k years ago – long before any archaeological sign of behavioral modernity (however you define it) and well before the first known fossils of AMH (although that shouldn’t bother anyone, considering the raggedness of the fossil record).

This was a good catch. (Here is the relevant Dienekes article, plus Model-based analyses of whole-genome data reveal a complex evolutionary history involving archaic introgression in Central African Pygmies, and Whole-genome sequence analyses of Western Central African Pygmy hunter-gatherers reveal a complex demographic history and identify candidate genes under positive natural selection.) If the slow mutation rate matches the fossil record better than the fast, why use the fast–except if the fast gives you inconvenient results?
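To see why the choice of mutation rate matters so much, here’s a minimal molecular-clock sketch in Python. The per-site divergence and generation time below are made-up round numbers chosen purely for illustration; only the two mutation rates come from the quote above. The inferred split time scales inversely with the assumed rate, so moving from the slow rate to the fast rate roughly halves the estimate:

```python
# Minimal molecular-clock sketch: inferred split time scales as 1/mu.
# The divergence and generation time are illustrative assumptions; only the
# two mutation rates (2.5e-8 and 1.2e-8 per site per generation) are from the text.

D_PER_SITE     = 2.5e-4   # hypothetical per-site divergence between the two populations
GEN_TIME_YEARS = 29       # assumed average human generation time

for label, mu in [("fast rate (2.5e-8)", 2.5e-8), ("slow rate (1.2e-8)", 1.2e-8)]:
    generations = D_PER_SITE / (2 * mu)   # divergence accrues at 2*mu per generation
    years = generations * GEN_TIME_YEARS
    print(f"{label}: split ~{years / 1000:.0f} thousand years ago")
```

With these toy numbers the slow rate puts the split around 300 thousand years ago and the fast rate around 145 thousand, which is the whole dispute in miniature: same data, different clock, very different history.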

But now we have another finding, based on the Bushmen, which also pushes the Bushmen/everyone else split back further than 200,000 years–from BioRxiv, “Ancient genomes from southern Africa pushes modern human divergence beyond 260,000 years ago“:

Southern Africa is consistently placed as one of the potential regions for the evolution of Homo sapiens. To examine the region’s human prehistory prior to the arrival of migrants from East and West Africa or Eurasia in the last 1,700 years, we generated and analyzed genome sequence data from seven ancient individuals from KwaZulu-Natal, South Africa. Three Stone Age hunter-gatherers date to ~2,000 years ago, and we show that they were related to current-day southern San groups such as the Karretjie People. Four Iron Age farmers (300-500 years old) have genetic signatures similar to present day Bantu-speakers. The genome sequence (13x coverage) of a juvenile boy from Ballito Bay, who lived ~2,000 years ago, demonstrates that southern African Stone Age hunter-gatherers were not impacted by recent admixture; however, we estimate that all modern-day Khoekhoe and San groups have been influenced by 9-22% genetic admixture from East African/Eurasian pastoralist groups arriving >1,000 years ago, including the Ju|’hoansi San, previously thought to have very low levels of admixture. Using traditional and new approaches, we estimate the population divergence time between the Ballito Bay boy and other groups to beyond 260,000 years ago.

260,000 years! Looks like West Hunter was correct, and we should be looking at the earlier Pygmy divergence date, too.

Two days later, a paper from the opposite end of Africa appeared in Nature which–potentially–pushes H sapiens’s emergence to 300,000 years ago, “New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens“:

Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure … In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens [1] or evolved gradually over the last 400 thousand years [2]. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating) [3], this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established.

Comparison of the skulls of a Jebel Irhoud human (left) and a modern human (right) (NHM London)

Hublin–one of the study’s coauthors–notes that between 330,000 and 300,000 years ago, the Sahara was green and animals could range freely across it.

While the Moroccan fossils do look like modern H sapiens, they also still look a lot like pre-sapiens, and the matter is still up for debate. Paleoanthropologist Chris Stringer suggests that we should consider all of our ancestors after the Neanderthals split off to be Homo sapiens, which would make our species 500,000 years old. Others would undoubtedly prefer to use a more recent date, arguing that the physical and cultural differences between 500,000 year old humans and today’s people are too large to consider them one species.

According to the Atlantic:

[The Jebel Irhoud] people had very similar faces to today’s humans, albeit with slightly more prominent brows. But the backs of their heads were very different. Our skulls are rounded globes, but theirs were lower on the top and longer at the back. If you saw them face on, they could pass for a modern human. But if they turned around, you’d be looking at a skull that’s closer to extinct hominids like Homo erectus. “Today, you wouldn’t be able to find anyone with a braincase that shape,” says Gunz.

Their brains, though already as large as ours, must also have been shaped differently. It seems that the size of the human brain had already been finalized 300,000 years ago, but its structure—and perhaps its abilities—were fine-tuned over the subsequent millennia of evolution.

No matter how we split it, these are exciting days in the field!


No, Graecopithecus does not prove humans evolved in Europe

Hello! We’re in the midst of a series of posts on recent exciting news in the field of human evolution:

  • Ancient hominins in the US?
  • Homo naledi
  • Homo flores
  • Humans evolved in Europe?
  • In two days, first H Sap was pushed back to 260,000 years,
  • then to 300,000 years!
  • Bell beaker paper

Today we’re discussing the much-publicized claim that scientists have discovered that humans evolved in Europe. (If you haven’t read last week’s post on Homo naledi and flores, I encourage you to do so first.) The way reporters have framed their headlines about the recent Graecopithecus freybergi findings is itself a tale:

The Telegraph proclaimed, “Europe was the birthplace of mankind, not Africa, scientists find,” Newsweek similarly trumpeted, “First Human Ancestor Came from Europe Not Africa,” and CBS News stated, “Controversial study suggests earliest humans lived in Europe – not Africa.”

The Conversation more prudently inquired, “Did humans evolve in Europe rather than Africa?” and NewScientist and the Washington Post, in a burst of knowing what a “human” is, stated, “Our common ancestor with chimps may be from Europe, not Africa” and “Ape that lived in Europe 7 million years ago could be human ancestor,” respectively.

This all occasioned some very annoying conversations along the lines of “White skin tone couldn’t possibly have evolved within the past 20,000 years because humans evolved in Europe! Don’t you know anything about science?”

Ohkay. Let’s step back a moment and take a look at what Graecopithecus is and what it isn’t.

This is Graecopithecus:

I think there is a second jawbone, but that’s basically it–and that’s not six teeth, that’s three teeth, shown from two different perspectives. There’s no skull, no shoulder blades, no pelvis, no legs.


By contrast, here are Lucy, the famous Australopithecus from Ethiopia, and a sample of the over 1,500 bones and pieces of Homo naledi recently recovered from a cave in South Africa.

Now, given what little scientists had to work with, the fact that they managed to figure out anything about Graecopithecus is quite impressive. The study, reasonably titled “Potential hominin affinities of Graecopithecus from the Late Miocene of Europe,” by Jochen Fuss, Nikolai Spassov, David R. Begun, and Madelaine Böhme, used μCT scans and 3D reconstructions of the jawbones and teeth to compare Graecopithecus’s teeth to those of other apes. They decided the teeth were different enough to distinguish Graecopithecus from the nearby but older Ouranopithecus, while looking more like hominin teeth:

G. freybergi uniquely shares p4 partial root fusion and a possible canine root reduction with this tribe and therefore, provides intriguing evidence of what could be the oldest known hominin.

My hat’s off to the authors, but not to all of the reporters who dressed up “teeth look kind of like hominin teeth” as “Humans evolved in Europe!”

First of all, you cannot make that kind of jump based on two jawbones and a handful of teeth. Many of the hominin species we have recovered–such as Homo naledi and Homo floresiensis, as you know if you already read the previous post–possessed a mosaic of “ape-like” and “human-like” traits, i.e.:

The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.[2]

Nebraska Man teeth compared to chimps, Homo erectus, and modern humans

If we only had six Homo naledi bones instead of 1,500 of them, we might be looking only at the part that looks like an Australopithecus instead of the parts that look like H. erectus or totally novel. You simply cannot make that kind of claim off a couple of jawbones. You’re far too likely to be wrong, and then not only will you end up with egg on your face, but you’ll also be giving more fuel to folks who like to proclaim that “Nebraska Man turned out to be a pig!”:

In February 1922, Harold Cook wrote to Dr. Henry Osborn to inform him of the tooth that he had had in his possession for some time. The tooth had been found years prior in the Upper Snake Creek beds of Nebraska along with other fossils typical of North America. … Osborn, along with Dr. William D. Matthew soon came to the conclusion that the tooth had belonged to an anthropoid ape. They then passed the tooth along to William K. Gregory and Dr. Milo Hellman who agreed that the tooth belonged to an anthropoid ape more closely related to humans than to other apes. Only a few months later, an article was published in Science announcing the discovery of a manlike ape in North America.[1] An illustration of H. haroldcookii was done by artist Amédée Forestier, who modeled the drawing on the proportions of “Pithecanthropus” (now Homo erectus), the “Java ape-man,” for the Illustrated London News. …

Examinations of the specimen continued, and the original describers continued to draw comparisons between Hesperopithecus and apes. Further field work on the site in the summers of 1925 and 1926 uncovered other parts of the skeleton. These discoveries revealed that the tooth was incorrectly identified. According to these discovered pieces, the tooth belonged neither to a man nor an ape, but to a fossil of an extinct species of peccary called Prosthennops serus.

That basically sums up everything I learned about human evolution in high school.


Scientists define “humans” as members of the genus Homo, which emerged around 3 million years ago. These are the guys with funny names like Homo habilis, Homo neanderthalensis, and the embarrassingly named Homo erectus. The genus also includes ourselves, Homo sapiens, who emerged around 200-300,000 years ago.

Homo habilis descended from an Australopithecus, perhaps Lucy herself. Australopithecines are not in the Homo genus; they are not “human,” though they are more like us than modern chimps and bonobos are. They evolved around 4 million years ago.

The Australopithecines evolved, in turn, from even older apes, such as–maybe–Ardipithecus (4-6 million years ago) or Sahelanthropus tchadensis.

Regardless, humans didn’t evolve 7 million years ago. Sahelanthropus and even Lucy do not look like anyone you would call “human.” Humans have only been around for about 3 million years, and our own specific species is only about 300,000 years old. Even if Graecopithecus turns out to be the missing link–the true ancestor of both modern chimps and modern humans–that still does not change where humans evolved, because Graecopithecus narrowly missed being a human by 4 million years.

If you want to challenge the Out of Africa narrative, I think you’d do far better arguing for a multi-regional model of human evolution–one that includes back-migration of H. erectus into Africa and interbreeding with hominins there as spurring the emergence of H. sapiens–than arguing about a 7-million-year-old jawbone. (I just made that up, by the way. It has no basis in anything I have read. But it at least has the right characters, in the right time frame, in a reasonable situation.)

Sorry this was a bit of a rant; I am just rather passionate about the subject. Next time we’ll examine very exciting news about Bushmen and Pygmy DNA!



Musical Mystery

Singer Tom Jones, famous recipient of ladies’ panties

There are three categories of superstars who seem to attract excessive female interest. The first is actors, who of course are selected for being abnormally attractive and are put into romantic and exciting narratives that our brains subconsciously interpret as real. The second is sports stars and other athletes, whose ritualized combat and displays of strength obviously indicate their genetic “fitness” for siring and providing for children.

The third and strangest category is professional musicians, especially rock stars.

I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?

There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display musical tastes on dating profiles in order to meet musically like-minded people.

Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as, “Anything but rap,” or “Anything but country”?

Mick Jagger and Chuck Berry

At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids, and the aforementioned rap and country fans.

Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along better with other slow jazz fans than with fans of classical Indian music? Or is this all confounded by different ethnic groups identifying with specific musical styles?

Obviously country correlates with Amerikaner ancestry, and rap with African American ancestry. I’m not sure which group makes up Die Antwoord’s biggest fans. Heavy Metal is popular in Finno-Scandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.

While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.

Here are two excerpts on the subject of music and dance in African societies:

source: A Voyage to Senegal: The Isle of Gorée, and the River Gambia by Michel Adanson, Correspondent of the Royal Academy of Sciences


source: Africana: The Encyclopedia of the African and African American Experience, Vol. 1 (Aardvark–Catholic)

Elvis’s pelvis, considered too sexy for TV

Both of these h/t HBD Chick and my apologies in advance if I got the sources reversed.

One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.

If traditional African dancing involves more variety in rhythmic expression than traditional European dancing, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people chose their own mates, they seem to display a preference for athletic or musically talented (“sexy”) mates; where parents chose mates, they seem to prefer hard-working, devout, “good providers.”

Natasha Rostova and Andrei Bolkonsky, from War and Peace by Tolstoy

Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.

It is undoubtedly true that music stirs the soul and speaks to the heart, but why?



Why is our Society so Obsessed with Salads?

It’s been a rough day. So I’m going to complain about something totally mundane: salads.

I was recently privy to a conversation between two older women on why it is so hard to stay thin in the South: lack of good salads. Apparently when you go to a southern restaurant, they serve a big piece of meat (often deep-fried steak), a lump of mashed potatoes and gravy, and a finger-bowl with 5 pieces of iceberg lettuce, an orange tomato, and a slathering of dressing.

Sounds good to me.

Now, if you like salads, that’s fine. You’re still welcome here. Personally, I just don’t see the point. The darn things don’t have any calories!

From an evolutionary perspective, obviously food provides two things: calories and nutrients. There may be some foods that are mostly calories but few nutrients (e.g., honey) and some foods that are all nutrients and no calories (salt isn’t exactly a food, but it otherwise fits the bill.)

Food doesn’t seem like it should be that complicated–surely we’ve evolved to eat effectively by now. So any difficulties we have (besides just getting the food) are likely us over-thinking the matter. There’s no problem getting people to eat high-calorie foods, because they taste good. Nor is it hard to get people to eat salt–it, too, tastes good.

But people seem to have this ambivalent relationship with salads. What’s so important about eating a bunch of leaves with no calories and a vaguely unpleasant flavor? Can’t I just eat a nice potato? Or some corn? Or asparagus?

Don’t get me wrong. I don’t hate vegetables. Just everything that goes in a salad. Heck, I’ll even eat most salad fixins if they’re cooked. I won’t turn down fried green tomatoes, you know.

While there’s nothing wrong with enjoying a bowl of lettuce if that’s your thing, I think our society has gone down a fundamentally wrong collective path when it comes to nutrition wisdom. The idea here is that your hunger drive is an insatiable beast that will force you to consume as much food as possible, making you overweight and giving you a heart attack, and so the only way to save yourself is to trick the beast by filling your stomach with fluffy, zero-calorie plants until there isn’t any more room.

This seems to me like the direct opposite of what you should be doing. See, I assume your body isn’t an idiot, and can figure out whether you’ve just eaten something full of calories, and so should go sleep for a bit, or if you just ate some leaves and should keep looking for food.

I recently tried increasing the amount of butter I eat each day, and the result was that I felt extremely full and didn’t want to eat dinner. Butter is a great way to almost arbitrarily increase the number of calories per volume of food.

If you’re wondering about my weight, well, let’s just say that despite the butter, never going on a diet, and abhorring salads, I’m still not overweight–but this is largely genetic. (I should note though that I don’t eat many sweets at all.)

Obviously I am not a nutritionist, a dietician, or a doctor. I’m not a good source for health advice. But it seems to me that increasing or decreasing the number of sweets you eat per day probably has a bigger impact on your overall weight than adding or subtracting a salad.

But maybe I’m missing something.


On Socialization

As a parent, I spend much of my day attempting to “socialize” my kids–“Don’t hit your brother! Stop jumping on the couch! For the umpteenth time, ‘yeah, right!’ is sarcasm.”

There are a lot of things that don’t come naturally to little kids. Many of them struggle to understand that these wiggly lines on paper can turn into words or that tiny, invisible things on their hands can make them sick.

“Yes, you have to brush your teeth and go to bed, no, I’m not explaining why again.”

And they definitely don’t understand why I won’t let them have ice cream for dinner.

“Don’t ride your bike down the hill and into the street like that! You could get hit by a car and DIE!”

Despite all of the effort I have devoted to transforming this wiggly bunch of feral children into respectable adults (someday, I hope,) I have never found myself concerned with the task of teaching them about gender. As a practical matter, whether the children behave like “girls” or “boys” makes little difference to the running of the household, because we have both–by contrast, whether the children put their dishes away after meals and do their homework without me having to threaten or cajole them makes a big difference.

Honestly, I can’t convince them not to pick their noses in public or that broccoli is tasty, but I’m supposed to somehow subtly convince them that they’ve got to play Minecraft because they’re boys (even while explicitly saying, “Hey, you’ve been playing that for two hours, go ride your bike,”) or that they’re supposed to be walking doormats because they’re girls (even while saying, “Next time he pushes you, push him back!”).

And yet the boys still act like boys, the girls like girls–statistically speaking.

“Ah,” I hear some of you saying, “But you are just one parent! How do you know there aren’t legions of other parents who are out there doing everything they can to ensure that their sons succeed and daughters fail in life?”

This is, if you will excuse me, a very strange objection. What parent desires failure from their children?