Book Club: Chinua Achebe’s Things Fall Apart, pt 2


Chinua Achebe, author and Man Booker International Prize winner

Welcome back to our discussion of Chinua Achebe’s Things Fall Apart. Today I thought it would be interesting to look at the history of the Igbo (aka Ibo) people, past and present.

The modern Igbo are one of the world’s larger ethnic groups, numbering around 34 million people, most of whom live in southeastern/central Nigeria. About 2 million Igbo live abroad, most of them in Britain (which, as Achebe recounts, colonized Nigeria.) The Igbo diaspora is well-known for its intelligence, with Igbo students outscoring even Chinese students in the UK:

Although the Chinese and Indians are still very conspicuously above even the best African nationalities, their superiority disappears when the Nigerian and other groups are broken down even further according to their different tribal ethnicities. Groups like the famous Igbo tribe, which has contributed much genetically to the African American blacks, are well known to be high academic achievers within Nigeria. In fact, their performance seems to be at least as high as the “model minority” Chinese and Indians in the UK, as seen when some recent African immigrants are divided into languages spoken at home (which also indicates that these are not multigenerational descendants but children of recent immigrants).

Africans speaking Luganda and Krio did better than the Chinese students in 2011. The Igbo were even more impressive given their much bigger numbers (and their consistently high performance over the years, gaining a 100 percent pass rate in 2009!). The superior Igbo achievement on GCSEs is not new and has been noted in studies that came before the recent media discovery of African performance. A 2007 report on “case study” model schools in Lambeth also included a rare disclosure of specified Igbo performance (recorded as Ibo in the table below) and it confirms that Igbos have been performing exceptionally well for a long time (5 + A*-C GCSEs); in fact, it is difficult to find a time when they ever performed below British whites.

Of course, Igbo immigrants to the UK are probably smarter than folks who didn’t figure out how to immigrate to the UK, but Peter Frost argues that even the ones who stayed at home are also pretty smart, via a collection of quotes:

All over Nigeria, Ibos filled urban jobs at every level far out of proportion to their numbers, as laborers and domestic servants, as bureaucrats, corporate managers, and technicians. Two-thirds of the senior jobs in the Nigerian Railway Corporation were held by Ibos. Three-quarters of Nigeria’s diplomats came from the Eastern Region. So did almost half of the 4,500 students graduating from Nigerian universities in 1966. The Ibos became known as the “Jews of Africa,” despised—and envied—for their achievements and acquisitiveness. (Baker, 1980)

The Ibos are the wandering Jews of West Africa — gifted, aggressive, Westernized; at best envied and resented, but mostly despised by the mass of their neighbors in the Federation. (Kissinger, 1969)

So what makes the Igbo so smart? Frost attributes their high IQ to the selective effects of an economy based on trade in which the Igbo were middlemen to the other peoples (mostly Yoruba and Fulani) around them, along with an excellent metalworking tradition:

Archaeological sites in the Niger Delta show that advanced economic development began much earlier there than elsewhere in West Africa. This is seen in early use of metallurgy. At one metallurgical complex, dated to 765 BC, iron ore was smelted in furnaces measuring a meter wide. The molten slag was drained through conduits to pits, where it formed blocks weighing up to 43-47 kg. …

This production seems to have been in excess of local needs and therefore driven by trade with other peoples …

This metallurgy is unusual not only in its early date for West Africa but also in its subsequent development, which reached a high level of sophistication despite a lack of borrowing from metallurgical traditions in the Middle East and Europe.

Here is a fun little video on Igbo bronzes (I recommend watching on double speed and pausing occasionally to appreciate the work quality):

So between the bronze, the river, and long-distance trade, the Igbo became the local market-dominant minority–and as with most MDMs, with the arrival of democracy came genocide.

Nigeria achieved independence in 1960 and became a Republic in 1963. Periodic military coups and conflict kept disturbing the peace:

From June through October 1966, pogroms in the North killed an estimated 80,000 to 100,000 Igbo, half of them children, and caused more than a million to two million to flee to the Eastern Region.[76] 29 September 1966, was considered the worst day; because of massacres, it was called ‘Black Thursday’.[77][78]

Ethnomusicologist Charles Keil, who was visiting Nigeria in 1966, recounted:

The pogroms I witnessed in Makurdi, Nigeria (late Sept. 1966) were foreshadowed by months of intensive anti-Ibo and anti-Eastern conversations among Tiv, Idoma, Hausa and other Northerners resident in Makurdi, and, fitting a pattern replicated in city after city, the massacres were led by the Nigerian army. Before, during and after the slaughter, Col. Gowon could be heard over the radio issuing ‘guarantees of safety’ to all Easterners, all citizens of Nigeria, but the intent of the soldiers, the only power that counts in Nigeria now or then, was painfully clear. After counting the disemboweled bodies along the Makurdi road I was escorted back to the city by soldiers who apologised for the stench and explained politely that they were doing me and the world a great favor by eliminating Igbos.

… until the Igbos decided they’d had enough and declared themselves an independent country, Biafra, triggering a civil war. The Nigerian and British governments blockaded Biafra, resulting in mass starvation that left nearly 2 million dead.

Why the British government thought it was important to use money to starve children, I don’t know.

(Hint: the answer is oil.)

During the war, Britain covertly supplied Nigeria with weapons and military intelligence and may have also helped it to hire mercenaries.[102] After the decision was made to back Nigeria, the BBC oriented its reporting to favour this side.[103] Supplies provided to the Federal Military Government included two vessels and 60 vehicles.[104]

Go BBC!

(Richard Nixon, always the voice of morality, was against the blockade.)

Chinua Achebe published Things Fall Apart in 1958, two years before independence–so he was awfully prescient.

As news got out about the genocide, people began demanding that food be airlifted into Biafra, but there were some problems getting the food off the ground:

Just “enemy propaganda” of little girls starving to death

Secretary General of the United Nations, U Thant, refused to support the airlift.[5] The position of the Organization of African Unity was to not intervene in conflicts its members deemed internal and to support the nation-state boundaries instituted during the colonial era.[6] The ruling Labour Party of the United Kingdom, which together with the USSR was supplying arms to the Nigerian military,[7] dismissed reports of famine as “enemy propaganda”.[8] Mark Curtis writes that the UK also reportedly provided military assistance on the ‘neutralisation of the rebel airstrips’, with the understanding that their destruction would put them out of use for daylight humanitarian relief flights.[9]

Soon Joint Church Aid, a coalition of Protestant, Catholic, Jewish, and other non-governmental organizations, chartered airplanes and began running the Nigerian blockade (three relief workers were killed when their plane was shot down.) Among non-combat airlifts, the Biafra Airlift is second in scope only to the Berlin Airlift.

Since the end of the war, the state of the Igbo people has steadily improved (the presence of oil in the area is finally benefiting them.)

Modern Nigeria has about 200 million people with a fertility rate around 5.5 children per woman (I don’t have data specifically about the Igbo) and a per capita GDP around $2,000, which is high for the area.

It’s getting late, so I’d like to end with some modern Igbo music, a reminder that the people we read about in anthropology books (or literature) never stay in anthropology books:

Is there any reliable way to distinguish between low IQ and insanity? 

I see claims like this surprisingly often:

Of course there are smart people who are insane, and dumb people who are completely rational. But if we define intelligence as having something to do with accurately understanding and interpreting the information we constantly receive from the world, necessary to make accurate predictions about the future and how one’s interactions with others will go, there’s a clear correlation between accurately understanding the world and being sane.

In other words, a sufficiently dumb person, even a very sane one, will be unable to distinguish between accurate and inaccurate depictions of reality and so can easily espouse beliefs that sound, to others, completely insane.

Is there any way to distinguish between a dumb person who believes wrong things by accident and a smart person who believes wrong things because they are insane?

Digression: I have a friend who was homeless for many years. Eventually he was diagnosed as mentally ill and given a disability check.

“Why?” he asked, but received no answer. He struggled (and failed) for years to prove that he was not disabled.

Eventually he started hearing voices, was diagnosed with schizophrenia, and put on medication. Today he is not homeless, due at least in part to the positive effects of anti-psychotics.

The Last Psychiatrist has an interesting post (deleted from his blog, but re-posted elsewhere,) on how SSI is determined:

Say you’re poor and have never worked. You apply for Welfare/cash payments and state Medicaid. You are obligated to try and find work or be enrolled in a jobs program in order to receive these benefits. But who needs that? Have a doctor fill out a form saying you are Temporarily Incapacitated due to Medical Illness. Yes, just like 3rd grade. The doc will note the diagnosis, however, it doesn’t matter what your diagnosis is, it only matters that a doctor says you are Temporarily Incapacitated. So cancer and depression both get you the same benefits.

Nor does it matter if he medicates you, or even believes you, so long as he signs the form and writes “depression.”(1) The doc can give you as much time off as he wants (6 months is typical) and you can return, repeatedly, to get another filled out. You can be on state medicaid and receive cash payments for up to 5 years. So as long as you show up to your psych appointments, you can receive benefits with no work obligation.

“That’s not how it works for me”

you might say, which brings us to the whole point: it’s not for you. It is for the entire class of people we label as poor, about whom comic Greg Giraldo joked: “it’s easy to forget there’s so much poverty in the United States, because the poor people look just like black people.” Include inner city whites and hispanics, and this is how the government fights the War On Poverty.

In the inner cities, the system is completely automated. Poor person rolls in to the clinic, fills out the paperwork (doc signs a stack of them at the end of the day), he sees a therapist, a doctor, +/- medications, and gets his benefits.

There’s no accountability, at all. I have never once been asked by the government whether the person deserved the money, the basis for my diagnosis– they don’t audit the charts, all that exists is my sig on a two page form. The system just is.

see if you can find the one poor person hidden in this picture (Last Psychiatrist)

Enter SSI, Supplemental Security Income. You can earn lifetime SSI benefits (about $600/mo + medical insurance) if “you” can “show” you are “Permanently Disabled” due to a “medical illness.”
“You” = your doc who fills out a packet with specific questions; and maybe a lawyer who processes the massive amounts of other paperwork, and argues your case, and charges about 20% of a year’s award.

“Show” has a very specific legal definition: whatever the judge feels like that day. I have been involved in thousands of these SSI cases, and to describe the system as arbitrary is to describe Blake Lively as “ordinary.”

“Permanently disabled” means the illness prevents you from ever working. “But what happens when you get cured?” What is this, the future? You can’t cure bipolar.

“Medical illness” means anything. The diagnosis doesn’t matter, only that “you” show how the diagnosis makes it impossible for you to work. Some diagnoses are easier than others, but none are impossible. “Unable to work” has specific meaning, and specific questions are asked: ability to concentrate, ability to complete a workweek, work around others, take criticism from supervisors, remember and execute simple/moderately difficult/complex requests and tasks, etc.

Fortunately, your chances of being awarded SSI are 100%…

It’s a good post. You should read the whole thing.

TLP’s point is not that the poor are uniformly mentally ill, but that our country is using the disability system as a means of routing money to poor people in order to pacify them (and maybe make their lives better.)

I’ve been playing a bit of sleight of hand here, swapping “poor” and “dumb.” But they are categories that heavily overlap, given that dumb people have trouble getting jobs that pay well. Despite TLP’s point, many of the extremely poor are, by the standards of the middle class and above, mentally disabled. We know because they can’t keep a job and pay their bills on time.

“Disabled” is a harsh word to some ears. Who’s to say they aren’t equally able, just in different ways?

Living under a bridge isn’t being differently-abled. It just sucks.

Normativity bias happens when you assume that everyone else is just like you. Middle and upper-middle class people tend to assume that everyone else thinks like they do, and the exceptions, like guys who think the CIA is trying to communicate with them via the fillings in their teeth, are few and far between.

As for the vast legions of America’s unfortunates, they assume that these folks are basically just like themselves. If they aren’t very bright, this only means they do their mental calculations a little slower–nothing a little hard work, grit, mindfulness, and dedication can’t make up for. The fact that anyone remains poor, then, has to be the fault of either personal failure (immorality) or outside forces like racism keeping people down.

These same people often express the notion that academia or Mensa are crawling with high-IQ weirdos who can barely tie their shoes and are incapable of socializing with normal humans, to which I always respond that furries exist. 

These people need to get out more if they think a guy successfully holding down a job that took 25 years of work in the same field to obtain and that requires daily interaction with peers and students is a “weirdo.” Maybe he wears more interesting t-shirts than a middle manager at BigCorp, but you should see what the Black Hebrew Israelites wear.

I strongly suspect that what we would essentially call “mental illness” among the middle and upper classes is far more common than people realize among the lower classes.

As I’ve mentioned before, there are multiple kinds of intellectual retardation. Some people suffer physical injuries (like shaken baby syndrome or encephalitis), some have genetic defects like Down’s Syndrome, and some are simply dull people born to dull parents. Intelligence is partly genetic, so just as some people are gifted with lucky smart genes, some people are visited by the stupid fairy, who only leaves dumb ones. Life isn’t fair.

Different kinds of retardation manifest differently, with different levels of overall impairment in life skills. There are whole communities where the average person tests as mentally retarded, yet people in these communities go on providing for themselves, building homes, raising their children, etc. They do not do so in the same ways as we would–and there is an eternal chicken-and-egg debate about whether the environment they are raised in causes their scores, or their scores cause their environment–but nevertheless, they do.

All of us humans are descended from people who were significantly less intelligent than ourselves. Australopithecines were little smarter than chimps, after all. The smartest adult pygmy chimps (bonobos), like Kanzi, know only about 3,000 words, which is about the same as a 3- or 4-year-old human. (We marvel that chimps can do things a kindergartener finds trivial, like turn on the TV.) Over the past few million years, our ancestors got a lot smarter.

How do chimps think about the world? We have no particular reason to assume that they think about it in ways that substantially resemble our own. While they can make tools and immediately use them, they cannot plan for tomorrow (dolphins probably beat them at planning.) They do not make sentences of more than a few words, much less express complex ideas.

Different humans (and groups of humans) also think about the world in very different ways from each other–which is horrifyingly obvious if you’ve spent any time talking to criminals. (The same people who think nerds are weird and bad at socializing ignore the existence of criminals, despite strategically moving to neighborhoods with fewer of them.)

Even non-criminal communities have all sorts of strange practices, including cannibalism, human sacrifice, wife burning, genital mutilation, coprophagy, etc. Anthropologists (and economists) have devoted a lot of effort to trying to understand and explain these practices as logical within their particular contexts–but a different explanation is possible: that different people sometimes think in very different ways.

For example, some people think there used to be Twa Pygmies in Ireland, before that nefarious St. Patrick got there and drove out all of the snakes. (Note: Ireland didn’t have snakes when Patrick arrived.)

(My apologies for this being a bit of a ramble, but I’m hoping for feedback from other people on what they’ve observed.)

How do you Raise a Genius?


Recommended, of course.

Special Announcement: I have launched a new blog, “Unpaused Books,” for my Homeschooling Corner posts and reviews of children’s literature. (The title is a pun.) I try to keep the posts entertaining, in my usual style.

Back to genius:

“My kid is a genius.”

It feels rather like bragging, doesn’t it? So distasteful. No one likes a braggart. Ultimately, though, someone has to be a genius–or brilliant, gifted, talented–it’s a statistical inevitability.

So let’s compromise. Your kid’s the genius; I’m just a very proud parent with a blog.

So how do you raise a genius? Can you make a kid a genius?

Unfortunately, kids don’t come with instructions. As far as anyone can tell, there’s no reliable way to transform an average person into a genius. The much-ballyhooed “growth mindset” might be useful for getting a kid to concentrate for a few minutes, but it has no long-term effects:

A growing number of recent studies are casting doubt on the efficacy of mindset interventions at scale. A large-scale study of 36 schools in the UK, in which either pupils or teachers were given training, found that the impact on pupils directly receiving the intervention did not have statistical significance, and that the pupils whose teachers were trained made no gains at all. Another study featuring a large sample of university applicants in the Czech Republic used a scholastic aptitude test to explore the relationship between mindset and achievement. They found a slightly negative correlation, with researchers claiming that ‘the results show that the strength of the association between academic achievement and mindset might be weaker than previously thought’. A 2012 review for the Joseph Rowntree Foundation in the UK of attitudes to education and participation found ‘no clear evidence of association or sequence between pupils’ attitudes in general and educational outcomes, although there were several studies attempting to provide explanations for the link (if it exists)’. In 2018, two meta-analyses in the US found that claims for the growth mindset might have been overstated, and that there was ‘little to no effect of mindset interventions on academic achievement for typical students’.

Of course, there are many ways to turn a genius into a much less intelligent person–such as dropping them on their head.

IQ score distribution chart for a sample of 905 children tested on the 1916 Stanford–Binet Test, from Terman’s The Measurement of Intelligence

While there is no agreed-upon exact cut-off for genius, it is generally agreed to correlate more or less with the right side of the IQ bell-curve–though exceptions exist. Researchers have studied precocious and gifted children and found that, yes, they tend to turn out to be talented, high-achieving adults:

Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded. …

Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average.[25] A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group. …

Well over half of men and women in Terman’s study finished college, compared to 8% of the general population at the time.[31] Some of Terman’s subjects reached great prominence in their fields. Among them were head I Love Lucy writer Jess Oppenheimer,[32] American Psychological Association president and educational psychologist Lee Cronbach,[33] Ancel Keys,[34] and Robert Sears himself.[32] Over fifty men became college and university faculty members.[35] However, the majority of study participants’ lives were more mundane.

The only really useful parenting advice IQ researchers have come up with so far is to make sure your son or daughter has appropriately challenging schoolwork.


The SMPY data supported the idea of accelerating fast learners by allowing them to skip school grades. In a comparison of children who bypassed a grade with a control group of similarly smart children who didn’t, the grade-skippers were 60% more likely to earn doctorates or patents and more than twice as likely to get a PhD in a STEM field.[6] …

Skipping grades is not the only option. SMPY researchers say that even modest interventions — for example, access to challenging material such as college-level Advanced Placement courses — have a demonstrable effect.

This advice holds true whether one’s children are “geniuses” or not. All children benefit from activities matched to their abilities, high or low; no one benefits from being bored out of their gourd all day or forced into activities that are too difficult to master. It also applies whether a child’s particular abilities lie in schoolwork or not–some children are amazingly talented at art, sports, or other non-academic skills.

Homeschooling, thankfully, allows you to tailor your child’s education to exactly their needs. This is especially useful for kids who are advanced in one or two academic areas, but not all of them, or who have the understanding necessary for advanced academics, but not the age-related maturity to sit through advanced classes.

That all said, gifted children are still children, and all children need time to play, relax, and have fun. They’re smart–not robots.

The Female Problem


Lise Meitner and Otto Hahn in their laboratory, 1912

As Pumpkin Person reports, 96% of people with math IQs over 154 are male (at least in the early 1980s.) Quoting from Feingold, A. (1988). Cognitive gender differences are disappearing. American Psychologist, 43(2), 95-103:

When the examinees from the two test administrations were combined, 96% of 99 scores of 800 (the highest possible scaled score), 90% of 433 scores in the 780-790 range, 81% of 1479 scores between 750 and 770, and 56% of 3,768 scores of 600 were earned by boys.

The linked article notes that this was an improvement over the previous gender gap in high-end math scores. (This improvement may itself be an illusion, due to the immigration of smarter Asians rather than any narrowing of the gap among locals.)

I don’t know what the slant is among folks with 800s on the verbal sub-test, though it is probably smaller–women are far better represented among published authors and journalists than among top mathematicians. (Language is a much older human skill than math, and we seem to have a correspondingly easier time with it.) ETA: I found some data. Verbal is split nearly 50/50 across the board; the short-lived essay had a female bias. Since the 90s, the male:female ratio for math scores over 700 has improved from 13:1 to 4:1; there’s more randomness in the data for 800s, but the ratio is consistently more male-dominated.

High SAT (or any other sort of) scores are isolating. A person with a combined score between 950 and 1150 (on recent tests) falls comfortably into the middle of the range; most people have scores near them. A person with a score above 1350 is in the 90th percentile–that is, 90% of people have scores lower than theirs.

People with scores that round up to 1600 are above the 99th percentile. Over 99% of people have lower scores than they do.

And if on top of that you are a female with a math score above 750, you’re now a minority within a minority–75% or more of the tiny sliver of people at your level are likely to be male.
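To make the arithmetic concrete, here is a minimal sketch in Python, assuming round-number norms I picked for illustration (composite mean 1050, SD 210; math-section mean 530, SD 110–stand-ins, not official College Board figures):

```python
# Percentile arithmetic under an assumed normal model of SAT scores.
# The means and SDs below are illustrative round numbers, not official norms.
from scipy.stats import norm

def percentile(score, mean, sd):
    """Fraction of test-takers scoring below `score` under the normal model."""
    return norm.cdf(score, loc=mean, scale=sd)

for score in (950, 1150, 1350, 1590):
    print(f"composite {score}: ~{percentile(score, 1050, 210):.0%} score lower")

# "Minority within a minority": rarity of a 750+ math score, times an
# assumed 25% female share at that level (the 75%-male figure above).
p_high_math = 1 - percentile(750, 530, 110)
print(f"female with 750+ math: ~{0.25 * p_high_math:.2%} of all test-takers")
```

Under these assumptions, a 750+ math score puts you roughly in the top 2%, and a female with that score is around half a percent of all test-takers.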

Obviously the exact details change over time–the SAT is periodically re-normed and revised–and of course no one makes friends by pulling out their SAT scores and nixing anyone with worse results.

But the general point holds true, regardless of our adjustments, because people bond with folks who think similarly to themselves, have similar interests, or are classmates/coworkers–and if you are a female with high math abilities, you know well that your environment is heavily male.

This is not so bad if you are at a point in your life when you are looking for someone to date and want to be around lots of men (in fact, it can be quite pleasant.) It becomes a problem when you are past that point, and looking for fellow women to converse with. Married women with children, for example, do not typically associate in groups that are 90% male–nor should they, for good reasons I can explain in depth if you want me to.

A few months ago, a young woman named Kathleen Rebecca Forth committed suicide. I didn’t know Forth, but she was a nerd, and nerds are my tribe.

She was an effective altruist who specialized in understanding people through the application of rationality techniques. She was in the process of becoming a data scientist so that she could earn the money she needed to dedicate her life to charity.

I cannot judge the objective truth of Forth’s suicide letter, because I don’t know her nor any of the people in her particular communities. I have very little experience with life as a single person, having had the good luck to marry young. Nevertheless, Forth is dead.

At the risk of oversimplifying the complex motivations for Forth’s death, she was desperately alone and felt like she had no one to protect her. She wanted friends, but was instead surrounded by men who wanted to mate with her (with or without her consent.) Normal people can solve this problem by simply hanging out with more women. This is much harder for nerds:

Rationality and effective altruism are the loves of my life. They are who I am.

I also love programming. Programming is part of who I am.

I could leave rationality, effective altruism and programming to escape the male-dominated environments that increase my sexual violence risk so much. The trouble is, I wouldn’t be myself. I would have to act like someone else all day.

Imagine leaving everything you’re interested in, and all the social groups where people have something in common with you. You’d be socially isolated. You’d be constantly pretending to enjoy work you don’t like, to enjoy activities you’re not interested in, to bond with people who don’t understand you, trying to be close to people you don’t relate to… What kind of life is that? …

Before I found this place, my life was utterly unengaging. No one was interested in talking about the same things. I was actually trying to talk about rationality and effective altruism for years before I found this place, and was referred into it because of that!

My life was tedious and very lonely. I never want to go back to that again. Being outside this network felt like being dead inside my own skin.

Why Forth could not effectively change the way she interacted with men in order to decrease the sexual interest she received from them, I do not know–it is perhaps unknowable–but I think her life would not have ended had she been married.

A couple of years ago, I met someone who initiated a form of attraction I’d never experienced before. I was upset because of a sex offender and wanted to be protected. For months, I desperately wanted this person to protect me. My mind screamed for it every day. My survival instincts told me I needed to be in their territory. This went on for months. I fantasized about throwing myself at them, and even obeying them, because they protected me in the fantasy.

That is very strange for me because I had never felt that way about anyone. Obedience? How? That seemed so senseless.

Look, no one is smart in all ways at once. We all have our blind spots. Forth’s blind spot was this thing called “marriage.” It is perhaps also a blind spot for most of the people around her–especially this one. She should not be condemned for not being perfect, any more than the rest of us.

But we can still conclude that she was desperately lonely for normal things that normal people seek–friendship, love, marriage–and her difficulties hailed in part from the fact that her environment was 90% male. She had no group of like-minded females to bond with and seek advice and feedback from.

Forth’s death prompted me to create The Female Side, an open thread for any female readers of this blog, along with a Slack-based discussion group. (The invite is in the comments over on the Female Side.) You don’t have to be alone. (You don’t even have to be good at math.) We are rare, but we are out here.

(Note: anyone can feel free to treat any thread as an Open Thread, and some folks prefer to post over on the About page.)

Given all of this, why don’t I embrace efforts to get more women into STEM? Why do I find these efforts repulsive, and accept the heavily male-dominated landscape? Wouldn’t it be in my self-interest to attract more women to STEM and convince people, generally, that women are talented at such endeavors?

I would love it if more women were genuinely interested in STEM. I am also grateful to pioneers like Marie Curie and Lise Meitner, whose brilliance and dedication forced open the doors of academies that had formerly been entirely closed to women.

The difficulty is that genuine interest in STEM is rare, and even rarer in women. The over-representation of men at both the high and low ends of mathematical abilities is most likely due to biological causes that even a perfect society that removes all gender-based discrimination and biases cannot eliminate.
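For intuition on how tails get so lopsided, here is a toy calculation (numbers made up for illustration, not fitted to any dataset): give both sexes the same mean, let male scores have 10% more spread, and look four standard deviations out:

\[
\frac{P(\text{male above } 4\sigma)}{P(\text{female above } 4\sigma)} = \frac{1-\Phi(4/1.1)}{1-\Phi(4)} \approx \frac{1.4\times 10^{-4}}{3.2\times 10^{-5}} \approx 4:1
\]

Equal numbers near the average, yet roughly four males per female in the far right tail–and, by symmetry, the same male excess in the far left tail.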

It does not benefit me one bit if STEM gets flooded with women who are not nerds. That is just normies invading and taking over my territory. It’s middle school all over again.

If your idea of “getting girls interested in STEM” includes makeup kits and spa masks, I posit that you have no idea what you’re talking about, you’re appropriating my culture, and you can fuck off.

Please take a moment to appreciate just how terrible this “Project Mc2” “Lip Balm Lab” is. I am not sure I have words sufficient to describe how much I hate this thing and its entire line, but let me try to summarize:

There’s nothing inherently wrong with lip balm. The invention of makeup that isn’t full of lead and toxic chemicals was a real boon to women. There are, in fact, scientists at work at makeup companies, devoted to inventing new shades of eye shadow, quicker-drying nail polish, less toxic lipstick, etc.

And… wearing makeup is incredibly normative for women. Little girls play at wearing makeup. Obtaining your first adult makeup and learning how to apply it is practically a rite of passage for young teens. Most adult women love makeup and wear it every day.

Except:

Nerd women.

Female nerds just aren’t into makeup.

Marie Curie, fashionista

I’m not saying they never wear makeup–there’s even a significant subculture of people who enjoy cosplay/historical re-enactment and construct elaborate costumes, including makeup–but most of us don’t. Much like male nerds, we prioritize comfort and functionality in the things covering our bodies, not fashion trends.

And if anything, makeup is one of the most obvious shibboleths that distinguishes between nerd females and normies.

In other words, they took the tribal marker of the people who made fun of us throughout elementary and high school and repackaged it as “Science!” in an effort to get more normies into STEM, and I’m supposed to be happy about this?!

I am not ashamed of the fact that women are rarer than men at the highest levels of math abilities. Women are also rarer than men at the lowest levels of math abilities. I feel no need to cram people into disciplines they aren’t actually interested in just so we can have equal numbers of people in each–we don’t need equal numbers of men and women in construction work, plumbing, electrical engineering, long-haul trucking, nursing, teaching, childcare, etc.

It’s okay for men and women to enjoy different things–on average–and it’s also okay for some people to have unusual talents or interests.

It’s okay to be you.

(I mean, unless you’re a murderer or something. Then don’t be you.)

A Modest Educational Proposal


Fellow humans, we have a problem. (And another problem.)

At least, this looks like a problem to me, especially when I’m trying to make conversation at the local moms group.

There are many potential reasons the data looks like this (including inaccuracy, though my lived experience says it is accurate.) Our culture encourages people to limit their fertility, and smart women are especially so encouraged. Smart people are also better at long-term planning and doing things like “reading the instructions on the birth control.”

But it seems likely that there is another factor, an arrow of causation pointing in the other direction: smart people tend to stay in school for longer, and people dislike having children while they are still in school. While you are in school, you are in some sense still a child, and we have a notion that children shouldn’t beget children.

Isaac Newton. Never married. Probably a virgin.

People who drop out of school and start having children at 16 tend not to be very smart and also tend to have plenty of children during their child-creating years. People who pursue post-docs into their thirties tend to be very smart–and many of them are virgins.

Now, I don’t know about you, but I kind of like having smart people around, especially the kinds of people who invent refrigerators and make supply chains work so I can enjoy eating food, even though I live in a city, far from any farms. I don’t want to live in a world where IQ is crashing and we can no longer maintain complex technological systems.

We need to completely re-think this system where the smarter you are, the longer you are expected to stay in school, accruing debt and not having children.

Proposal one: Accelerated college for bright students. Let any student who can do college-level work begin college-level work for college credit, even if they are still in high (or middle) school. There are plenty of bright students out there who could be completing their degrees by 18.

The entire framework of schooling probably ought to be sped up in a variety of ways, especially for bright students. The current framework often reflects the order in which various discoveries were made, rather than the age at which students are capable of learning the material. For example, negative numbers are apparently not introduced in the math curriculum until 6th grade, even though, in my experience, even kindergarteners are perfectly capable of understanding the concept of “debt.” If I promise to give you one apple tomorrow, then I have “negative one apple.” There is no need to hide the concept of negatives for 6 years.

Proposal two: More apprenticeship.

In addition to being costly and time-consuming, a college degree doesn’t even guarantee that your chosen field will still be hiring when you graduate. (I know people with STEM degrees who graduated right as the dot-com bubble burst. Ouch.) We essentially want our educational system to turn out people who are highly skilled at highly specialized trades, and capable of turning around and becoming highly skilled at another highly specialized trade on a dime if that doesn’t work out. This leads to chemists returning to university for law degrees and physicists going back for medical degrees. We want students to have both “broad educations,” so they can get hired anywhere, and “deep educations,” so they’ll actually be good at their jobs.

Imagine, instead, a system where high school students are allowed to take a two-year course in preparation for a particular field, at the end of which high performers are accepted into an apprenticeship program where they continue learning on the job. At worst, these students would have a degree, income, and job experience by the age of 20, even if they decided they now wanted to switch professions or pursue an independent education.

Proposal three: Make childbearing normal for adult students.

There’s no reason college students can’t get married and have children (aside from, obviously, their lack of jobs and income.) College is not more time consuming or physically taxing than regular jobs, and college campuses tend to be pretty pleasant places. Studying while pregnant isn’t any more difficult than working while pregnant.

Grad students, in particular, are old and mature enough to get married and start families, and society should encourage them to do so.

Proposal four: stop denigrating child-rearing, especially for intelligent women.

Children are a lot of work, but they’re also fun. I love being with my kids. They are my family and an endless source of happiness.

What people want and value, they will generally strive to obtain.


These are just some ideas. What are yours?

Book Club: The Code Economy pt 1

I don’t think the publishers got their money’s worth on cover design

Welcome to EvX’s Book Club. Today we begin our exciting tour of Philip E. Auerswald’s The Code Economy: A Forty-Thousand-Year History, starting with the introduction, “Technology = Recipes,” and chapter one, “Jobs: Divide and Coordinate,” if we get that far.

I’m not sure exactly how to run a book club, so just grab some coffee and let’s dive right in.

First, let’s note that Auerswald doesn’t mean code in the narrow sense of “commands fed into a computer” but in a much broader sense of all encoded processes humans have come up with. His go-to example is the cooking recipe.

The Code Economy describes the evolution of human productive activity from simplicity to complexity over the span of more than 40,000 years. I call this evolutionary process the advance of code.

I find the cooking example a bit cutesy, but otherwise it gets the job done.

How… have we humans managed to get where we are today despite our abundant failings, including wars, famine, and a demonstrably meager capacity for society-wide planning and coordination? … by developing productive activities that evolve into regular routines and standardized platforms–which is to say that we have survived, and thrived, by creating and advancing code.

There’s so much in this book that almost every sentence bears discussion. First, as I’ve noted before, social organization appears to be a spontaneous, emergent feature of every human group. Without even really meaning to, humans just naturally seem compelled to organize themselves. One day you’re hanging out with your friends, riding motorcycles, living like an outlaw, and the next thing you know you’re using the formal legal system to sue a toy store for infringement of your intellectual property.

Alexander Wienerberger, Holodomor

At the same time, our ability to organize society at the national level is completely lacking. As one of my professors once put it, “God must hate communists, because every time a country goes communist, an “act of god” occurs and everyone dies.”

It’s a mystery why God hates communists so much, but hate ’em He does. Massive-scale social engineering is a total fail and we’ll still be suffering the results for a long time.

This creates a kind of conflict, because people can look at the small-scale organizing they do, and they look at large-scale disorganization, and struggle to understand why the small stuff can’t simply be scaled up.

And yet… society still kind of works. I can go to the grocery store and be reasonably certain that by some magical process, fresh produce has made its way from fields in California to the shelf in front of me. By some magical process, I can wave a piece of plastic around and use it to exchange enough other, unseen goods to pay for my groceries. I can climb into a car I didn’t build and cruise down a network of streets and intersections, reasonably confident that everyone else driving their own two-ton behemoth at 60 miles an hour a few feet away from me has internalized the same rules necessary for not crashing into me. Most of the time. And I can go to the gas station and pour a miracle liquid into my car and the whole system works, whether or not I have any clue how all of the parts manage to come together and do so.

The result is a miracle. Modern society is a miracle. If you don’t believe me, try using an outhouse for a few months. Try carrying all of your drinking water by hand from the local stream and chopping down all of the wood you need to boil it to make it potable. Try fighting off parasites, smallpox, or malaria without medicine or vaccinations. For all my complaints (and I know I complain a lot,) I love civilization. I love not worrying about cholera, crop failure, or dying from cavities. I love air conditioning, refrigerators, and flush toilets. I love books and the internet and domesticated strawberries. All of these are things I didn’t create and can’t take credit for, but get to enjoy nonetheless. I have been blessed.

But at the same time, “civilization” isn’t equally distributed. Millions (billions?) of the world’s peoples don’t have toilets, electricity, refrigerators, or even a decent road from their village to the next.

GDP per capita by country

Auerswald is a passionate champion of code. His answer to unemployment problems is probably “learn to code,” but in such a broad, metaphorical way that encompasses so many human activities that we can probably forgive him for it. One thing he doesn’t examine is why code takes off in some places but not others. Why is civilization more complex in Hong Kong than in Somalia? Why does France boast more Fields Medalists than the DRC?

In our next book (Niall Ferguson’s The Great Degeneration,) we’ll discuss whether specific structures like legal and tax codes can affect how well societies grow and thrive (spoiler alert: they do, just see communism,) and of course you are already familiar with the Jared Diamond environmentalist theory that folks in some parts of the world just had better natural resources to work with than folks in other parts (also true, at least in some cases. I’m not expecting some great industry to get up and running on its own in the arctic.)

IQ by country

But laying these concerns aside, there are obviously other broad factors at work. A map of GDP per capita looks an awful lot like a map of average IQs, with obvious caveats about the accidentally oil-rich Saudis and economically depressed ex-communists.

Auerswald believes that the past 40,000 years of code have not been disasters for the human race, but rather a cascade of successes, as each new invention and expansion to our repertoire of “recipes” or “codes” has enabled a whole host of new developments. For example, the development of copper tools didn’t just put flint knappers out of business; it also opened up whole new industries, because you can make more varieties of tools out of copper than flint. Now we had copper miners, copper smelters (a new profession), and copper workers. Copper tools could be sharpened and, unlike stone, resharpened, making copper tools more durable. Artists made jewelry; spools of copper wire became trade goods, traveling long distances and stimulating the prehistoric “economy.” New code begets complexity and even more code, not mass flint-knapper unemployment.

Likewise, the increase in reliable food supply created by farming didn’t create mass hunter-gatherer unemployment, but stimulated the growth of cities and differentiation of humans into even more professions, like weavers, cobblers, haberdashers, writers, wheelwrights, and mathematicians.

It’s a hopeful view, and I appreciate it in these anxious times.

But it’s very easy to say that the advent of copper or bronze or agriculture was a success because we are descended from the people who succeeded. We’re not descended from the hunter-gatherers who got displaced or wiped out by agriculturalists. In recent cases where hunter-gatherer or herding societies were brought into the agriculturalist fold, the process has been rather painful.

Elizabeth Marshall Thomas’s The Harmless People, about the Bushmen of the Kalahari, might overplay the romance and downplay the violence, but the epilogue’s description of how the arrival of “civilization” resulted in the deaths and degradation of the Bushmen brought tears to my eyes. First they died of dehydration because new fences erected to protect “private property” cut them off from the only water. No longer free to pursue the lives they had lived for centuries, they were moved onto what are essentially reservations and taught to farm and herd. Alcoholism and violence became rampant.

Among the book’s many characters was a man who had lost most of his leg to snakebite. He suffered terribly as his leg rotted away, cared for by his wife and family who brought him food. Eventually, with help, he healed and obtained a pair of crutches, learned to walk again, and resumed hunting: providing for his family.

And then in “civilization” he was murdered by one of his fellow Bushmen.

It’s a sad story and there are no easy answers. Bushman life is hard. Most people, when given the choice, seem to pick civilization. But usually we aren’t given a choice. The Bushmen weren’t. Neither were factory workers who saw their jobs automated and outsourced. Some Bushmen will adapt and thrive. Nelson Mandela was part Bushman, and he did quite well for himself. But many will suffer.

What to do about the suffering of those left behind–those who cannot cope with change, who do not have the mental or physical capacity to “learn to code” or otherwise adapt–remains an unanswered question. Humanity might move on without them, ignoring their suffering because we find them undeserving of compassion–or we might get bogged down trying to save them all. Perhaps we can find a third route: sympathy for the unfortunate without encouraging obsolete behavior?

In The Great Degeneration, Ferguson wonders why the systems (“code”) that support our society appear to be degenerating. I have a crude but straightforward answer: people are getting stupider. It takes a certain amount of intelligence to run a piece of code. Even a simple task like transcribing numbers is better performed by a smarter person than a dumber one, who is more likely to accidentally write down the wrong number. Human systems are built and executed by humans, and if the humans in them are less intelligent than the ones who made them, then they will do a bad job of running the systems.
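As a toy illustration of how sensitive complex systems are to the people running them (the per-step numbers are made up for illustration): if running a piece of “code” takes n steps and each step is performed correctly with probability q, the whole run succeeds with probability q^n, so a small drop in per-step accuracy craters a big system:

```python
# Toy model: a system of n sequential steps succeeds only if every single
# step is performed correctly. Per-step accuracies are illustrative only.
def system_reliability(q: float, n: int) -> float:
    return q ** n

n = 1000  # a complex system: a thousand hand-offs, forms, transcriptions
for q in (0.999, 0.995, 0.99):
    print(f"per-step accuracy {q:.1%} -> system succeeds "
          f"{system_reliability(q, n):.4%} of the time")
```

Dropping per-step accuracy from 99.9% to 99% takes the system from working roughly a third of the time to essentially never.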

Unfortunately for those of us over in civilization, dysgenics is a real thing:

Source: Audacious Epigone

Whether you blame IQ itself or the number of years smart people spend in school, dumb people have more kids (especially the parents of the Baby Boomers.) Epigone here only looks at white data (I believe Jayman has the black data and it’s just as bad, if not worse.)

Of course we can debate about the Flynn effect and all that, but I suspect there are two competing things going on: first, a rising ’50s economic tide lifted all boats, making everyone healthier, and thus smarter and better at taking IQ tests and making babies; and second, declining infant mortality since the late 1800s, and possibly the Welfare state, made it easier for the children of the poorest and least capable parents to survive.

The effects of these two trends probably cancel out at first, but after a while you run out of Flynn effect (maybe) and then the other starts to show up. Eventually you get Greece: once the shining light of Civilization, now defaulting on its loans.

Well, we have made it a page in!

Termite City

What do you think of the book? Have you finished it yet? What do you think of the way Auerswald conceptualizes “code” and its basis as the building block of pretty much all human activity? Do you think Auerswald is essentially correct to be hopeful about our increasingly code-driven future, or should we beware of the tradeoffs to individual autonomy and freedom inherent in becoming a glorified colony of ants?

Are “Nerds” Just a Hollywood Stereotype?

Yes, MIT has a football team.

The other day on Twitter, Nick B. Steves challenged me to find data supporting or refuting his assertion that Nerds vs. Jocks is a false stereotype, invented around 1975. Of course, we HBDers have a saying–“all stereotypes are true,” even the ones about us–but let’s investigate Nick’s claim and see where it leads us.

(NOTE: If you have relevant data, I’d love to see it.)

Unfortunately, terms like “nerd,” “jock,” and “chad” are not all that well defined. Certainly if we define “jock” as “athletic but not smart” and nerd as “smart but not athletic,” then these are clearly separate categories. But what if there’s a much bigger group of people who are smart and athletic?

Or what if we are defining “nerd” and “jock” too narrowly? Wikipedia defines nerd as, “a person seen as overly intellectual, obsessive, or lacking social skills.” I recall a study–which I cannot find right now–which found that nerds had, overall, lower-than-average IQs, but that study included people who were obsessive about things like comic books, not just people who majored in STEM. Similarly, should we define “jock” only as people who are good at sports, or do passionate sports fans count?

For the sake of this post, I will define “nerd” as “people with high math/science abilities” and “jock” as “people with high athletic abilities,” leaving the matter of social skills undefined. (People who merely like video games or watch sports, therefore, do not count.)

Nick is correct on one count: according to Wikipedia, although the word “nerd” has been around since 1951, it was popularized during the 70s by the sitcom Happy Days. However, Wikipedia also notes that:

An alternate spelling,[10] as nurd or gnurd, also began to appear in the mid-1960s or early 1970s.[11] Author Philip K. Dick claimed to have coined the nurd spelling in 1973, but its first recorded use appeared in a 1965 student publication at Rensselaer Polytechnic Institute.[12][13] Oral tradition there holds that the word is derived from knurd (drunk spelled backward), which was used to describe people who studied rather than partied. The term gnurd (spelled with the “g”) was in use at the Massachusetts Institute of Technology by 1965.[14] The term nurd was also in use at the Massachusetts Institute of Technology as early as 1971 but was used in the context for the proper name of a fictional character in a satirical “news” article.[15]

suggesting that the word was already common among nerds themselves before it was picked up by TV.

But we can trace the nerd-jock dichotomy back before the terms were coined: back in 1921, Lewis Terman, a researcher at Stanford University, began a long-term study of exceptionally high-IQ children, the Genetic Studies of Genius aka the Terman Study of the Gifted:

Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded.

This belief was especially popular in a little nation known as Germany, where it inspired both long fitness hikes in the woods for schoolchildren and the mass extermination of the Jews, who were believed to be muddying the German genepool with their weak, sickly, high-IQ genes (and nefariously trying to marry strong, healthy Germans in order to replenish their own defective stock.) It didn’t help that German Jews were both high-IQ and beset by a number of illnesses (probably related to high rates of consanguinity)–but then again, the Gypsies are beset by even more debilitating illnesses, and no one blames those on all of the fresh air and exercise afforded by their highly mobile lifestyles.

(Just to be thorough, though, the Nazis also exterminated the Gypsies and Hans Asperger’s subjects, despite Asperger’s insistence that they were very clever children who could probably be of great use to the German war effort via code breaking and the like.)

The results of Terman’s study are strongly in Nick’s favor. According to Psychology Today’s account:

His final group of “Termites” averaged a whopping IQ of 151. Following-up his group 35-years later, his gifted group at mid-life definitely seemed to conform to his expectations. They were taller, healthier, physically better developed, and socially adept (dispelling the myth at the time of high-IQ awkward nerds).

According to Wikipedia:

…the first volume of the study reported data on the children’s family,[17] educational progress,[18] special abilities,[19] interests,[20] play,[21] and personality.[22] He also examined the children’s racial and ethnic heritage.[23] Terman was a proponent of eugenics, although not as radical as many of his contemporary social Darwinists, and believed that intelligence testing could be used as a positive tool to shape society.[3]

Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average.[24] A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group.

Of course, we can go back even further than Terman–in the early 1800s, allergies like hay fever were associated with the nobility, who of course did not do much vigorous work in the fields.

My impression, based on studies I’ve seen previously, is that athleticism and IQ are positively correlated. That is, smarter people tend to be more athletic, and more athletic people tend to be smarter. There’s a very obvious reason for this: our brains are part of our bodies, people with healthier bodies therefore also have healthier brains, and healthier brains tend to work better.

At the very bottom of the IQ distribution, mentally retarded people tend to also be clumsy, flaccid, or lacking good muscle tone. The same genes (or environmental conditions) that give children terrible health and developmental problems often also affect their brain growth, and conditions that affect their brains also affect their bodies. As we progress from low to average to above-average IQ, we encounter increasingly healthy people.

In most smart people, high IQ doesn’t seem to be a random fluke, a genetic error, or fitness-reducing: in a genetic study of children with exceptionally high IQs, researchers failed to find many genes that specifically endowed the children with genius, but found instead a fortuitous absence of deleterious genes that knock a few points off the rest of us. The same genes that have a negative effect on the nerves and proteins in your brain probably also have a deleterious effect on the nerves and proteins throughout the rest of your body.

And indeed, there are many studies which show a correlation between intelligence and strength (eg, Longitudinal and Cross-Sectional Assessments of Age Changes in Physical Strength as Related to Sex, Social Class, and Mental Ability) or intelligence and overall health/not dying (eg, Intelligence in young adulthood and cause-specific mortality in the Danish Conscription Database (pdf) and The effects of occupation-based social position on mortality in a large American cohort.)

On the other hand, the evolutionary standard for “fitness” isn’t strength or longevity, but reproduction, and on this scale the high-IQ don’t seem to do as well:

Smart teens don’t have sex (or kiss much either): (h/t Gene Expression)

Controlling for age, physical maturity, and mother’s education, a significant curvilinear relationship between intelligence and coital status was demonstrated; adolescents at the upper and lower ends of the intelligence distribution were less likely to have sex. Higher intelligence was also associated with postponement of the initiation of the full range of partnered sexual activities. … Higher intelligence operates as a protective factor against early sexual activity during adolescence, and lower intelligence, to a point, is a risk factor.

Source

Here we see the issue plainly: males at 120 and 130 IQ are less likely to get laid than clinically retarded men in the 60s and 70s IQ range. The right side of the graph is the “nerds”; the left side, the “jocks.” Of course, the high-IQ females are even less likely to get laid than the high-IQ males, but males tend to judge themselves against other men, not women, when it comes to dating success. Since the low-IQ females are much less likely to get laid than the low-IQ males, this implies that most of these “popular” guys are dating girls who are smarter than themselves–a fact not lost on the nerds, who would also like to date those girls.

In 2001, the MIT/Wellesley magazine Counterpoint (Wellesley is MIT’s “sister school” and the two campuses allow cross-enrollment in each other’s courses) published a sex survey that provides a more detailed picture of nerd virginity:

I’m guessing that computer scientists invented polyamory, and neuroscientists are the chads of STEM. The results are otherwise pretty predictable.

Unfortunately, Counterpoint appears to be defunct due to lack of funding/interest and I can no longer find the original survey, but here is Jason Malloy’s summary from Gene Expression:

By the age of 19, 80% of US males and 75% of women have lost their virginity, and 87% of college students have had sex. But this number appears to be much lower at elite (i.e. more intelligent) colleges. According to the article, only 56% of Princeton undergraduates have had intercourse. At Harvard 59% of the undergraduates are non-virgins, and at MIT, only a slight majority, 51%, have had intercourse. Further, only 65% of MIT graduate students have had sex.

The student surveys at MIT and Wellesley also compared virginity by academic major. The chart for Wellesley displayed below shows that 0% of studio art majors were virgins, but 72% of biology majors were virgins, and 83% of biochem and math majors were virgins! Similarly, at MIT 20% of ‘humanities’ majors were virgins, but 73% of biology majors. (Apparently those most likely to read Darwin are also the least Darwinian!)

College Confidential has one paragraph from the study:

How Rolling Stone-ish are the few lucky souls who are doing the horizontal mambo? Well, not very. Considering all the non-virgins on campus, 41% of Wellesley and 32% of MIT students have only had one partner (figure 5). It seems that many Wellesley and MIT students are comfortingly monogamous. Only 9% of those who have gotten it on at MIT have been with more than 10 people and the number is 7% at Wellesley.

Someone needs to find the original study and PUT IT BACK ON THE INTERNET.

But this lack of early sexual success seems to translate into long-term marital happiness, once nerds find “the one.” Lex Fridman’s Divorce Rates by Profession offers a thorough list. The average divorce rate was 16.35%, with a high of 43% (Dancers) and a low of 0% (“Media and communication equipment workers.”)

I’m not sure exactly what all of these jobs are nor exactly which ones should count as STEM (veterinarian? anthropologists?) nor do I know how many people are employed in each field, but I count 49 STEM professions that have lower than average divorce rates (including computer scientists, economists, mathematical science, statisticians, engineers, biologists, chemists, aerospace engineers, astronomers and physicists, physicians, and nuclear engineers,) and only 23 with higher than average divorce rates (including electricians, water treatment plant operators, radio and telecommunication installers, broadcast engineers, and similar professions.) The purer sciences obviously had lower rates than the more practical applied tech fields.

The big outliers were mathematicians (19.15%), psychologists (19.26%), and sociologists (23.53%), though I’m not sure they count (if so, there were only 22 professions with higher than average divorce rates.)

I’m not sure which professions count as “jock” or “chad,” but athletes had lower than average rates of divorce (14.05%), as did firefighters, soldiers, and farmers. Financial examiners, hunters, and dancers (presumably an athletic, largely female occupation), however, had very high rates of divorce.

Medical Daily has an article on Who is Most Likely to Cheat? The Top 9 Jobs Unfaithful People Have (according to survey):

According to the survey recently taken by the “infidelity dating website,” Victoria Milan, individuals working in the finance field, such as brokers, bankers, and analysts, are more likely to cheat than those in any other profession. However, following those in finance comes those in the aviation field, healthcare, business, and sports.

With the exception of healthcare and maybe aviation, these are pretty typical Chad occupations, not STEM.

The Mirror has a similar list of jobs where people are most and least likely to be married. Most likely: Dentist, Chief Executive, Sales Engineer, Physician, Podiatrist, Optometrist, Farm product buyer, Precision grinder, Religious worker, Tool and die maker.

Least likely: Paper-hanger, Drilling machine operator, Knitter textile operator, Forge operator, Mail handler, Science technician, Practical nurse, Social welfare clerk, Winding machine operative, Postal clerk.

I struggled to find data on male fertility by profession/education/IQ, but there’s plenty on female fertility, eg the deceptively titled High-Fliers have more Babies:

…American women without any form of high-school diploma have a fertility rate of 2.24 children. Among women with a high-school diploma the fertility rate falls to 2.09 and for women with some form of college education it drops to 1.78.

However, among women with college degrees, the economists found the fertility rate rises to 1.88 and among women with advanced degrees to 1.96. In 1980 women who had studied for 16 years or more had a fertility rate of just 1.2.

As the economists prosaically explain: “The relationship between fertility and women’s education in the US has recently become U-shaped.”

Here is another article about the difference in fertility rates between high and low-IQ women.

But female fertility and male fertility may not be the same–I recall data elsewhere indicating that high-IQ men have more children than low IQ men, which implies those men are having their children with low-IQ women. (For example, while Bill and Hillary seem about matched on IQ, and have only one child, Melania Trump does not seem as intelligent as Trump, who has five children.)

Amusingly, I did find data on fertility rate by father’s profession for 1920, in the Birth Statistics for the Birth Registration Area of the US:

Of the 1,508,874 children born in 1920 in the birth registration area of the United states, occupations of fathers are stated for … 96.9%… The average number of children ever born to the present wives of these occupied fathers is 3.3 and the average number of children living 2.9.

The average number of children ever born ranges from 4.6 for foremen, overseers, and inspectors engaged in the extraction of minerals to 1.8 for soldiers, sailors, and marines. Both of these extreme averages are easily explained, for soldier, sailors and marines are usually young, while such foremen, overseers, and inspectors are usually in middle life. For many occupations, however, the ages of the fathers are presumably about the same and differences shown indicate real differences in the size of families. For example, the low figure for dentists, (2), architects, (2.1), and artists, sculptors, and teachers of art (2.2) are in striking contrast with the figure for mine operatives (4.3), quarry operatives (4.1) bootblacks, and brick and stone masons (each 3.9). …

As a rule the occupations credited with the highest number of children born are also credited with the highest number of children living, the highest number of children living appearing for foremen, overseers, and inspectors engaged in the extraction of minerals (3.9) and for steam and street railroad foremen and overseer (3.8), while if we exclude groups plainly affected by the age of fathers, the highest number of children living appear for mine and quarry operatives (each 3.6).

Obviously the job market was very different in 1920–no one was majoring in computer science. Perhaps some of those folks who became mine and quarry operatives back then would become engineers today–or perhaps not. Here are the average numbers of surviving children for the most obviously STEM professions (remember average for 1920 was 2.9):

Electricians 2.1, Electrotypers 2.2, telegraph operator 2.2, actors 1.9, chemists 1.8, Inventors 1.8, photographers and physicians 2.1, technical engineers 1.9, veterinarians 2.2.

I don’t know what paper hangers do, but the Mirror said they were among the least likely to be married, and in 1920, they had an average of 3.1 children–above average.

What about athletes? How smart are they?

“Athletes Show Huge Gaps on SAT Scores” is not a promising title for the “nerds are athletic” crew.

The Journal-Constitution studied 54 public universities, “including the members of the six major Bowl Championship Series conferences and other schools whose teams finished the 2007-08 season ranked among the football or men’s basketball top 25.”…

  • Football players average 220 points lower on the SAT than their classmates. Men’s basketball was 227 points lower.
  • University of Florida won the prize for biggest gap between football players and the student body, with players scoring 346 points lower than their peers.
  • Georgia Tech had the nation’s best average SAT score for football players, 1028 of a possible 1600, and best average high school GPA, 3.39 of a possible 4.0. But because its student body is apparently very smart, Tech’s football players still scored 315 SAT points lower than their classmates.
  • UCLA, which has won more NCAA championships in all sports than any other school, had the biggest gap between the average SAT scores of athletes in all sports and its overall student body, at 247 points.

From the original article, which no longer seems to be up on the Journal-Constitution website:

All 53 schools for which football SAT scores were available had at least an 88-point gap between team members’ average score and the average for the student body. …

Football players performed 115 points worse on the SAT than male athletes in other sports.

The differences between athletes’ and non-athletes’ SAT scores were less than half as big for women (73 points) as for men (170).

Many schools routinely used a special admissions process to admit athletes who did not meet the normal entrance requirements. … At Georgia, for instance, 73.5 percent of athletes were special admits compared with 6.6 percent of the student body as a whole.

On the other hand, as Discover Magazine discusses in “The Brain: Why Athletes are Geniuses,” athletic tasks–like catching a fly ball or slapping a hockey puck–require exceptionally fast and accurate brain signals to trigger the correct muscle movements.

Ryan Stegal studied the GPAs of highschool student athletes vs. non-athletes and found that the athletes had higher average GPAs than the non-athletes, but he also notes that the athletes were required to meet certain minimum GPA requirements in order to play.

But within athletics, it looks like the smarter athletes perform better than dumber ones, which is why the NFL uses the Wonderlic Intelligence Test:

NFL draft picks have taken the Wonderlic test for years because team owners need to know if their million dollar player has the cognitive skills to be a star on the field.

What does the NFL know about hiring that most companies don’t? They know that regardless of the position, proof of intelligence plays a profound role in the success of every individual on the team. It’s not enough to have physical ability. The coaches understand that players have to be smart and think quickly to succeed on the field, and the closer they are to the ball the smarter they need to be. That’s why, every potential draft pick takes the Wonderlic Personnel Test at the combine to prove he does–or doesn’t—have the brains to win the game. …

The first use of the WPT in the NFL was by Tom Landry of the Dallas Cowboys in the early 70s, who took a scientific approach to finding players. He believed players who could use their minds where it counted had a strategic advantage over the other teams. He was right, and the test has been used at the combine ever since.

For the NFL, years of testing shows that the higher a player scores on the Wonderlic, the more likely he is to be in the starting lineup—for any position. “There is no other reasonable explanation for the difference in test scores between starting players and those that sit on the bench,” Callans says. “Intelligence plays a role in how well they play the game.”

Let’s look at Exercising Intelligence: How Research Shows a Link Between Physical Activity and Smarts:

A large study conducted at the Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden, reveals that young adults who regularly exercise have higher IQ scores and are more likely to go on to university.

The study was published in the Proceedings of the National Academy of Sciences (PNAS), and involved more than 1.2 million Swedish men. The men were performing military service and were born between the years 1950 and 1976. Both their physical and IQ test scores were reviewed by the research team. …

The researchers also looked at data for twins and determined that primarily environmental factors are responsible for the association between IQ and fitness, and not genetic makeup. “We have also shown that those youngsters who improve their physical fitness between the ages of 15 and 18 increase their cognitive performance.”…

I have seen similar studies before, some involving mice and some, IIRC, the elderly. It appears that exercise is probably good for you.

I have a few more studies I’d like to mention quickly before moving on to discussion.

Here’s Grip Strength and Physical Demand of Previous Occupation in a Well-Functioning Cohort of Chinese Older Adults (h/t prius_1995), which found that participants who had previously worked in construction had greater grip strength than former office workers.

Age and Gender-Specific Normative Data of Grip and Pinch Strength in a Healthy Adult Swiss Population (h/t prius_1995).

 

If the nerds are in the sedentary cohort, then they may be just as athletic, if not more athletic, than all of the other cohorts except the heavy-work one.

However, in Revised normative values for grip strength with the Jamar dynamometer, the authors found no effect of profession on grip strength.

And Isometric muscle strength and anthropometric characteristics of a Chinese sample (h/t prius_1995).

And Pumpkin Person has an interesting post about brain size vs. body size.

 

Discussion: Are nerds real?

Overall, it looks like smarter people are more athletic, more athletic people are smarter, smarter athletes are better athletes, and exercise may make you smarter. For most people, the nerd/jock dichotomy is wrong.

However, there is very little overlap at the very highest end of the athletic and intelligence curves–most college (and thus professional) athletes are less intelligent than the average college student, and most college students are less athletic than the average college (and professional) athlete.

Additionally, while people with STEM degrees make excellent spouses (except for mathematicians, apparently,) their reproductive success is below average: they have sex later than their peers and, as far as the data I’ve been able to find shows, have fewer children.

Stephen Hawking

Even if there is a large overlap between smart people and athletes, they are still separate categories selecting for different things: a cripple can still be a genius, but can’t play football; a dumb person can play sports, but not do well at math. Stephen Hawking can barely move, but he’s still one of the smartest people in the world. So the set of all smart people will always include more “stereotypical nerds” than the set of all athletes, and the set of all athletes will always include more “stereotypical jocks” than the set of all smart people.

In my experience, nerds aren’t socially awkward (aside from their shyness around women.) The myth that they are stems from the fact that they have different interests and communicate in a different way than non-nerds. Let nerds talk to other nerds, and they are perfectly normal, communicative, socially functional people. Put them in a room full of non-nerds, and suddenly the nerds are “awkward.”

Unfortunately, the vast majority of people are not nerds, so many nerds have to spend the majority of their time in the company of lots of people who are very different than themselves. By contrast, very few people of normal IQ and interests ever have to spend time surrounded by the very small population of nerds. If you did put them in a room full of nerds, however, you’d find that suddenly they don’t fit in. The perception that nerds are socially awkward is therefore just normie bias.

Why did the nerd/jock dichotomy become so popular in the 70s? Probably in part because science and technology were really taking off as fields normal people could aspire to major in: man had just landed on the moon, and the Intel 4004 was released in 1971. Very few people went to college or were employed in the sciences back in 1920; by 1970, colleges were everywhere and science was booming.

And at the same time, colleges and highschools were ramping up their athletics programs. I’d wager that the average school in the 1800s had neither PE nor athletics of any sort. To find those, you’d probably have to attend private academies like Andover or Exeter. By the 70s, though, schools were taking their athletics programs–even athletic recruitment–seriously.

How strong you felt the dichotomy probably depends on the nature of your school. I have attended schools where all of the students were fairly smart and there was no anti-nerd sentiment, and I have attended schools where my classmates were fiercely anti-nerd and made sure I knew it.

But the dichotomy predates the terminology. Take Superman, first published in 1938. His disguise is a pair of glasses, because no one can believe that the bookish, mild-mannered Clark Kent is actually the super-strong Superman. Batman is based on the character of El Zorro, created in 1919. Zorro is an effete, weak, foppish nobleman by day and a dashing, sword-fighting hero of the poor by night. Of course these characters are both smart and athletic, but their disguises only work because others do not expect them to be. As fantasies, the characters are powerful because they provide a vehicle for our own desires: for our everyday normal failings to be just a cover for how secretly amazing we are.

But for the most part, most smart people are perfectly fit, healthy, and coordinated–even the ones who like math.

 

Navigation and the Wealth of Nations

Global Determinants of Navigational Ability, by Coutrot et al:

Using a mobile-based virtual reality navigation task, we measured spatial navigation ability in more than 2.5 million people globally. Using a clustering approach, we find that navigation ability is not smoothly distributed globally but clustered into five distinct yet geographically related groups of countries. Furthermore, the economic wealth of a nation (Gross Domestic Product per capita) was predictive of the average navigation ability of its inhabitants and gender inequality (Gender Gap Index) was predictive of the size of performance difference between males and females. Thus, cognitive abilities, at least for spatial navigation, are clustered according to economic wealth and gender inequalities globally.

This is an incredible study. They got 2.5 million people from all over the world to participate.

If you’ve been following any of the myriad debates about intelligence, IQ, and education, you’re probably familiar with the concept of “multiple intelligences” and the fact that there’s rather little evidence that people actually have “different intelligences” that operate separately from each other. In general, it looks like people who have brains that are good at working out how to do one kind of task tend to be good at working out other sorts of tasks.

I’ve long held navigational ability as a possible exception to this: perhaps people in, say, Polynesian societies depended historically far more on navigational abilities than the rest of us, even though math and literacy were nearly absent.

Unfortunately, it doesn’t look like the authors got enough samples from Polynesia to include it in the study, but they did get data from Indonesia and the Philippines, which I’ll return to in a moment.

Frankly, I don’t see what the authors mean by “five distinct yet geographically related groups of countries.” South Korea is ranked between the UK and Belgium; Russia is next to Malaysia; Indonesia is next to Portugal and Hungary.

GDP per capita appears to be a stronger predictor than geography:

Some people will say these results merely reflect experience playing video games–people in wealthier countries have probably spent more time and money on computers and games. But assuming that the people who are participating in the study in the first place are people who have access to smartphones, computers, video games, etc., the results are not good for the multiple-intelligences hypothesis.

In the GDP per Capita vs. Conditional Modes (ie how well a nation scored overall, with low scores better than high scores) graph, countries above the trend line are under-performing relative to their GDPs, and countries below the line are over-performing relative to their GDPs.

South Africa, for example, significantly over-performs relative to its GDP, probably due to sampling bias: white South Africans with smartphones and computers were probably more likely to participate in the study than the nation’s 90% black population, but the GDP reflects the entire population. Finland and New Zealand are also under-performing economically, perhaps because Finland is really cold and NZ is isolated.

On the other side of the line, the UAE, Saudi Arabia, and Greece over-perform relative to GDP. Two of these are oil states that would be much poorer if not for geographic chance, and as far as I can tell, the whole Greek economy is being propped up by German loans. (There is also evidence that Greek IQ is falling, though this may be a near universal problem in developed nations.)

Three other nations stand out in the “scoring better than GDP predicts” category: Ukraine, (which suffered under Communism–Communism seems to do bad things to countries,) Indonesia and the Philippines. While we could be looking at selection bias similar to South Africa, these are island nations in which navigational ability surely had some historical effect on people’s ability to survive.

Indonesia and the Philippines still didn’t do as well as first-world nations like Norway and Canada, but they outperformed other nations with similar GDPs like Egypt, India, and Macedonia. This is the best evidence I know of for independent selection for navigational ability in some populations.

The study’s other interesting findings were that women performed consistently worse than men, both across countries and age groups (except for the post-90 cohort, but that might just be an error in the data.) Navigational ability declines steeply for everyone post-23 years old until about 75 years; the authors suggest the subsequent increase in abilities post-70s might be sampling error due to old people who are good at video games being disproportionately likely to seek out video game related challenges.

The authors note that people who drive more (eg, the US and Canada) might do better on navigational tasks than people who use public transportation more (eg, Europeans) but also that Finno-Scandians are among the world’s best navigators despite heavy use of public transport in those countries. The authors write:

We speculate that this specificity may be linked to Nordic countries sharing a culture of participating in a sport related to navigation: orienteering. Invented as an official sport in the late 19th century in Sweden, the first orienteering competition open to the public was held in Norway in 1897. Since then, it has been more popular in Nordic countries than anywhere else in the world, and is taught in many schools [26]. We found that ‘orienteering world championship’ country results significantly correlated with countries’ CM (Pearson’s correlation ρ = .55, p = .01), even after correcting for GDP per capita (see Extended Data Fig. 15). Future targeted research will be required to evaluate the impact of cultural activities on navigation skill.

I suggest a different causal relationship: people make hobbies out of things they’re already good at and enjoy doing, rather than things they’re bad at.

 

 

Please note that the study doesn’t look at a big chunk of countries, like most of Africa. Being at the bottom in navigational abilities in this study by no means indicates that a country is at the bottom globally–given the trends already present in the data, it is likely that the poorer countries that weren’t included in the study would do even worse.

Evolution is slow–until it’s fast: Genetic Load and the Future of Humanity

Source: Priceonomics

A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.

But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with English peppered moths, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees they lived on. To survive, these moths needed to avoid being eaten by birds, so any moth that stood out against the tree trunks tended to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten. Against dark-colored trees, light-colored moths stood out.

This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.

The time it takes for evolution to occur depends chiefly on two things: (A) the frequency of the trait in the population, and (B) how strongly you are selecting for (or against) it.

Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.

(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)

Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)

You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).

Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.

Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.

No matter what kind of selection you have, nor what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.
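Those two knobs are easy to play with numerically. Here’s a minimal sketch in Python–the survival rates are made up, and the real peppered-moth genetics were messier, but it shows how starting frequency and selection strength set the clock:

```python
# Minimal one-locus "moth" model: each generation, some fraction of each
# color morph survives predation, and survivors breed in proportion to
# their numbers. The survival rates here are invented for illustration.
def generations_until(p0, dark_survival, light_survival, target=0.98):
    """Generations until dark moths make up `target` of the population."""
    p, gens = p0, 0
    while p < target:
        dark = p * dark_survival            # surviving dark moths
        light = (1 - p) * light_survival    # surviving light moths
        p = dark / (dark + light)           # dark frequency among breeders
        gens += 1
    return gens

# Start dark moths at 0.1% of the population:
print(generations_until(0.001, 1.0, 0.5))  # 16 generations (strong selection)
print(generations_until(0.001, 1.0, 0.9))  # 103 generations (weak selection)
```

Note that even the weak-selection case finishes within about a century–right around the 1810–1895 window observed in the wild.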

Source SUPS.org

So what happens when you completely remove selective pressures from a population?

Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.

Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.

The birth rate continued to decline. As births and death reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.

source

As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.

It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?

Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.

What is genetic load?

As I mentioned before, every time a cell replicates, a certain number of errors–mutations–occur. Occasionally these mutations are useful, but the vast majority of them are not. About 30-50% of pregnancies end in miscarriage (the percent of miscarriages people recognize is lower because embryos often miscarry before causing any overt signs of pregnancy,) and the majority of those miscarriages are caused by genetic errors.

Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.

But only the worst genetic problems never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.

While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:

Differences in human height is 60–80% heritable, according to several twin studies[19] and has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height.[20] The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.[21]

Obviously each of these genes plays only a small role in determining overall height (and this is of course holding environmental factors constant.) There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.

The situation with IQ is similar:

Intelligence in the normal range is a polygenic trait, meaning it’s influenced by more than one gene.[3][4]

The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults.[5][6] In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores;[8] however, poor prenatal environment, malnutrition and disease can have deleterious effects.[9][10]

And from a recent article published in Nature Genetics, Genome-wide association meta-analysis of 78,308 individuals identifies new loci and genes influencing human intelligence:

Despite intelligence having substantial heritability[2] (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered.[3,4,5] Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10⁻⁸) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10⁻⁶), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10⁻⁶). Despite the well-known difference in twin-based heritability[2] for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (r_g = 0.89, LD score regression P = 5.4 × 10⁻²⁹). These findings provide new insight into the genetic architecture of intelligence.

The greater the number of genes influencing a trait, the harder they are to identify without extremely large studies, because any small group of people might not even share the same set of relevant genes.
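To put a rough number on that, here’s a standard back-of-the-envelope power calculation (my figures are illustrative, not taken from the paper):

```python
# Back-of-the-envelope GWAS power: if heritability h2 is spread evenly over
# k loci, each locus explains h2/k of the variance, and detecting an effect
# that small at genome-wide significance needs roughly
# n = (z_sig + z_power)^2 / (h2/k) samples. Figures are illustrative.
Z_SIG = 5.45    # two-sided z for p = 5e-8 (genome-wide significance)
Z_POWER = 0.84  # z for 80% power

def required_n(h2, k):
    per_locus_variance = h2 / k  # variance explained by any one locus
    return (Z_SIG + Z_POWER) ** 2 / per_locus_variance

print(round(required_n(0.5, 10)))    # 791 people: a ten-gene trait is easy
print(round(required_n(0.5, 1000)))  # 79128 people: a thousand-gene trait is not
```

It is no coincidence that the thousand-locus case lands right around the scale of the 78,308-person meta-analysis quoted above.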

High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.

Like height, most of the genetic variation in IQ is the combined result of many genes. We’ve definitely found some mutations that result in abnormally low IQ, but so far we have yet (AFAIK) to find any genes that produce an IQ equivalent of gigantism. In other words, low (genetic) IQ is caused by genetic load–Small Yet Important Genetic Differences Between Highly Intelligent People and General Population:

The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.

The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.

Maternal mortality rates over time, UK data

Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ also have terrible side effects, like autism, Tay Sachs and Torsion Dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventional high-IQ has a cost: increased maternal mortality (larger heads).

Wikipedia defines genetic load as:

the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall.[5] Most mutations are deleterious, and occur at a high rate.

There’s math, if you want it.
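If you’d rather have the short version here, these are the standard textbook results (my sketch, not Wikipedia’s exact treatment):

```latex
% Genetic load: the fitness shortfall of the average genotype
% relative to the fittest one:
\[ L = \frac{w_{\max} - \bar{w}}{w_{\max}} \]

% Mutation-selection balance: a deleterious allele that arises at rate u
% per generation and is selected against with strength s (dominance h)
% settles at an equilibrium frequency of roughly
\[ \hat{q} \approx \frac{u}{hs} \quad\text{(partially dominant)}, \qquad
   \hat{q} \approx \sqrt{u/s} \quad\text{(fully recessive)} \]

% Haldane's classic result: at equilibrium, the load contributed by each
% locus depends only on the mutation rate,
\[ L \approx 2u \;\text{(dominant)}, \qquad L \approx u \;\text{(recessive)} \]
% i.e., stronger selection purges bad alleles faster, but the steady-state
% load is set by the mutation rate alone.
```

The counterintuitive upshot: relaxing selection doesn’t change the mutation rate, so load simply accumulates until something else restores the balance.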

Normally, genetic mutations are removed from the population at a rate determined by how bad they are. Really bad mutations kill you instantly, and so are never born. Slightly less bad mutations might survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, but result in having slightly fewer children than your neighbors. Over many generations, this mutation will eventually disappear.

(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)

Jakubany is a town in the Carpathian Mountains

Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:

We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.

The graph’s creator states:

When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118 year period, of the 13306 children who were born, 2958 died (~22 %) before reaching the age of one.

Historical infant mortality rates can be difficult to calculate in part because they were so high that people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:

Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.[12]

Priceonomics: Why life expectancy is misleading

In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)

Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).

Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates for between 30 and 50% infant mortality rates. I recall a study on Anasazi nutrition which I sadly can’t locate right now, which found 100% malnutrition rates among adults (based on enamel hypoplasias,) and 50% infant mortality.

As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60 year old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60 year old could expect to live to about 80.
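To make the arithmetic behind that “misleading” average explicit, here’s a toy two-bucket model (all numbers invented for illustration):

```python
# Life expectancy at birth in a toy world where the only early hazard is
# childhood mortality and everyone else lives to a fixed adult lifespan.
def life_expectancy_at_birth(child_mortality, adult_lifespan=70):
    # Children who die contribute ~0 years; everyone else a full lifespan.
    return (1 - child_mortality) * adult_lifespan

print(life_expectancy_at_birth(0.40))  # 42.0 -- 40% childhood mortality alone
                                       #   drags a 70-year lifespan down to 42
print(life_expectancy_at_birth(0.01))  # 69.3 -- nearly the full adult lifespan
```

A “life expectancy of 33” thus need not mean adults dying young; it mostly means a great many dead infants.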

Back in his post on Mousetopia, Charlton writes:

Michael A Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …

The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …

Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.

However, in Mouse Utopia, there is no predation and all the other causes of mortality (eg. Starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.

Historically speaking, another selective factor operated on humans: while about 67% of women reproduced, only 33% of men did. By contrast, according to Psychology Today, a majority of today’s men have or will have children.

Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.

source

Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data.) Today, infant mortality is almost non-existent, but voluntary childlessness has now inverted this phenomenon:

Yes, the percent of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with post graduate degrees have 4 or more children, 26% of those who never graduated from highschool have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.

In other words, the strongest remover of genetic load–infant mortality–has all but disappeared; populations with higher load (lower IQ) are having more children than populations with lower load; and everyone is having children later, which also increases genetic load.

Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.

Infant mortality claims half their children, taking the least healthy. They are left with 4 bright children and 2 moderately intelligent children. The three brightest children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the moderately intelligent do okay and end up with a couple of children.

On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.
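That thought experiment is easy to simulate. A minimal sketch, assuming each child draws a random “mutational load” and infant mortality removes the most-loaded half (all numbers invented):

```python
import random

# Each of a dozen children draws a load from a standard normal; infant
# mortality culls the most-loaded half. We track the survivors' mean load
# (in SD units) over many simulated families.
def surviving_mean_load(n_children=12, survivors=6, trials=100_000):
    total = 0.0
    for _ in range(trials):
        loads = sorted(random.gauss(0, 1) for _ in range(n_children))
        total += sum(loads[:survivors]) / survivors  # keep the least loaded
    return total / trials

print(round(surviving_mean_load(), 2))              # ~ -0.75: load pushed down
print(round(surviving_mean_load(survivors=12), 2))  # ~  0.00: no mortality, no change
```

The gap between -0.75 and 0 is the per-generation selection differential that vanishes along with infant mortality.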

Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.

Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and 3 children by three different partners.

Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”

People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because we all accumulate random copying errors in our DNA as it replicates.

I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.

Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.

(Remember, all mental traits are heritable–so genetic load influences everything, not just controversial ones like IQ.)

If all of the above is correct, then I see only 4 ways out:

  1. Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
  2. Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
  3. Abortion of high load fetuses
  4. Genetic engineering

#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #3, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.

An attempt to answer some questions on IQ

I recently received a few IQ-related questions. Now, IQ is not my specialty, so I do not feel particularly adequate for the task, but I’ll do my best. I recommend anyone really interested in the subject read Pumpkin Person’s blog, as he really enjoys talking about IQ all the time.

  1. I wanted to ask if you know any IQ test on the internet that is an equivalent to the reliable tests given by psychologists?

I suppose it depends on what you want the test for. Curiosity? Diagnosis? Personally, I suspect that the average person isn’t going to learn very much from an IQ test that they didn’t already know just from living (similarly, I don’t think you’re going to discover that you’re an introvert or extrovert by taking an online quiz if you didn’t know it already from interacting with people,) but there are cases where people might want to take an IQ test, so let’s get searching.

Pumpkin Person speaks highly of the Wechsler Intelligence Scales (it comes in adult and child versions.) According to Wikipedia:

The Wechsler Adult Intelligence Scale (WAIS) is an IQ test designed to measure intelligence and cognitive ability in adults and older adolescents.[1] The original WAIS (Form I) was published in February 1955 by David Wechsler, as a revision of the Wechsler–Bellevue Intelligence Scale, released in 1939.[2] It is currently in its fourth edition (WAIS-IV) released in 2008 by Pearson, and is the most widely used IQ test, for both adults and older adolescents, in the world.

Since IQ tests excite popular interest but no one really wants to pay $1,000 just to take a test, the internet is littered with “free” tests of questionable quality. For example, WeschlerTest.com offers “free sample tests,” but the bottom of the website notes that, “Disclaimer: This is not an official Wechsler test and is only for entertainment purposes. Any scores derived from it may not accurately reflect the score you would attain on an official Wechsler test.” Here is a similar website that offers free Stanford-Binet tests.

I am not personally in a position to judge if these are any good.

It looks like the US military has put its Armed Services Vocational Aptitude Battery online, or at least a practice version. This seems like one of the best free options, because the army is a real organization that’s deeply interested in getting accurate results and the relationship between the ASVAB and other IQ tests is probably well documented. From the website:

The ASVAB is a timed test that measures your skills in a number of different areas. You complete questions that reveal your skills in paragraph comprehension, word knowledge, arithmetic reasoning and mathematics knowledge. These are basic skills that you will need as a member of the U.S. military. The score you receive on the ASVAB is factored into your Armed Forces Qualifying Test (AFQT) score. This score is used to figure out whether you qualify to enlist in the armed services. …

The ASVAB was created in 1968. By 1976, all branches of the military began using this test. In 2002, the test underwent many revisions, but its main goal of gauging a person’s basic skills remained the same. Today, there is a computerized version of the test as well as a written version. The Department of Defense developed this test and it’s taken by students in thousands of schools across the country. It is also given at Military Entrance Processing Stations (MEPS).

Naturally, each branch of the United States armed services wants to enlist the best, most qualified candidates each year. The ASVAB is a tool that helps in the achievement of that purpose. Preparing to take the ASVAB is just one more step in the journey toward your goal of joining the U.S. armed services. …

Disclaimer: The tests on this website are for entertainment purposes only, and may not accurately reflect the scores you would attain on a professionally administered ASVAB test.

The blog Random Critical Analysis gives a thorough rundown of the correlation between ASVAB and IQ scores (they are highly correlated) along with the SAT and ACT.

Additionally, there are a couple of tests linked here in Lipscomb’s Intelligence Course Lab : Classical IQ Test from Psychology Today and IQTest.com.

Drawing a page from Pumpkin Person’s book, I recommend taking several different tests and then comparing results. Use your good judgment about whether a particular test seems reliable–is it covered in ads? Does random guessing get you a score of 148? Did you get a result similar to what you’d expect based on real life experiences?

2. Besides that I wanted to ask you how much social class and IQ are correlated?

A fair amount.

Thanks to Tino Sanandaji

With thanks to Pumpkin Person

Really dumb people are too dumb to commit as much crime as mildly dumb people

IQ by country–red = low; purple = high. Source Wikipedia

I do wonder why he made the graph so much bigger than the relevant part
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article
Oh, there you are, correlation
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article

When dumb children are born to rich people, they tend to do badly in life and don’t make much money; they subsequently sink in social status. When smart children are born to poor people, they tend to do well in life and rise in social status. Even in societies with strict social classes where moving from class to class is effectively impossible, we should still expect that really dumb people born into wealth will squander it, leading to their impoverishment. Likewise, among the lower classes, we would still expect that smarter low-class people would do better in life than dumber ones.

This is all somewhat built into the entire definition of “IQ” and what people were trying to measure when they created the tests.

3. Basically do traditional upper classes form separate genetic clusters like Gregory Clark claims?

I haven’t read Clark’s book, but I’m sure the pathetic amount of research I can do here would be nothing compared to what he’s amassed.

There are a number of studies on assortative mating and IQ, eg: Spouse similarity for IQ and personality and convergence:

A similar pattern of spousal association for IQ scores and personality traits was found in two British samples from Oxford and Cambridge. There was no indirect evidence from either sample to suggest that convergence occurred during marriage. All observed assortative mating might well be due to initial assortment.

Assortative mating for psychiatric disorders and psychological traits:

This article reviews the literature on assortative mating for psychological traits and psychiatric illness. Assortative mating appears to exist for personality traits, but to a lesser degree than that observed for physical traits, sociodemographic traits, intelligence, and attitudes and values. Concordance between spouses for psychiatric illness has also been consistently reported in numerous studies. This article examines alternative explanations for such observed concordance and discusses the effects of assortative mating on population genetics and the social environment.

Do assortative mating patterns for IQ block upward social mobility?

In the Minnesota Twin Family Study, assortative mating for IQ was greater than .3 in both the 11- and 17-year-old cohorts. Recognizing this, genetic variance in IQ independent of SES was greater with higher parental SES in the 11-year-old cohort. This was not true, however, in the 17-year-old cohort. In both cohorts, people of higher IQ were more likely to have ‘married down’ for IQ than people of lower IQ were to have ‘married up’. This assortative mating pattern would create greater genetic diversity for IQ in people of higher IQ than in people of lower IQ. As IQ is associated with SES, the pattern could be one reason for the observation of greater genetic variance in IQ independent of SES with greater parental SES in several samples. If so, it could block upward social mobility among those already in lower-SES groups. I discuss possible involved mechanisms and social implications.

The role of personality and intelligence in assortative mating:

Assortative mating is the individuals’ tendency to mate with those who are similar to them in some variables, at a higher rate than would be expected from random. This study aims to provide empirical evidence of assortative mating through the Big Five model of personality and two measures of intelligence using Spanish samples. The sample consisted of 244 Spanish couples. It was divided into two groups according to relationship time. The effect of age, educational level and socioeconomic status was controlled. The results showed strong assortative mating for intelligence and moderate for personality. The strongest correlations for Personality were found in Openness, Agreeableness and Conscientiousness.

Assortative mating for IQ and personality due to propinquity and personal preference:

The role of personal preference as an active process in mate selection is contrasted with the more passive results of limitations of available mates due to social, educational, and geographical propinquity. The role of personal preference estimated after removing the effects of variables representing propinquity was still significant for IQ and Eysenck’s extraversion-introversion and inconsistency (lie) scales, even though small.
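A quick way to see why spousal correlations like the .3 in the Minnesota study matter: if spouses' IQs correlate at ρ, the standard deviation of the parental midpoint works out to σ·√((1+ρ)/2) rather than the σ·√(1/2) you'd get under random mating, so assortative mating stretches the distribution of family midpoints and fattens both tails. Here's a quick simulation check (only the ρ = .3 figure comes from the study above; the rest is a generic bivariate-normal setup):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, sd, n = 0.3, 15, 1_000_000   # spousal IQ correlation ~.3 (Minnesota study)

cov = sd**2 * np.array([[1, rho],
                        [rho, 1]])
couples = rng.multivariate_normal([100, 100], cov, size=n)
midpoints = couples.mean(axis=1)

print(midpoints.std())               # simulated SD of the parental midpoint
print(sd * np.sqrt((1 + rho) / 2))   # theory: ~12.1, vs ~10.6 under random mating
```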

Related: Heritability estimates versus large environmental effects: the IQ paradox resolved:

Some argue that the high heritability of IQ renders purely environmental explanations for large IQ differences between groups implausible. Yet, large environmentally induced IQ gains between generations suggest an important role for environment in shaping IQ. The authors present a formal model of the process determining IQ in which people’s IQs are affected by both environment and genes, but in which their environments are matched to their IQs. The authors show how such a model allows very large effects for environment, even incorporating the highest estimates of heritability. Besides resolving the paradox, the authors show that the model can account for a number of other phenomena, some of which are anomalous when viewed from the standard perspective.
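The mechanism in that paper is a feedback loop: genes nudge IQ up or down a little, people with higher IQs end up in environments matched to their IQs, and those environments push IQ further in the same direction. Here's a minimal sketch of that multiplier idea (my own toy numbers and functional form, not the authors' actual model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

g = rng.normal(0, 1, n)        # genetic endowment (standardized)
shock = rng.normal(0, 1, n)    # individual environmental luck

A, B, K = 6.0, 4.0, 0.5        # toy effect sizes; K = how strongly environment tracks IQ

def equilibrium_iq(env_boost=0.0):
    """Iterate the loop: IQ attracts a matching environment, which feeds back into IQ."""
    iq = 100 + A * g + B * shock + env_boost
    for _ in range(50):
        env = K * (iq - 100) + B * shock + env_boost
        iq = 100 + A * g + env
    return iq

iq = equilibrium_iq()
print(f"share of IQ variance tracking genes: {np.corrcoef(g, iq)[0, 1]**2:.2f}")

# A modest across-the-board environmental improvement gets multiplied too:
better = equilibrium_iq(env_boost=3.0)
print(f"mean gain from a 3-point environmental boost: {better.mean() - iq.mean():.1f}")
```

At the fixed point, IQ deviations equal (A·g + B·shock + boost)/(1 − K), so the same multiplier that makes IQ look strongly genetic also turns a small persistent environmental change into a large between-generation gain, which is the paradox the abstract is resolving.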


4. Are upper-class people genetically more intelligent? Or is there an effect of regression to the mean, such that all classes have about equal chances of producing high-IQ people?

Stephen Hsu has a relevant post on the subject: Assortative mating, regression and all that: offspring IQ vs parental midpoint:

…James Lee, a real expert in the field, sent me a current best estimate for the probability distribution of offspring IQ as a function of parental midpoint (average between the parents’ IQs). James is finishing his Ph.D. at Harvard under Steve Pinker — you might have seen his review of R. Nisbett’s book Intelligence and How to Get It: Why Schools and Cultures Count.

The results are stated further below. Once you plug in the numbers, you get (roughly) the following:

Assuming parental midpoint of n SD above the population average, the kids’ IQ will be normally distributed about a mean which is around +.6n with residual SD of about 12 points. (The .6 could actually be anywhere in the range (.5, .7), but the SD doesn’t vary much from choice of empirical inputs.)…

Read Hsu’s post for the rest of the details.

In short, while regression to the mean works for everyone, different people regress to different means, depending on how smart their particular ancestors were. For example, if two people of IQ 100 have a kid with an IQ of 140 (Kid A), and two people of IQ 120 have a kid with an IQ of 140 (Kid B), then Kid A’s own kids are likely to regress toward 100, while Kid B’s kids are likely to regress toward 120.
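To put the quoted numbers to work, here's the rule as a few lines of Python, using the .6 coefficient and 12-point residual SD exactly as stated (and nothing else):

```python
from statistics import NormalDist

POP_MEAN = 100
B, RESID_SD = 0.6, 12   # Hsu's quoted values (the .6 could be anywhere in .5-.7)

def offspring_dist(midparent_iq: float) -> NormalDist:
    """Kids' IQ distribution given the parents' average IQ, per the quoted rule."""
    return NormalDist(POP_MEAN + B * (midparent_iq - POP_MEAN), RESID_SD)

kids = offspring_dist(130)        # two IQ-130 parents (midpoint +2 SD)
print(kids.mean)                  # 118.0: regression toward the mean
print(1 - kids.cdf(130))          # ~16% chance a kid matches the parents' level
```

Note that this rule conditions only on the parents' measured IQs; the Kid A / Kid B asymmetry above comes from also knowing the grandparents, which shifts the mean a given family regresses toward.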

We can look at the effects of parental SES on SAT scores and the like:

[Chart: SAT scores by race and parental income]

Personally, I know plenty of extremely intelligent people who come from low-SES backgrounds, but few of them ended up low-SES. Overall, I’d expect highly intelligent people to move up in status and less intelligent people to move down over time, with the upper class thus sort of “collecting” high-IQ people, but there are obviously regional and cultural effects that may make it inappropriate to compare across groups.

Hope that has been useful.