Welcome to EvX’s Book Club. Today we begin our exciting tour of Philip E. Auerswald’s The Code Economy: A Forty-Thousand-Year History, with the introduction, Technology = Recipes, and Chapter One, Jobs: Divide and Coordinate, if we get that far.
I’m not sure exactly how to run a book club, so just grab some coffee and let’s dive right in.
First, let’s note that Auerswald doesn’t mean code in the narrow sense of “commands fed into a computer” but in a much broader sense of all encoded processes humans have come up with. His go-to example is the cooking recipe.
The Code Economy describes the evolution of human productive activity from simplicity to complexity over the span of more than 40,000 years. I call this evolutionary process the advance of code.
I find the cooking example a bit cutesy, but otherwise it gets the job done.
How… have we humans managed to get where we are today despite our abundant failings, including wars, famine, and a demonstrably meager capacity for society-wide planning and coordination? … by developing productive activities that evolve into regular routines and standardized platforms–which is to say that we have survived, and thrived, by creating and advancing code.
There’s so much in this book that almost every sentence bears discussion. First, as I’ve noted before, social organization appears to be a spontaneous, emergent feature of every human group. Without even really meaning to, humans just naturally seem compelled to organize themselves. One day you’re hanging out with your friends, riding motorcycles, living like an outlaw, and the next thing you know you’re using the formal legal system to sue a toy store for infringement of your intellectual property.
At the same time, our ability to organize society at the national level is completely lacking. As one of my professors once put it, “God must hate communists, because every time a country goes communist, an ‘act of God’ occurs and everyone dies.”
It’s a mystery why God hates communists so much, but hate ’em He does. Massive-scale social engineering has been a total failure, and we’ll be suffering the results for a long time.
This creates a kind of conflict: people look at the small-scale organizing they do, then at the large-scale disorganization around them, and struggle to understand why the small stuff can’t simply be scaled up.
And yet… society still kind of works. I can go to the grocery store and be reasonably certain that by some magical process, fresh produce has made its way from fields in California to the shelf in front of me. By some magical process, I can wave a piece of plastic around and use it to exchange enough other, unseen goods to pay for my groceries. I can climb into a car I didn’t build and cruise down a network of streets and intersections, reasonably confident that everyone else driving their own two-ton behemoth at 60 miles an hour a few feet away from me has internalized the same rules necessary for not crashing into me. Most of the time. And I can go to the gas station and pour a miracle liquid into my car and the whole system works, whether or not I have any clue how all of the parts manage to come together and do so.
The result is a miracle. Modern society is a miracle. If you don’t believe me, try using an outhouse for a few months. Try carrying all of your drinking water by hand from the local stream and chopping down all of the wood you need to boil it to make it potable. Try fighting off parasites, smallpox, or malaria without medicine or vaccinations. For all my complaints (and I know I complain a lot,) I love civilization. I love not worrying about cholera, crop failure, or dying from cavities. I love air conditioning, refrigerators, and flush toilets. I love books and the internet and domesticated strawberries. All of these are things I didn’t create and can’t take credit for, but get to enjoy nonetheless. I have been blessed.
But at the same time, “civilization” isn’t equally distributed. Millions (billions?) of the world’s people don’t have toilets, electricity, refrigerators, or even a decent road from their village to the next.
Auerswald is a passionate champion of code. His answer to unemployment problems is probably “learn to code,” but in such a broad, metaphorical sense, encompassing so many human activities, that we can probably forgive him for it. One thing he doesn’t examine is why code takes off in some places but not others. Why is civilization more complex in Hong Kong than in Somalia? Why does France boast more Fields Medalists than the DRC?
In our next book (Niall Ferguson’s The Great Degeneration,) we’ll discuss whether specific structures like legal and tax codes can affect how well societies grow and thrive (spoiler alert: they do, just see communism,) and of course you are already familiar with the Jared Diamond environmentalist theory that folks in some parts of the world just had better natural resources to work with than folks in other parts (also true, at least in some cases. I’m not expecting some great industry to get up and running on its own in the arctic.)
But laying these concerns aside, there are obviously other broad factors at work. A map of GDP per capita looks an awful lot like a map of average IQs, with obvious caveats about the accidentally oil-rich Saudis and economically depressed ex-communists.
Auerswald believes that the past 40,000 years of code have not been disasters for the human race, but rather a cascade of successes, as each new invention and expansion to our repertoire of “recipes” or “codes” has enabled a whole host of new developments. For example, the development of copper tools didn’t just put flint knappers out of business, it also opened up whole new industries, because you can make more varieties of tools out of copper than flint. Now we had copper miners, copper smelters (a new profession), and copper workers. Copper tools could be sharpened and, unlike stone, resharpened, making them more durable. Artists made jewelry; spools of copper wire became trade goods, traveling long distances and stimulating the prehistoric “economy.” New code bequeaths complexity and even more code, not mass flint-knapper unemployment.
Likewise, the increase in reliable food supply created by farming didn’t create mass hunter-gatherer unemployment, but stimulated the growth of cities and differentiation of humans into even more professions, like weavers, cobblers, haberdashers, writers, wheelwrights, and mathematicians.
It’s a hopeful view, and I appreciate it in these anxious times.
But it’s very easy to say that the advent of copper or bronze or agriculture was a success because we are descended from the people who succeeded. We’re not descended from the hunter-gatherers who got displaced or wiped out by agriculturalists. In recent cases where hunter-gatherer or herding societies were brought into the agriculturalist fold, the process has been rather painful.
Elizabeth Marshall Thomas’s The Harmless People, about the Bushmen of the Kalahari, might overplay the romance and downplay the violence, but the epilogue’s description of how the arrival of “civilization” resulted in the deaths and degradation of the Bushmen brought tears to my eyes. First they died of dehydration because new fences erected to protect “private property” cut them off from the only water. No longer free to pursue the lives they had lived for centuries, they were moved onto what are essentially reservations and taught to farm and herd. Alcoholism and violence became rampant.
Among the book’s many characters was a man who had lost most of his leg to snakebite. He suffered terribly as his leg rotted away, cared for by his wife and family who brought him food. Eventually, with help, he healed and obtained a pair of crutches, learned to walk again, and resumed hunting: providing for his family.
And then in “civilization” he was murdered by one of his fellow Bushmen.
It’s a sad story and there are no easy answers. Bushman life is hard. Most people, when given the choice, seem to pick civilization. But usually we aren’t given a choice. The Bushmen weren’t. Neither were factory workers who saw their jobs automated and outsourced. Some Bushmen will adapt and thrive. Nelson Mandela was part Bushman, and he did quite well for himself. But many will suffer.
What to do about the suffering of those left behind–those who cannot cope with change, who do not have the mental or physical capacity to “learn to code” or otherwise adapt–remains an unanswered question. Humanity might move on without them, ignoring their suffering because we find them undeserving of compassion–or we might get bogged down trying to save them all. Perhaps we can find a third route: sympathy for the unfortunate without encouraging obsolete behavior?
In The Great Degeneration, Ferguson wonders why the systems (“code”) that support our society appear to be degenerating. I have a crude answer: people are getting stupider. It takes a certain amount of intelligence to run a piece of code. Even a simple task like transcribing numbers is better performed by a smarter person than a dumber one, who is more likely to accidentally write down the wrong number. Human systems are built and executed by humans, and if the humans running them are less intelligent than the ones who made them, they will do a bad job of running the systems.
Unfortunately for those of us over in civilization, dysgenics is a real thing:
Whether you blame IQ itself or the number of years smart people spend in school, dumb people have more kids (especially the parents of the Baby Boomers.) Epigone here only looks at white data (I believe Jayman has the black data and it’s just as bad, if not worse.)
Of course we can debate about the Flynn effect and all that, but I suspect there are two competing things going on: first, a rising ’50s economic tide lifted all boats, making everyone healthier, and thus smarter, better at taking IQ tests, and better at making babies; and second, declining infant mortality since the late 1800s (and possibly the welfare state) made it easier for the children of the poorest and least capable parents to survive.
The effects of these two trends probably cancel out at first, but after a while you run out of Flynn effect (maybe) and then the other starts to show up. Eventually you get Greece: once the shining light of Civilization, now defaulting on its loans.
Well, we have made it a page in!
What do you think of the book? Have you finished it yet? What do you think of the way Auerswald conceptualizes “code” and casts it as the building block of pretty much all human activity? Do you think Auerswald is essentially correct to be hopeful about our increasingly code-driven future, or should we beware of the tradeoffs to individual autonomy and freedom inherent in becoming a glorified colony of ants?
The other day on Twitter, Nick B. Steves challenged me to find data supporting or refuting his assertion that Nerds vs. Jocks is a false stereotype, invented around 1975. Of course, we HBDers have a saying–“all stereotypes are true,” even the ones about us–but let’s investigate Nick’s claim and see where it leads us.
(NOTE: If you have relevant data, I’d love to see it.)
Unfortunately, terms like “nerd,” “jock,” and “chad” are not all that well defined. Certainly if we define “jock” as “athletic but not smart” and nerd as “smart but not athletic,” then these are clearly separate categories. But what if there’s a much bigger group of people who are smart and athletic?
Or what if we are defining “nerd” and “jock” too narrowly? Wikipedia defines nerd as “a person seen as overly intellectual, obsessive, or lacking social skills.” I recall a study–which I cannot find right now–that found nerds had, overall, lower-than-average IQs, but that study included people who were obsessive about things like comic books, not just people who majored in STEM. Similarly, should we define “jock” only as people who are good at sports, or do passionate sports fans count?
For the sake of this post, I will define “nerd” as “people with high math/science abilities” and “jock” as “people with high athletic abilities,” leaving the matter of social skills undefined. (People who merely like video games or watch sports, therefore, do not count.)
Nick is correct on one count: according to Wikipedia, although the word “nerd” has been around since 1951, it was popularized during the 70s by the sitcom Happy Days. However, Wikipedia also notes that:
An alternate spelling, as nurd or gnurd, also began to appear in the mid-1960s or early 1970s. Author Philip K. Dick claimed to have coined the nurd spelling in 1973, but its first recorded use appeared in a 1965 student publication at Rensselaer Polytechnic Institute. Oral tradition there holds that the word is derived from knurd (drunk spelled backward), which was used to describe people who studied rather than partied. The term gnurd (spelled with the “g”) was in use at the Massachusetts Institute of Technology by 1965. The term nurd was also in use at the Massachusetts Institute of Technology as early as 1971 but was used in the context for the proper name of a fictional character in a satirical “news” article.
suggesting that the word was already common among nerds themselves before it was picked up by TV.
Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded.
This belief was especially popular in a little nation known as Germany, where it inspired both taking schoolchildren on long hikes in the woods to keep them fit and the mass extermination of Jews, who were believed to be muddying the German gene pool with their weak, sickly, high-IQ genes (and nefariously trying to marry strong, healthy Germans in order to replenish their own defective stock.) It didn’t help that German Jews were both high-IQ and beset by a number of illnesses (probably related to high rates of consanguinity,) but then again, the Gypsies are beset by even more debilitating illnesses, and no one blames those on all of the fresh air and exercise afforded by their highly mobile lifestyle.
(Just to be thorough, though, the Nazis also exterminated the Gypsies and Hans Asperger’s subjects, despite Asperger’s insistence that they were very clever children who could probably be of great use to the German war effort via code breaking and the like.)
The results of Terman’s study are strongly in Nick’s favor. According to Psychology Today’s account:
His final group of “Termites” averaged a whopping IQ of 151. Following-up his group 35-years later, his gifted group at mid-life definitely seemed to conform to his expectations. They were taller, healthier, physically better developed, and socially adept (dispelling the myth at the time of high-IQ awkward nerds).
…the first volume of the study reported data on the children’s family, educational progress, special abilities, interests, play, and personality. He also examined the children’s racial and ethnic heritage. Terman was a proponent of eugenics, although not as radical as many of his contemporary social Darwinists, and believed that intelligence testing could be used as a positive tool to shape society.
Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average. A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group.
Of course, we can go back even further than Terman–in the early 1800s, allergies like hay fever were associated with the nobility, who of course did not do much vigorous work in the fields.
My impression, based on studies I’ve seen previously, is that athleticism and IQ are positively correlated. That is, smarter people tend to be more athletic, and more athletic people tend to be smarter. There’s a very obvious reason for this: our brains are part of our bodies, people with healthier bodies therefore also have healthier brains, and healthier brains tend to work better.
At the very bottom of the IQ distribution, mentally retarded people tend also to be clumsy, flaccid, or lacking in muscle tone. The same genes (or environmental conditions) that cause terrible health and developmental problems in children often also affect their brain growth, and conditions that affect their brains also affect their bodies. As we progress from low to average to above-average IQ, we encounter increasingly healthy people.
In most smart people, high-IQ doesn’t seem to be a random fluke, a genetic error, nor fitness reducing: in a genetic study of children with exceptionally high IQs, researchers failed to find many genes that specifically endowed the children with genius, but found instead a fortuitous absence of deleterious genes that knock a few points off the rest of us. The same genes that have a negative effect on the nerves and proteins in your brain probably also have a deleterious effect on the nerves and proteins throughout the rest of your body.
Controlling for age, physical maturity, and mother’s education, a significant curvilinear relationship between intelligence and coital status was demonstrated; adolescents at the upper and lower ends of the intelligence distribution were less likely to have sex. Higher intelligence was also associated with postponement of the initiation of the full range of partnered sexual activities. … Higher intelligence operates as a protective factor against early sexual activity during adolescence, and lower intelligence, to a point, is a risk factor.
Here we see the issue plainly: males at 120 and 130 IQ are less likely to get laid than clinically retarded men in the 60s and 70s. The right side of the graph is the “nerds”; the left side, the “jocks.” Of course, the high-IQ females are even less likely to get laid than the high-IQ males, but males tend to judge themselves against other men, not women, when it comes to dating success. Since the low-IQ females are much less likely to get laid than the low-IQ males, this implies that most of these “popular” guys are dating girls who are smarter than themselves–a fact not lost on the nerds, who would also like to date those girls.
In 2001, the MIT/Wellesley magazine Counterpoint (Wellesley is MIT’s “sister school,” and the two campuses allow cross-enrollment in each other’s courses) published a sex survey that provides a more detailed picture of nerd virginity:
I’m guessing that computer scientists invented polyamory, and neuroscientists are the chads of STEM. The results are otherwise pretty predictable.
Unfortunately, Counterpoint appears to be defunct due to lack of funding/interest and I can no longer find the original survey, but here is Jason Malloy’s summary from Gene Expression:
By the age of 19, 80% of US males and 75% of women have lost their virginity, and 87% of college students have had sex. But this number appears to be much lower at elite (i.e. more intelligent) colleges. According to the article, only 56% of Princeton undergraduates have had intercourse. At Harvard 59% of the undergraduates are non-virgins, and at MIT, only a slight majority, 51%, have had intercourse. Further, only 65% of MIT graduate students have had sex.
The student surveys at MIT and Wellesley also compared virginity by academic major. The chart for Wellesley displayed below shows that 0% of studio art majors were virgins, but 72% of biology majors were virgins, and 83% of biochem and math majors were virgins! Similarly, at MIT 20% of ‘humanities’ majors were virgins, but 73% of biology majors. (Apparently those most likely to read Darwin are also the least Darwinian!)
How Rolling Stone-ish are the few lucky souls who are doing the horizontal mambo? Well, not very. Considering all the non-virgins on campus, 41% of Wellesley and 32% of MIT students have only had one partner (figure 5). It seems that many Wellesley and MIT students are comfortingly monogamous. Only 9% of those who have gotten it on at MIT have been with more than 10 people and the number is 7% at Wellesley.
Someone needs to find the original study and PUT IT BACK ON THE INTERNET.
But this lack of early sexual success seems to translate into long-term marital happiness, once nerds find “the one.” Lex Fridman’s Divorce Rates by Profession offers a thorough list. The average divorce rate was 16.35%, with a high of 43% (dancers) and a low of 0% (“media and communication equipment workers”).
I’m not sure exactly what all of these jobs are nor exactly which ones should count as STEM (veterinarian? anthropologists?) nor do I know how many people are employed in each field, but I count 49 STEM professions that have lower than average divorce rates (including computer scientists, economists, mathematical science, statisticians, engineers, biologists, chemists, aerospace engineers, astronomers and physicists, physicians, and nuclear engineers,) and only 23 with higher than average divorce rates (including electricians, water treatment plant operators, radio and telecommunication installers, broadcast engineers, and similar professions.) The purer sciences obviously had lower rates than the more practical applied tech fields.
The big outliers were mathematicians (19.15%), psychologists (19.26%), and sociologists (23.53%), though I’m not sure they count (if so, there were only 22 professions with higher than average divorce rates.)
I’m not sure which professions count as “jock” or “chad,” but athletes had lower than average rates of divorce (14.05%) as did firefighters, soldiers, and farmers. Financial examiners, hunters, and dancers, (presumably an athletic female occupation) however, had very high rates of divorce.
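To make the above-vs.-below-average tally concrete, here is a minimal sketch using only the handful of rates quoted above (the full list has many more professions; the 16.35% overall average comes from the list itself, not from these six entries):

```python
# Divorce rates (%) for the professions named above; illustrative subset only.
divorce_rates = {
    "dancers": 43.0,
    "mathematicians": 19.15,
    "psychologists": 19.26,
    "sociologists": 23.53,
    "athletes": 14.05,
    "media and communication equipment workers": 0.0,
}

overall_average = 16.35  # average across all professions in the full list

# Split this subset into above- and below-average divorce rates.
above = [job for job, rate in divorce_rates.items() if rate > overall_average]
below = [job for job, rate in divorce_rates.items() if rate < overall_average]

print(above)  # dancers, mathematicians, psychologists, sociologists
print(below)  # athletes, media and communication equipment workers
```

The same comparison, run over all 70-odd professions in the full list, is what produces the 49-vs.-23 STEM tally above.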
According to the survey recently taken by the “infidelity dating website,” Victoria Milan, individuals working in the finance field, such as brokers, bankers, and analysts, are more likely to cheat than those in any other profession. However, following those in finance comes those in the aviation field, healthcare, business, and sports.
With the exception of healthcare and maybe aviation, these are pretty typical Chad occupations, not STEM.
The Mirror has a similar list of jobs where people are most and least likely to be married. Most likely: Dentist, Chief Executive, Sales Engineer, Physician, Podiatrist, Optometrist, Farm product buyer, Precision grinder, Religious worker, Tool and die maker.
Least likely: Paper-hanger, Drilling machine operator, Knitter textile operator, Forge operator, Mail handler, Science technician, Practical nurse, Social welfare clerk, Winding machine operative, Postal clerk.
I struggled to find data on male fertility by profession/education/IQ, but there’s plenty on female fertility, e.g., the deceptively titled High-Fliers Have More Babies:
…American women without any form of high-school diploma have a fertility rate of 2.24 children. Among women with a high-school diploma the fertility rate falls to 2.09 and for women with some form of college education it drops to 1.78.
However, among women with college degrees, the economists found the fertility rate rises to 1.88 and among women with advanced degrees to 1.96. In 1980 women who had studied for 16 years or more had a fertility rate of just 1.2.
As the economists prosaically explain: “The relationship between fertility and women’s education in the US has recently become U-shaped.”
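The U-shape is easy to see if you line the quoted figures up in order of education; a minimal sketch (numbers taken from the article above):

```python
# US fertility rates by women's education level, as quoted above,
# ordered from least to most education.
fertility = {
    "no high-school diploma": 2.24,
    "high-school diploma": 2.09,
    "some college": 1.78,
    "college degree": 1.88,
    "advanced degree": 1.96,
}

# The series falls, bottoms out, then rises again: a U-shape.
low_point = min(fertility, key=fertility.get)
print(low_point)  # -> some college
```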
Here is another article about the difference in fertility rates between high and low-IQ women.
But female fertility and male fertility may not be the same–I recall data elsewhere indicating that high-IQ men have more children than low IQ men, which implies those men are having their children with low-IQ women. (For example, while Bill and Hillary seem about matched on IQ, and have only one child, Melania Trump does not seem as intelligent as Trump, who has five children.)
Of the 1,508,874 children born in 1920 in the birth registration area of the United States, occupations of fathers are stated for … 96.9%… The average number of children ever born to the present wives of these occupied fathers is 3.3 and the average number of children living 2.9.
The average number of children ever born ranges from 4.6 for foremen, overseers, and inspectors engaged in the extraction of minerals to 1.8 for soldiers, sailors, and marines. Both of these extreme averages are easily explained, for soldiers, sailors, and marines are usually young, while such foremen, overseers, and inspectors are usually in middle life. For many occupations, however, the ages of the fathers are presumably about the same and differences shown indicate real differences in the size of families. For example, the low figures for dentists (2), architects (2.1), and artists, sculptors, and teachers of art (2.2) are in striking contrast with the figures for mine operatives (4.3), quarry operatives (4.1), bootblacks, and brick and stone masons (each 3.9). …
As a rule the occupations credited with the highest number of children born are also credited with the highest number of children living, the highest number of children living appearing for foremen, overseers, and inspectors engaged in the extraction of minerals (3.9) and for steam and street railroad foremen and overseers (3.8), while if we exclude groups plainly affected by the age of fathers, the highest number of children living appears for mine and quarry operatives (each 3.6).
Obviously the job market was very different in 1920–no one was majoring in computer science. Perhaps some of those folks who became mine and quarry operatives back then would become engineers today–or perhaps not. Here are the average numbers of surviving children for the most obviously STEM professions (remember average for 1920 was 2.9):
The Journal-Constitution studied 54 public universities, “including the members of the six major Bowl Championship Series conferences and other schools whose teams finished the 2007-08 season ranked among the football or men’s basketball top 25.”…
Football players average 220 points lower on the SAT than their classmates. Men’s basketball was 227 points lower.
University of Florida won the prize for biggest gap between football players and the student body, with players scoring 346 points lower than their peers.
Georgia Tech had the nation’s best average SAT score for football players, 1028 of a possible 1600, and best average high school GPA, 3.39 of a possible 4.0. But because its student body is apparently very smart, Tech’s football players still scored 315 SAT points lower than their classmates.
UCLA, which has won more NCAA championships in all sports than any other school, had the biggest gap between the average SAT scores of athletes in all sports and its overall student body, at 247 points.
From the original article, which no longer seems to be up on the Journal-Constitution website:
All 53 schools for which football SAT scores were available had at least an 88-point gap between team members’ average score and the average for the student body. …
Football players performed 115 points worse on the SAT than male athletes in other sports.
The differences between athletes’ and non-athletes’ SAT scores were less than half as big for women (73 points) as for men (170).
Many schools routinely used a special admissions process to admit athletes who did not meet the normal entrance requirements. … At Georgia, for instance, 73.5 percent of athletes were special admits compared with 6.6 percent of the student body as a whole.
On the other hand, as Discover Magazine discusses in “The Brain: Why Athletes are Geniuses,” athletic tasks–like catching a fly ball or slapping a hockey puck–require exceptionally fast and accurate brain signals to trigger the correct muscle movements.
Ryan Stegal studied the GPAs of high school student athletes vs. non-athletes and found that the athletes had higher average GPAs than the non-athletes, but he also notes that the athletes were required to meet certain minimum GPA requirements in order to play.
But within athletics, it looks like the smarter athletes perform better than dumber ones, which is why the NFL uses the Wonderlic Intelligence Test:
NFL draft picks have taken the Wonderlic test for years because team owners need to know if their million dollar player has the cognitive skills to be a star on the field.
What does the NFL know about hiring that most companies don’t? They know that regardless of the position, proof of intelligence plays a profound role in the success of every individual on the team. It’s not enough to have physical ability. The coaches understand that players have to be smart and think quickly to succeed on the field, and the closer they are to the ball the smarter they need to be. That’s why every potential draft pick takes the Wonderlic Personnel Test at the combine to prove he does–or doesn’t–have the brains to win the game. …
The first use of the WPT in the NFL was by Tom Landry of the Dallas Cowboys in the early 70s, who took a scientific approach to finding players. He believed players who could use their minds where it counted had a strategic advantage over the other teams. He was right, and the test has been used at the combine ever since.
For the NFL, years of testing shows that the higher a player scores on the Wonderlic, the more likely he is to be in the starting lineup—for any position. “There is no other reasonable explanation for the difference in test scores between starting players and those that sit on the bench,” Callans says. “Intelligence plays a role in how well they play the game.”
A large study conducted at the Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden, reveals that young adults who regularly exercise have higher IQ scores and are more likely to go on to university.
The study was published in the Proceedings of the National Academy of Sciences (PNAS), and involved more than 1.2 million Swedish men. The men were performing military service and were born between the years 1950 and 1976. Both their physical and IQ test scores were reviewed by the research team. …
The researchers also looked at data for twins and determined that primarily environmental factors are responsible for the association between IQ and fitness, and not genetic makeup. “We have also shown that those youngsters who improve their physical fitness between the ages of 15 and 18 increase their cognitive performance.”…
I have seen similar studies before, some involving mice and some, IIRC, the elderly. It appears that exercise is probably good for you.
I have a few more studies I’d like to mention quickly before moving on to discussion.
Overall, it looks like smarter people are more athletic, more athletic people are smarter, smarter athletes are better athletes, and exercise may make you smarter. For most people, the nerd/jock dichotomy is wrong.
However, there is very little overlap at the very highest end of the athletic and intelligence curves–most college (and thus professional) athletes are less intelligent than the average college student, and most college students are less athletic than the average college (and professional) athlete.
Additionally, while people with STEM degrees make excellent spouses (except for mathematicians, apparently,) their reproductive success is below average: they have sex later than their peers and, as far as the data I’ve been able to find shows, have fewer children.
Even if there is a large overlap between smart people and athletes, they are still separate categories selecting for different things: a cripple can still be a genius, but can’t play football; a dumb person can play sports, but not do well at math. Stephen Hawking can barely move, but he’s still one of the smartest people in the world. So the set of all smart people will always include more “stereotypical nerds” than the set of all athletes, and the set of all athletes will always include more “stereotypical jocks” than the set of all smart people.
In my experience, nerds aren’t socially awkward (aside from their shyness around women.) The myth that they are stems from the fact that they have different interests and communicate in a different way than non-nerds. Let nerds talk to other nerds, and they are perfectly normal, communicative, socially functional people. Put them in a room full of non-nerds, and suddenly the nerds are “awkward.”
Unfortunately, the vast majority of people are not nerds, so many nerds have to spend the majority of their time in the company of lots of people who are very different than themselves. By contrast, very few people of normal IQ and interests ever have to spend time surrounded by the very small population of nerds. If you did put them in a room full of nerds, however, you’d find that suddenly they don’t fit in. The perception that nerds are socially awkward is therefore just normie bias.
Why did the nerd/jock dichotomy become so popular in the 70s? Probably in part because science and technology were really taking off as fields normal people could aspire to major in: man had just landed on the moon, and the Intel 4004 was released in 1971. Very few people went to college or were employed in the sciences back in 1920; by 1970, colleges were everywhere and science was booming.
And at the same time, colleges and high schools were ramping up their athletics programs. I’d wager that the average school in the 1800s had neither PE nor athletics of any sort. To find those, you’d probably have to attend private academies like Andover or Exeter. By the 70s, though, schools were taking their athletics programs–even athletic recruitment–seriously.
How strong you felt the dichotomy probably depends on the nature of your school. I have attended schools where all of the students were fairly smart and there was no anti-nerd sentiment, and I have attended schools where my classmates were fiercely anti-nerd and made sure I knew it.
But the dichotomy predates the terminology. Take Superman, who first appeared in 1938. His disguise is a pair of glasses, because no one can believe that the bookish, mild-mannered Clark Kent is actually the super-strong Superman. Batman is based on the character of Zorro, created in 1919. Zorro is an effete, weak, foppish nobleman by day and a dashing, sword-fighting hero of the poor by night. Of course these characters are both smart and athletic, but their disguises only work because others do not expect them to be both. As fantasies, the characters are powerful because they provide a vehicle for our own desires: for our everyday normal failings to be just a cover for how secretly amazing we are.
But for the most part, most smart people are perfectly fit, healthy, and coordinated–even the ones who like math.
Using a mobile-based virtual reality navigation task, we measured spatial navigation ability in more than 2.5 million people globally. Using a clustering approach, we find that navigation ability is not smoothly distributed globally but clustered into five distinct yet geographically related groups of countries. Furthermore, the economic wealth of a nation (Gross Domestic Product per capita) was predictive of the average navigation ability of its inhabitants and gender inequality (Gender Gap Index) was predictive of the size of performance difference between males and females. Thus, cognitive abilities, at least for spatial navigation, are clustered according to economic wealth and gender inequalities globally.
This is an incredible study. They got 2.5 million people from all over the world to participate.
If you’ve been following any of the myriad debates about intelligence, IQ, and education, you’re probably familiar with the concept of “multiple intelligences” and the fact that there’s rather little evidence that people actually have “different intelligences” that operate separately from each other. In general, it looks like people who have brains that are good at working out how to do one kind of task tend to be good at working out other sorts of tasks.
I’ve long considered navigational ability a possible exception to this: perhaps people in, say, Polynesian societies historically depended far more on navigational abilities than the rest of us did, even though math and literacy were nearly absent.
Unfortunately, it doesn’t look like the authors got enough samples from Polynesia to include it in the study, but they did get data from Indonesia and the Philippines, which I’ll return to in a moment.
Frankly, I don’t see what the authors mean by “five distinct yet geographically related groups of countries.” South Korea is ranked between the UK and Belgium; Russia is next to Malaysia; Indonesia is next to Portugal and Hungary.
GDP per capita appears to be a stronger predictor than geography:
Some people will say these results merely reflect experience playing video games–people in wealthier countries have probably spent more time and money on computers and games. But assuming that the people who are participating in the study in the first place are people who have access to smartphones, computers, video games, etc., the results are not good for the multiple-intelligences hypothesis.
In the GDP per Capita vs. Conditional Modes graph (i.e., how well a nation scored overall, with low scores being better than high scores), countries above the trend line are under-performing relative to their GDPs, and countries below the line are over-performing relative to their GDPs.
South Africa, for example, significantly over-performs relative to its GDP, probably due to sampling bias: white South Africans with smartphones and computers were probably more likely to participate in the study than the nation’s 90% black population, but the GDP reflects the entire population. Finland and New Zealand are also under-performing economically, perhaps because Finland is really cold and NZ is isolated.
On the other side of the line, the UAE, Saudi Arabia, and Greece score worse than their GDPs would predict. Two of these are oil states that would be much poorer if not for geographic chance, and as far as I can tell, the whole Greek economy is being propped up by German loans. (There is also evidence that Greek IQ is falling, though this may be a near-universal problem in developed nations.)
Three other nations stand out in the “scoring better than GDP predicts” category: Ukraine (which suffered under Communism–Communism seems to do bad things to countries), Indonesia, and the Philippines. While we could be looking at selection bias similar to South Africa’s, the latter two are island nations in which navigational ability surely had some historical effect on people’s ability to survive.
Indonesia and the Philippines still didn’t do as well as first-world nations like Norway and Canada, but they outperformed other nations with similar GDPs like Egypt, India, and Macedonia. This is the best evidence I know of for independent selection for navigational ability in some populations.
The study’s other interesting findings were that women performed consistently worse than men, both across countries and age groups (except for the post-90 cohort, but that might just be an error in the data.) Navigational ability declines steeply for everyone post-23 years old until about 75 years; the authors suggest the subsequent increase in abilities post-70s might be sampling error due to old people who are good at video games being disproportionately likely to seek out video game related challenges.
The authors note that people who drive more (eg, the US and Canada) might do better on navigational tasks than people who use public transportation more (eg, Europeans) but also that Finno-Scandians are among the world’s best navigators despite heavy use of public transport in those countries. The authors write:
We speculate that this specificity may be linked to Nordic countries sharing a culture of participating in a sport related to navigation: orienteering. Invented as an official sport in the late 19th century in Sweden, the first orienteering competition open to the public was held in Norway in 1897. Since then, it has been more popular in Nordic countries than anywhere else in the world, and is taught in many schools. We found that ‘orienteering world championship’ country results significantly correlated with countries’ CM (Pearson’s correlation ρ = .55, p = .01), even after correcting for GDP per capita (see Extended Data Fig. 15). Future targeted research will be required to evaluate the impact of cultural activities on navigation skill.
I suggest a different causal relationship: people make hobbies out of things they’re already good at and enjoy doing, rather than things they’re bad at.
Please note that the study doesn’t look at a big chunk of countries, like most of Africa. Being at the bottom in navigational abilities in this study by no means indicates that a country is at the bottom globally–given the trends already present in the data, it is likely that the poorer countries that weren’t included in the study would do even worse.
A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.
But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with the English peppered moth, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees it lived on. To survive, these moths need to avoid being eaten by birds, so any moth that stands out against the tree trunks tends to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten; against dark-colored trees, light-colored moths stood out instead.
This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.
The time it takes for evolution to occur depends simply on (A) the frequency of a trait in the population and (B) how strongly you are selecting for (or against) it.
Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.
(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)
Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)
You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).
Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.
Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.
No matter what kind of selection you have, nor what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.
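The moth arithmetic is easy to sketch. Here is a minimal haploid selection model–the starting frequency (0.1%) and the 12% fitness penalty on light moths are illustrative assumptions, not measured values from the historical record:

```python
# Toy selection model: dark moths start rare; once tree trunks darken,
# light moths suffer a relative fitness penalty s. All parameters are
# illustrative guesses, not empirical estimates.
def generations_to_spread(p=0.001, s=0.12, target=0.98):
    """Count generations until the dark-moth frequency p reaches target.

    Each generation, dark moths have relative fitness 1 and light moths
    1 - s, so the new dark frequency is p / (p + (1 - p) * (1 - s)).
    """
    gens = 0
    while p < target:
        p = p / (p + (1 - p) * (1 - s))
        gens += 1
    return gens

print(generations_to_spread())  # on the order of 85 generations
```

With moths breeding roughly once a year, a modest 12% disadvantage is enough to take the dark form from 0.1% to 98% on about the 1810–1895 timescale described above; the much harsher predation rates discussed below would fix the trait far faster.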
So what happens when you completely remove selective pressures from a population?
Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.
Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.
The birth rate continued to decline. As births and deaths reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.
As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.
It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?
Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.
Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.
But it is only the worst genetic problems that never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.
While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:
Differences in human height is 60–80% heritable, according to several twin studies and has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height. The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.
Obviously each of these genes plays only a small role in determining overall height (and this is of course holding environmental factors constant.) There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.
The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults. In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores; however, poor prenatal environment, malnutrition and disease can have deleterious effects.…
Despite intelligence having substantial heritability (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10−8) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10−6), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10−6). Despite the well-known difference in twin-based heritability for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (rg = 0.89, LD score regression P = 5.4 × 10−29). These findings provide new insight into the genetic architecture of intelligence.
The greater the number of genes influencing a trait, the harder they are to identify without extremely large studies, because any small group of people might not even have the same set of relevant genes.
High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.
The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.
The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.
Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ also have terrible side effects, like autism, Tay Sachs and Torsion Dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventional high-IQ has a cost: increased maternal mortality (larger heads).
the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall. Most mutations are deleterious, and occur at a high rate.
There’s math, if you want it.
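The core results are compact enough to write out here (standard population-genetics notation; the mutation-selection balance line is a textbook result I'm adding for context, not something from the quoted passage):

```latex
% Genetic load: the proportional shortfall of mean population fitness
% \bar{w} relative to the reference (optimal) genotype's fitness w_{\max}
L = \frac{w_{\max} - \bar{w}}{w_{\max}}

% Mutation-selection balance: a deleterious allele arising by mutation
% at rate \mu and removed by selection of strength s persists at the
% equilibrium frequency
\hat{q} \approx \frac{\mu}{s}
```

The second line is why weak selection matters so much below: as s shrinks toward zero, the equilibrium frequency of a bad mutation balloons.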
Normally, genetic mutations are removed from the population at a rate determined by how bad they are. Really bad mutations kill you instantly, and so are never born. Slightly less bad mutations might survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, but result in having slightly fewer children than your neighbors. Over many generations, this mutation will eventually disappear.
(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)
Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:
We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.
When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118 year period, of the 13306 children who were born, 2958 died (~22%) before reaching the age of one.
Historical infant mortality rates can be difficult to calculate in part because they were so high that people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:
Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.
In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)
Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).
Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates for between 30 and 50% infant mortality rates. I recall a study on Anasazi nutrition which I sadly can’t locate right now, which found 100% malnutrition rates among adults (based on enamel hypoplasias,) and 50% infant mortality.
As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60 year old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60 year old could expect to live to about 80.
Michael A Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …
The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …
Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.
However, in Mouse Utopia, there is no predation and all the other causes of mortality (eg. Starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.
Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.
Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data.) Today, infant mortality is almost non-existent, but voluntary childlessness has now inverted this phenomenon:
Yes, the percent of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with post graduate degrees have 4 or more children, 26% of those who never graduated from high school have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.
Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.
Infant mortality claims half their children, taking the least healthy. They are left with four bright children and two moderately intelligent children. The four bright children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the moderately intelligent do okay and end up with a couple of children each.
On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.
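The thought experiment above can be run as a toy simulation. All the numbers here–family size, the rate of new mutations, the 50% mortality filter–are made up for illustration; the point is only the qualitative contrast between a filtered and an unfiltered population:

```python
import random

def next_generation(pop, kids_per_pair=4, new_load=2.0, infant_mortality=0.0):
    """Pair parents off; each child inherits the midparent load plus
    some freshly arising mutational load; then the most-loaded fraction
    of infants is removed (the infant-mortality filter)."""
    random.shuffle(pop)
    children = []
    for mom, dad in zip(pop[::2], pop[1::2]):
        for _ in range(kids_per_pair):
            children.append((mom + dad) / 2 + random.expovariate(1 / new_load))
    children.sort()
    survivors = children[: int(len(children) * (1 - infant_mortality))]
    # Cap the population at 200 by random sampling, so the no-filter
    # case gets no hidden selection from the cap itself.
    return random.sample(survivors, min(200, len(survivors)))

def mean_load_after(generations, infant_mortality):
    random.seed(0)  # fixed seed so runs are reproducible
    pop = [0.0] * 200  # start with a load-free population
    for _ in range(generations):
        pop = next_generation(pop, infant_mortality=infant_mortality)
    return sum(pop) / len(pop)

print(mean_load_after(20, 0.0))  # no filter: load ratchets up each generation
print(mean_load_after(20, 0.5))  # 50% mortality filter: load stays far lower
```

With no filter, mean load climbs by roughly the new-mutation rate every generation; culling the most-loaded half of each cohort keeps it a fraction of that. This is, of course, a cartoon of the argument, not a model of real human genetics.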
Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.
Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and 3 children by three different partners.
Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”
People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because we all make random transcription errors when copying our DNA.
I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.
Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.
If all of the above is correct, then I see only 4 ways out:
Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
Abortion of high-load fetuses
Genetic engineering to repair deleterious mutations
#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #2, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.
I recently received a few IQ-related questions. Now, IQ is not my specialty, so I do not feel particularly adequate for the task, but I’ll do my best. I recommend anyone really interested in the subject read Pumpkin Person’s blog, as he really enjoys talking about IQ all the time.
I wanted to ask if you know any IQ test on the internet that is an equivalent to the reliable tests given by psychologists?
I suppose it depends on what you want the test for. Curiosity? Diagnosis? Personally, I suspect that the average person isn’t going to learn very much from an IQ test that they didn’t already know just from living (similarly, I don’t think you’re going to discover that you’re an introvert or extrovert by taking an online quiz if you didn’t know it already from interacting with people,) but there are cases where people might want to take an IQ test, so let’s get searching.
The Wechsler Adult Intelligence Scale (WAIS) is an IQ test designed to measure intelligence and cognitive ability in adults and older adolescents. The original WAIS (Form I) was published in February 1955 by David Wechsler, as a revision of the Wechsler–Bellevue Intelligence Scale, released in 1939. It is currently in its fourth edition (WAIS-IV) released in 2008 by Pearson, and is the most widely used IQ test, for both adults and older adolescents, in the world.
Since IQ tests excite popular interest but no one really wants to pay $1,000 just to take a test, the internet is littered with “free” tests of questionable quality. For example, WeschlerTest.com offers “free sample tests,” but the bottom of the website notes that, “Disclaimer: This is not an official Wechsler test and is only for entertainment purposes. Any scores derived from it may not accurately reflect the score you would attain on an official Wechsler test.” Here is a similar website that offers free Stanford-Binet Tests.
I am not personally in a position to judge if these are any good.
It looks like the US military has put its Armed Services Vocational Aptitude Battery online, or at least a practice version. This seems like one of the best free options, because the army is a real organization that’s deeply interested in getting accurate results and the relationship between the ASVAB and other IQ tests is probably well documented. From the website:
The ASVAB is a timed test that measures your skills in a number of different areas. You complete questions that reveal your skills in paragraph comprehension, word knowledge, arithmetic reasoning and mathematics knowledge. These are basic skills that you will need as a member of the U.S. military. The score you receive on the ASVAB is factored into your Armed Forces Qualifying Test (AFQT) score. This score is used to figure out whether you qualify to enlist in the armed services. …
The ASVAB was created in 1968. By 1976, all branches of the military began using this test. In 2002, the test underwent many revisions, but its main goal of gauging a person’s basic skills remained the same. Today, there is a computerized version of the test as well as a written version. The Department of Defense developed this test and it’s taken by students in thousands of schools across the country. It is also given at Military Entrance Processing Stations (MEPS).
Naturally, each branch of the United States armed services wants to enlist the best, most qualified candidates each year. The ASVAB is a tool that helps in the achievement of that purpose. Preparing to take the ASVAB is just one more step in the journey toward your goal of joining the U.S. armed services. …
Disclaimer: The tests on this website are for entertainment purposes only, and may not accurately reflect the scores you would attain on a professionally administered ASVAB test.
Taking a page from Pumpkin Person’s book, I recommend taking several different tests and then comparing results. Use your good judgment about whether a particular test seems reliable–is it covered in ads? Does random guessing get you a score of 148? Did you get a result similar to what you’d expect based on real life experiences?
2. Besides that I wanted to ask you how much social class and IQ are correlated?
A fair amount.
With thanks to Pumpkin Person
Really dumb people are too dumb to commit as much crime as mildly dumb people
When dumb children are born to rich people, they tend to do badly in life and don’t make much money; they subsequently sink in social status. When smart children are born to poor people, they tend to do well in life and rise in social status. Even in societies with strict social classes where moving from class to class is effectively impossible, we should still expect that really dumb people born into wealth will squander it, leading to their impoverishment. Likewise, among the lower classes, we would still expect that smarter low-class people would do better in life than dumber ones.
This is all somewhat built into the entire definition of “IQ” and what people were trying to measure when they created the tests.
3. Basically do traditional upper classes form separate genetic clusters like Gregory Clark claims?
I haven’t read Clark’s book, but I’m sure the pathetic amount of research I can do here would be nothing compared to what he’s amassed.
A similar pattern of spousal association for IQ scores and personality traits was found in two British samples from Oxford and Cambridge. There was no indirect evidence from either sample to suggest that convergence occurred during marriage. All observed assortative mating might well be due to initial assortment.
This article reviews the literature on assortative mating for psychological traits and psychiatric illness. Assortative mating appears to exist for personality traits, but to a lesser degree than that observed for physical traits, sociodemographic traits, intelligence, and attitudes and values. Concordance between spouses for psychiatric illness has also been consistently reported in numerous studies. This article examines alternative explanations for such observed concordance and discusses the effects of assortative mating on population genetics and the social environment.
In the Minnesota Twin Family Study, assortative mating for IQ was greater than .3 in both the 11- and 17-year-old cohorts. Recognizing this, genetic variance in IQ independent of SES was greater with higher parental SES in the 11-year-old cohort. This was not true, however, in the 17-year-old cohort. In both cohorts, people of higher IQ were more likely to have ‘married down’ for IQ than people of lower IQ were to have ‘married up’. This assortative mating pattern would create greater genetic diversity for IQ in people of higher IQ than in people of lower IQ. As IQ is associated with SES, the pattern could be one reason for the observation of greater genetic variance in IQ independent of SES with greater parental SES in several samples. If so, it could block upward social mobility among those already in lower-SES groups. I discuss possible involved mechanisms and social implications.
Assortative mating is the individuals’ tendency to mate with those who are similar to them in some variables, at a higher rate than would be expected from random. This study aims to provide empirical evidence of assortative mating through the Big Five model of personality and two measures of intelligence using Spanish samples. The sample consisted of 244 Spanish couples. It was divided into two groups according to relationship time. The effect of age, educational level and socioeconomic status was controlled. The results showed strong assortative mating for intelligence and moderate for personality. The strongest correlations for Personality were found in Openness, Agreeableness and Conscientiousness.
The role of personal preference as an active process in mate selection is contrasted with the more passive results of limitations of available mates due to social, educational, and geographical propinquity. The role of personal preference estimated after removing the effects of variables representing propinquity was still significant for IQ and Eysenck’s extraversion-introversion and inconsistency (lie) scales, even though small.
Some argue that the high heritability of IQ renders purely environmental explanations for large IQ differences between groups implausible. Yet, large environmentally induced IQ gains between generations suggest an important role for environment in shaping IQ. The authors present a formal model of the process determining IQ in which people’s IQs are affected by both environment and genes, but in which their environments are matched to their IQs. The authors show how such a model allows very large effects for environment, even incorporating the highest estimates of heritability. Besides resolving the paradox, the authors show that the model can account for a number of other phenomena, some of which are anomalous when viewed from the standard perspective.
4. Are upper class people genetically more intelligent? Or is there an effect of regression to the mean and all classes have about equal chances to spawn high IQ people?”
…James Lee, a real expert in the field, sent me a current best estimate for the probability distribution of offspring IQ as a function of parental midpoint (average between the parents’ IQs). James is finishing his Ph.D. at Harvard under Steve Pinker — you might have seen his review of R. Nisbett’s book Intelligence and How to Get It: Why Schools and Cultures Count.
The results are stated further below. Once you plug in the numbers, you get (roughly) the following:
Assuming parental midpoint of n SD above the population average, the kids’ IQ will be normally distributed about a mean which is around +.6n with residual SD of about 12 points. (The .6 could actually be anywhere in the range (.5, .7), but the SD doesn’t vary much from choice of empirical inputs.)…
Read Hsu’s post for the rest of the details.
In short, while regression to the mean works for everyone, different people regress to different means depending on how smart their particular ancestors were. For example, if two people of IQ 100 have a kid with an IQ of 140, (Kid A) and two people of IQ 120 have a kid of IQ 140, (Kid B), Kid A’s own kids are likely to regress toward 100, while Kid B’s kids are likely to regress toward 120.
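Hsu’s rule of thumb is simple enough to compute directly. Here is a minimal sketch in Python–the function name and constants are mine, plugging in the quoted 0.6 regression coefficient (and the 12-point residual SD) on the usual mean-100, SD-15 scale:

```python
# Expected offspring IQ from the parental midpoint, using the rough
# numbers quoted above: the mean regresses to 0.6 of the parents'
# deviation from the population average, with ~12 points residual SD.

POP_MEAN = 100
POP_SD = 15
REGRESSION = 0.6   # quoted plausible range: 0.5-0.7
RESIDUAL_SD = 12   # spread of kids' IQs around the expected mean

def expected_child_iq(parental_midpoint):
    """Center of the (roughly normal) distribution of offspring IQ."""
    n = (parental_midpoint - POP_MEAN) / POP_SD  # parents' deviation in SDs
    return POP_MEAN + REGRESSION * n * POP_SD

# Parents averaging IQ 130 (+2 SD): children center on ~118.
print(expected_child_iq(130))  # -> 118.0
# Parents at the population mean: no expected change.
print(expected_child_iq(100))  # -> 100.0
```

Note that the residual SD of 12 means plenty of individual kids will land well above or below that expected mean–the formula gives the center of the distribution, not a prediction for any one child.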
We can look at the effects of parental SES on SAT Scores and the like:
Personally, I know plenty of extremely intelligent people who come from low-SES backgrounds, but few of them ended up low-SES. Overall, I’d expect highly intelligent people to move up in status and less intelligent people to move down over time, with the upper class thus sort of “collecting” high-IQ people, but there are obviously regional and cultural effects that may make it inappropriate to compare across groups.
Apropos Friday’s conversation about the transition from hunting to pastoralism and the different strategies hunters employ in different environments, I got to thinking about how these different food-production systems could influence the development of different “intelligences,” or at least mental processes that underlie intelligence.
Ingold explains that in warm climes, hunter-gatherers have many food resources they can exploit, and if one resource starts running low, they can fairly easily switch to another. If there aren’t enough yams around, you can eat melons; if not enough melons, squirrels; if no squirrels, eggs. I recall a study of Australian Aborigines who agreed to go back to hunter-gathering for a while after living in town for several decades. Among other things (like increased health,) scientists noted that the Aborigines increased the number of different kinds of foods they consumed from, IIRC, about 40 per week to 100.
By contrast, hunters in the arctic are highly dependent on exploiting only a few resources–fish, seals, reindeer, and perhaps a few polar bears and foxes. Ingold claims that there are (were) tribes that depended largely on only a few major hunts of migrating animals (netting hundreds of kills) to supply themselves for the whole year.
If those migrating animals change their course by even a few miles, it’s easy to see how the hunters could miss the herds entirely and, with no other major species around to exploit, starve over the winter.
Let’s consider temperate agriculture as well: the agriculturalist can store food better than the arctic hunter (seal meat does not do good things in the summer,) but lacks the tropical hunter-gatherer’s flexibility; he must stick to his fields and keep working, day in and day out, for a good nine months in a row. Agricultural work is more flexible than assembly line work, where your every minute is dictated by the needs of the factory, but a farmer can’t just wander away from his crops to go hunt for a month just because he feels like it, nor can he hope to make up for a bad wheat harvest by wandering into his neighbor’s fields and picking their potatoes.
Which got me thinking: clearly different people are going to do better at different systems.
But first, what is intelligence? Obviously we could define it in a variety of ways, but let’s stick to reasonable definitions, eg, the ability to use your brain to achieve success, or the ability to get good grades on your report card.
A variety of mental traits contribute to “intelligence,” such as:
The ability to learn lots of information. Information is really useful, both in life and on tests, and smarter brains tend to be better at storing lots and lots of data.
Flexible thinking. This is the ability to draw connections between different things you’ve learned, to be creative, to think up new ideas, etc.
Some form of Drive, Self Will, or long-term planning–that is, the ability to plan for your future and then push yourself to accomplish your goals. (These might more properly be two different traits, but we’ll keep them together for now.)
Your stereotypical autistic, capable of memorizing large quantities of data but not doing much with them, has trait #1 but not 2 or 3.
Artists and musicians tend to have a lot of trait #2, but not necessarily 1 or 3 (though successful artists obviously have a ton of #3)
And an average kid who’s not that bright but works really hard, puts in extra hours of effort on their homework, does extra credit assignments, etc., has a surfeit of #3 but not much 2 or 1.
Anyway, it seems to me like the tropical hunting/gathering environment, with many different species to exploit, would select for flexible thinking–if one food isn’t working out, look for a different one. This may also apply to people from tropical farming/horticulturalist societies.
By contrast, temperate farming seems more likely to select for planning–you can’t just wander off or try to grow something new in time for winter if your first crop doesn’t work out.
Many people have noted that America’s traditionally tropical population (African Americans) seems to be particularly good at flexible thinking, leading to much innovation in arts and music. They are not as talented, though, at Drive, leading to particularly high high school dropout rates.
America’s traditionally rice-farming population (Asians,) by contrast, has been noted for over a century for its particularly high drive and ability to plan for the future, but not so much for contributions to the arts. East Asian people are noted for their particularly high IQ/SAT/PISA scores, despite the fact that China lags behind the West in GDP and quality of life terms. (Japan, of course, is a fully developed country.) One potential explanation for this is that the Chinese, while very good at working extremely hard, aren’t as good at flexible thinking that would help spur innovation. (I note that the Japanese seem to do just fine at flexible thinking, but you know, the Japanese aren’t Chinese and Japan isn’t China.)
(I know I’m not really stating anything novel.) But the real question is:
What kind of mental traits might pastoralism, arctic pastoralism, or arctic hunting select for?
It’s been a slow week for comments, probably because everyone is still passed out/out of town/tired/sick/busy from all of the holiday revelry. Some of you are still celebrating. Still, I invite you all to come in, take a seat by the fire, pick up a warm mug of cocoa, and enjoy yourselves with some relaxing chat and mingle.
But the stone tools on Naxos appeared to be hewn by Paleolithic people — much more ancient humans, perhaps not members of our species at all.
Since 2013, Carter has co-directed a new round of investigations on Naxos. He and a handful of others working in the region have begun to furnish evidence that humans reached the islands of the Aegean Sea 250,000 years ago and maybe earlier. If those dates are confirmed, it means the first people there were Neanderthals, their probable ancestors, Homo heidelbergensis or maybe even Homo erectus. …
Other researchers insist that much better evidence needs to be discovered to attribute such complex behaviours to Neanderthals and other hominins …
Then, in 1988, archeologists began excavating a collapsed rock shelter on the southern shore of Cyprus. They found about 1,000 bladelets and small tools typically associated with pre-Neolithic people.
“There was a lot of skepticism at first,” said Alan Simmons, an anthropologist at the University of Nevada Las Vegas who was involved in the work. “But once we had all the radiocarbon dates, it came to be accepted.”
The site pushed the peopling of Cyprus back to 12,000 years ago — only a few millennia, but enough to break the Neolithic barrier and establish the presence of hunter-gatherers. Today, the distance to mainland Turkey is about 75 kilometres. Sea levels have fluctuated and the crossing was once shorter, but Cyprus has always been an island.
The discoveries on Cyprus overturned the idea that hunter-gatherers were incapable or unwilling to travel by sea. But the debate was still confined to the activities of our species, Homo sapiens.
In 2008, a Greek-American team of archeologists began searching on the southwest coast of Crete for pre-Neolithic artifacts. They found many from roughly the same era as those on Cyprus. But they also found rough quartz hand axes and cleavers that appeared to be much more ancient.
The team discovered artifacts eroding out of a layer of soil that dated to at least 130,000 years ago, and the tools themselves looked like those archeologists associate with archaic hominin sites on the mainland — ones that are at least 250,000 years old. …
“only about one-quarter of one percent (0.25 percent) of all whites will be violently victimized by a black person this year”
This would mean it’s 2.5% every 10 years. A typical white American lives 80 years, so their lifelong chance of getting attacked by a black is 20%(!!!)–exactly the same number they argue is a woman’s lifelong chance of being raped. The same Tim Wise made a big deal of how high that is. Of course he takes annual numbers for other crimes and lifelong numbers for rape.
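For what it’s worth, the commenter’s arithmetic roughly checks out. A quick sketch of my own calculation–the naive method just multiplies the annual risk by 80 years, while treating each year as an independent trial gives a slightly lower figure:

```python
# Convert a 0.25% annual risk into an approximate lifetime risk
# over an 80-year lifespan.

annual_risk = 0.0025
years = 80

# Naive method: multiply the annual risk by the number of years.
naive_lifetime = annual_risk * years  # 0.20, i.e. 20%

# Treating each year as an independent trial (you can't be a
# "first-time victim" twice), the figure comes out slightly lower:
compounded_lifetime = 1 - (1 - annual_risk) ** years

print(f"naive:      {naive_lifetime:.1%}")       # -> 20.0%
print(f"compounded: {compounded_lifetime:.1%}")  # -> 18.1%
```

Either way, a sub-1% annual figure compounds into a double-digit lifetime figure, which is the commenter’s point about mixing annual and lifetime numbers.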
Kanazawa (2014) reviewed the research on the link between obesity and IQ. What he found was that the studies concluding that obesity causes lowered intelligence were all cross-sectional. Longitudinal studies that looked into the link between obesity and intelligence found that those who had low IQs since childhood became obese later in life, and that obesity does not lead to low IQ. … He states that those with IQs below 74 gained 5.19 BMI points, whereas those with IQs above 126 gained 3.73 BMI points in 22 years, which is a statistically significant difference. Also noted was that those at age 7 who had IQs above 125 had a 13.5 percent chance of being obese at age 51, whereas those with IQs below 74 at age 7 had a 31.9 percent chance of being obese.
Thanks everyone, and keep up the good work/great comments!
To summarize, our current generous welfare system is making it increasingly difficult for hard-working members of society to afford to have children. Lazy and incapable people, meanwhile, are continuing to have children without restriction, courtesy of those hard-working people. It’s more than likely that average intelligence is falling as a result of these pressures.
Ever since someone proposed the idea of eugenic (ie, good) breeding, people have been concerned by the possibility of dysgenic (bad) breeding. If traits are heritable (as, indeed, they are,) then you can breed for more of that trait or less of that trait. Anyone who has ever raised livestock or puppies knows as much–the past 10,000 years of animal husbandry have been devoted to producing superior stock, long before anyone knew anything about “genes.”
Historically–that is, before 1900–the world was harsh and survival far from guaranteed. Infant and childhood mortality were high, women often died in childbirth, famines were frequent, land (in Europe) was scarce, and warfare + polygamy probably prevented the majority of men from ever reproducing. In those days, at least in Western Europe, the upper classes tended to have more (surviving) children than the lower classes, leading to a gradual replacement of the lower classes.
The situation today is, obviously, radically different. Diseases–genetic or pathogenic–kill far fewer people. We can cure Bubonic Plague with penicillin, have wiped out Smallpox, and can perform heart surgery on newborns whose hearts were improperly formed. Welfare prevents people from starving in the streets and the post-WWII prosperity led to an unprecedented percent of men marrying and raising families. (The percent of women who married and raised families probably didn’t change that much.)
All of these pleasant events raise concerns that, long-term, prosperity could result in the survival of people whose immune systems are weak, carry rare but debilitating genetic mutations, or are just plain dumb.
So how is Western fertility? Are the dumb outbreeding the smart, or should we be grateful that the “gender studies” sorts are selecting themselves out of the population? And with negative fertility rates + unprecedented levels of immigration, how smart are our immigrants (and their children?)
Data on these questions is not the easiest to find. Jayman has data on African American fertility (dysgenic,) but white American fertility may be currently eugenic (after several decades of dysgenics.) Jayman also notes a peculiar gender difference in these trends: female fertility is strongly dysgenic, while male is eugenic (for both whites and blacks). Given that historically, about 80% of women reproduced vs. only 40% of males, I think it likely that this pattern has always been true: women only want to marry intelligent, high-performing males, while males are okay with marrying dumb women. (Note: the female ability to detect intelligence may be broken by modern society.)
Counter-Currents has a review of Lynn’s Dysgenics with some less hopeful statistics, like an estimation that Greece lost 5 IQ points during the Baby Boom, which would account for their current economic woes. (Overall, I think the Baby Boom had some definite negative effects on the gene pool that are now working their way out.)
Richwine estimates the IQ of our immigrant Hispanic-American population at 89.2, with a slight increase for second and third-generation kids raised here. Since the average American IQ is 98 and Hispanics are our fastest-growing ethnic group, this is strongly dysgenic. (The rest of our immigrants, from countries like China, are likely to be higher-IQ than Americans.) However, since Hispanic labor is typically used to avoid African American (reported 85 average IQ) labor, the replacement of African Americans with Mexicans is locally eugenic–hence the demand for Hispanic labor.
Without better data, none of this conclusively proves whether fertility in the West is currently eugenic or dysgenic, but I can propose three main factors that should be watched for their potentially negative effects:
Welfare–I suspect the greater black reliance on welfare may be driving black dysgenics, but some other factor like crime could actually be at play.
I’m going to focus on the last one because it’s the only one that hasn’t already been explained in great detail elsewhere.
For American women, childbearing is low-class and isolating.
For all our fancy talk about maternity leave, supporting working moms, etc., America is not a child-friendly place. Society frowns on loud, rambunctious children running around in public, and don’t get me started on how public schools deal with boys. Just try to find something entertaining for both kids and grown-ups that doesn’t cost an arm and a leg for larger families–admission to the local zoo for my family costs over $50 and requires over an hour, round trip, of driving. (And it isn’t even a very good zoo.) Now try to find an activity your childless friends would also like to do with you.
Young women are constantly told that getting pregnant will ruin their lives (most vocally by their own parents,) and that if they want to stay home and raise children, they are social parasites. (Yes, literally.) We see child-rearing, like tomato picking, as a task best performed by low-wage immigrant daycare workers.
I am reminded here of a mom’s essay I read about the difference in attitudes toward children in the US and Israel, the only Western nation with a positive native fertility rate. Israel, as she put it, is a place where children are valued and “kids can be kids.” I’ve never been to Israel, so I’ll just have to trust her:
How Israelis love kids, anyone’s kids. The country is a free-for-all for the youngest set, something I truly appreciated only once I started bringing my own children there. When I was a teenager visiting Israel from the States, I noticed how people there just don’t allow a child to cry. One pout, one sob, and out comes candy, trinkets and eager smiles to turn a kid around. That would never happen back home—a stranger give a child candy?!—but in Israel, in a nation that still harbors a post-Holocaust mentality, there is no reason that a Jewish child should ever cry again, if someone can help it.
Incidentally, if you qualify under Israeli health care law, you can get a free, state-funded abortion. Abortion doesn’t appear to have destroyed Israel’s fertility.
Since male fertility is (probably) already eugenic, then the obvious place to focus is female fertility: make your country a place where children are actively valued and intelligent women are encouraged instead of insulted for wanting them, and–hopefully–things can improve.
I do not believe that IQ tests measure intelligence. Rather I believe that they measure a combination of intelligence, learning and concentration at a particular point in time. …
You may wish to read the whole thing there.
The short response is that I basically agree with the bit quoted, and I suspect that virtually everyone who takes IQ tests seriously does as well. We all know that if you come into an IQ test hungover, sick, and desperately needing to pee, you’ll do worse than if you’re well-rested, well-fed, and feeling fine.
That time I fell asleep during finals?
Not so good.
Folks who study IQ for a living, like the famous Flynn, believe that environmental effects like the elimination of leaded gasoline and general improvements in nutrition have raised average IQ scores over the past century or two. (Which I agree seems pretty likely.)
The ability to sit still and concentrate is especially variable in small children–little boys are especially notorious for preferring to run and play instead of sit at a desk and solve problems. And while real IQ tests (as opposed to the SAT) have been designed not to hinge on whether or not a student has learned a particular word or fact, the effects of environmental “enrichment” such as better schools or high-IQ adoptive parents do show up in children’s test scores–but fade away as children grow up.
There’s a very sensible reason for this. I am reminded here of an experiment I read about some years ago: infants (probably about one year old) were divided into two groups, and one group was taught how to climb the stairs. Six months later, the special-instruction group was still better at stair-climbing than the no-instruction group. But two years later, both groups of children were equally skilled at stair-climbing.
There is only so good anyone will ever get at stair-climbing, after all, and after two years of practice, everyone is about equally talented.
The sensible conclusion is that we should never evaluate an entire person based on just one IQ test result (especially in childhood.)
The mistake some people (not Chauncey Tinker) make is to jump from “IQ tests are not 100% reliable” to “IQ tests are meaningless.” Life is complicated, and people like to sort it into neat little packages. Friend or foe, right or wrong. And while a single IQ test is insufficient to judge an entire person, the results of multiple IQ tests are fairly reliable–and if we aggregate our results over multiple people, we get even better results.
As with all data, more tests + more people => random incorrect data matters less.
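A toy simulation illustrates the point. This is purely illustrative–the “true score” and per-sitting noise figures below are made up, not taken from any real test:

```python
import random

random.seed(0)  # reproducible illustration

TRUE_IQ = 110      # hypothetical test-taker's "true" score
TEST_NOISE_SD = 8  # made-up measurement error per sitting

def take_test():
    """One noisy sitting: the true score plus random error."""
    return random.gauss(TRUE_IQ, TEST_NOISE_SD)

one_sitting = take_test()
ten_sitting_avg = sum(take_test() for _ in range(10)) / 10

# A single sitting can miss the true score by quite a bit;
# the average of ten sittings usually lands much closer.
print(round(one_sitting))
print(round(ten_sitting_avg))
```

Averaging n sittings shrinks the random error by a factor of roughly the square root of n, which is why multiple tests (and multiple people) give much more reliable results than any single score.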
I think the “IQ tests are meaningless” crowd is operating under the assumption that IQ scholars are actually dumb enough to blindly judge an entire person based on a single childhood test. (Dealing with this strawman becomes endlessly annoying.)
Like all data, the more the merrier:
So this complicated looking graph shows us the effects of different factors on IQ scores over time, using several different data sets (mostly twins studies.)
At 5 years old, “genetic” factors (the diamond and thick lines) are less important than “shared environment.” Shared environment=parenting and teachers.
That is, at the age of 5, a pair of identical twins who were adopted by two different families will have IQ scores that look more like their adoptive parents’ IQ scores than their genetic relatives’ IQ scores. Like the babies taught to climb stairs before their peers, the kids whose parents have been working hard to teach them their ABCs score better than kids whose parents haven’t.
By the age of 7, however, this parenting effect has become less important than genetics. This means that those adopted kids are now starting to have IQ scores more similar to their biological relatives than to their adoptive relatives. Like the kids from the stair-climbing experiment, their scores are now more based on their genetic abilities (some kids have better balance and coordination, resulting in better stair-climbing) than on whatever their parents are doing with them.
By the age of 12, the effects of parenting drop to around 0. At this point, it’s all up to the kid.
Of course, adoption studies are not perfect–adoptive parents are not randomly selected and have to go through various hoops to prove that they will be decent parents, and so tend not to be the kinds of people who lock their children in closets or refuse to feed them. I am sure this kind of parenting does terrible things to IQ, but there is no ethical way to design a randomized study to test them. Thankfully, the % of children subject to such abysmal parenting is very low. Within the normal range of parenting practices, parenting doesn’t appear to have much (if any) effect on adult IQ.
The point of all this is that what I think Chauncey means by “learning,” that is, advantages some students have over others because they’ve learned a particular fact or method before the others do, does appear to have an effect on childhood IQ scores, but this effect fades with age.
I think Pumpkin Person is fond of saying that life is the ultimate IQ test.
While we can probably all attest to a friend who is “smart but lazy,” or smart but interested in a field that doesn’t pay very well, like art or parenting, the correlation between IQ and life outcomes (eg, money) is amazingly solid:
The correlation even holds internationally:
Map of IQ by country. Source: Wikipedia.
There’s a simple reason why this correlation holds despite lazy and non-money-oriented smart people: there are also lazy and non-money-oriented dumb people, and lazy smart people tend to make more money and make better long-term financial decisions than lazy dumb people.
Note that none of these graphs are the result of a single test. A single test would, indeed, be useless.
More than 13 million pain-blocking epidural procedures are performed every year in the United States. Although epidurals are generally regarded as safe, there are complications in up to 10 percent of cases, in which the needles are inserted too far or placed in the wrong tissue.
A team of researchers from MIT and Massachusetts General Hospital hopes to improve those numbers with a new sensor that can be embedded into an epidural needle, helping anesthesia doctors guide the needle to the correct location.
Since inserting a giant needle into your spine is really freaky, but going through natural childbirth is hideously painful, I strongly support this kind of research.
More than half of Americans under the age of 25 who have a bachelor’s degree are either unemployed or underemployed. According to The Christian Science Monitor, nearly 1 percent of bartenders and 14 percent of parking lot attendants have a bachelor’s degree.
Adding additional degrees is no guarantee of employment either. According to a recent Urban Institute report, nearly 300,000 Americans with master’s degrees and over 30,000 with doctorates are on public relief. …
Unless you have a “hard” skill, such as a mastery of accounting, or a vocational certificate (e.g., in teaching), your liberal arts education generally will not equip you with the skill set that an employer will need.
Obviously colleges still do some good things. Much of the research I cite here on this blog originated at a college of some sort. And of course, if you are careful and forward-thinking, you can use college to obtain useful skills/information.
But between the years, money, and effort students spend, not to mention the absurd political indoctrination, college is probably a net negative for most students.
A few doctors in the 1400s probably saved the lives of their patients, but far more killed them.