Book Club: The Code Economy ch. 1

Greetings! Grab a cup of coffee and pull up a chair. Tea is also good. Today we’re diving into chapter one of Philip Auerswald’s The Code Economy, “Jobs: Divide and Coordinate.”

I wish this chapter had been much longer; we speed through almost 2.5 million years of cognitive evolution in a couple of pages.

The earliest hominins had about the same short-term memory as a modern-day chimpanzee, which is to say they could keep track of only two operations at a time. … Our methods for creating tools gradually became more sophisticated, until we were using the tools we created to produce other tools in a repetitive and predictable manner. These processes for creating stone tools were among humanity’s first production algorithms–that is, the earliest code. They appeared almost simultaneously in human communities in most parts of the world around 40,000 BC.

Footnote:

…[E.O.] Wilson refers to this phenomenon more broadly as the discovery of eusocial behavior… Wilson situates the date far earlier in human history than I do here. I chose 50,000 years [ago] because my focus is on the economy. It is clear that an epochal change in society occurred roughly 10,000 BCE, when humans invented agriculture in six parts of the world simultaneously. The fact of this simultaneity directly suggests that the advance of code represented by the invention of agriculture was part of a forward movement of code that started much earlier.

What do you think? Does the simultaneous advent of behavioral modernity–or eusociality–in far-flung human groups roughly 50,000 years ago, followed by the simultaneous advent of agriculture in several far-flung groups about 10,000 years ago, speak to the existence of some universal, underlying process? Why did so many different groups of people develop similar patterns of life and technology around the same time, despite some of them being highly isolated? Was society simply inevitable?

The caption on the photo is similarly interesting:

Demand on Short-Term Working Memory in the Production of an Obsidian Axe [from Read and van der Leeuw, 2015] … We can relate the concepts invoked in the production of stone tools to the number of dimensions involved and thereby to the size of short-term working memory (STWM) required for the production of the kind of stone tools that exemplify each stage in hominin evolution. …

Just hitting the end of a pebble once to create a single edge, as in the simplest tools, requires (they calculate) holding three items in working memory. Removing several flakes to create a longer edge (a line) takes STWM 4; working an entire side takes STWM 5; and working both sides of the stone in preparation for knapping flakes from the third requires both an ability to think about the pebble’s shape in three dimensions and STWM 7.

(The Wikipedia article on Lithic Reduction has a lovely animation of the technique.)

It took about 2 million years to proceed from the simplest tools (working memory: 3) to the most complex (working memory: 7). Since the Neolithic, our working memory hasn’t improved–most of us are still limited to a mere 7 items in our working memory, just enough to remember a phone number if you already know the area code.

All of our advances since the Neolithic, Auerswald argues, have been due not to an increase in STWM, but to our ability to build complexity externally: through code. And it was this invention of code that really made society take off.

By about 10,000 BCE, humans had formed the first villages… Villages were the precursors of modern-day business firms in that they were durable associations built around routines. … the advance of code at the village level through the creation of new technological combinations set into motion the evolution from simplicity to complexity that has resulted in the modern economy.

It was in the village, then, that code began to evolve.

What do you think? Are Read and van der Leeuw just retroactively fitting numbers 3-7 to the tools, or do they really show an advance in working memory? Is the village really the source of most code evolution? And who do you think is more correct, Herbert Spencer or Thomas Malthus?

Auerswald then jumps forward to 1557, with the first use of the word “job” (spelled “jobbe,” most likely from “gobbe,” or lump).

The advent of the “jobbe” as a lump of work was to the evolution of modern society something like what the first single-celled organism was to the evolution of life.


The “jobbe” contrasted with the obligation to perform labor continuously and without clearly defined roles–slavery, serfdom, indentured servitude, or even apprenticeship–as had been the norm throughout human history.

Did the Black Death help create the modern “job market” by inspiring Parliament to pass the Statute of Laborers?

I am reminded here of a passage from Gulick’s Evolution of the Japanese, Social and Psychic (published in 1903):

The idea of making a bargain when two persons entered upon some particular piece of work, the one as employer, the other as employed, was entirely repugnant to the older generation, since it was assumed that their relations as inferior and superior should determine their financial relations; the superior would do what was right, and the inferior should accept what the superior might give without a question or a murmur. Among the samurai, where the arrangement is between equals, bargaining or making fixed and fast terms which will hold to the end, and which may be carried to the courts in case of differences, was a thing practically unknown in the older civilization. Everything of a business nature was left to honor, and was carried on in mutual confidence.

“A few illustrations of this spirit of confidence from my own experience may not be without interest. On first coming to Japan, I found it usual for a Japanese who wished to take a jinrikisha to call the runner and take the ride without making any bargain, giving him at the end what seemed right. And the men generally accepted the payment without question. I have found that recently, unless there is some definite understanding arrived at before the ride, there is apt to be some disagreement, the runner presuming on the hold he has, by virtue of work done, to get more than is customary. This is especially true in case the rider is a foreigner. Another set of examples in which astonishing simplicity and confidence were manifested was in the employment of evangelists. I have known several instances in which a full correspondence with an evangelist with regard to his employment was carried on, and the settlement finally concluded, and the man set to work without a word said about money matters. It need hardly be said that no foreigner took part in that correspondence. …

“This confidence and trustfulness were the product of a civilization resting on communalistic feudalism; the people were kept as children in dependence on their feudal lord; they had to accept what he said and did; they were accustomed to that order of things from the beginning and had no other thought; on the whole too, without doubt, they received regular and kindly treatment. Furthermore, there was no redress for the peasant in case of harshness; it was always the wise policy, therefore, for him to accept whatever was given without even the appearance of dissatisfaction. This spirit was connected with the dominance of the military class. Simple trustfulness was, therefore, chiefly that of the non-military classes.

“Since the overthrow of communal feudalism and the establishment of an individualistic social order, necessitating personal ownership of property, and the universal use of money, trustful confidence is rapidly passing away.

We still identify ourselves with our profession–“I am a doctor” or “I am a paleontologist”–but much less so than in the days when “Smith” wasn’t a name.

Auerswald progresses to the modern day:

In the past two hundred years, the complexity of human economic organization has increased by orders of magnitude. Death rates began to fall rapidly in the middle of the nineteenth century, due to a combination of increased agricultural output, improved hygiene, and the beginning of better medical practices–all different dimensions of the advance of code…. Greater numbers of people living in greater density than ever before accelerated the advance of code.

Sounds great, but:

By the twentieth century, the continued advance of code necessitated the creation of government bureaucracies and large corporations that employed vast numbers of people. These organizations executed code of sufficient complexity that it was beyond the capacity of any single individual to master.

I’ve often wondered if the explosion of communist disasters at the beginning of the 20th century occurred because we could imagine a kind of nation-wide code for production and consumption and we had the power to implement it, but we didn’t actually have the capabilities and tools necessary to make it work.

We can imagine Utopia, but we cannot reach it.

Auerswald delineates two broad categories of “epochal change” as a result of the code-explosion of the past two centuries: First, our capabilities grew. Second:

“we have, to an increasing degree, ceded to other people–and to code itself–authority and autonomy, which for millennia we had kept unto ourselves and our immediate tribal groups as uncodified cultural norms.”

Before the “job,” before even the “trade,” people lived and worked far more at their own discretion. Hoeing fields or gathering yams might be long and tedious work, but at least you didn’t have to pee in a bottle because Amazon didn’t give you time for bathroom breaks.

Every time voters demand that politicians “bring back the jobs” or politicians promise to create them, we are implicitly stating that the vast majority of people are no longer capable of making their own jobs. (At least, not jobs that afford a modern lifestyle.) The Appalachians lived in utter poverty (the vast majority of people before 1900 lived in what we would now call utter poverty), but they did not depend on anyone else to create “jobs” for them; they cleared their own land, planted their own corn, hunted their own hogs, and provided for their own needs.

Today’s humans are probably no less intelligent or innately capable than the average Appalachian of 1900, but the economy (and our standards of living) are much more complex. The average person no longer has the capacity to drive job growth in such a complicated system, but the solution isn’t necessarily for everyone to become smarter. After all, large, complicated organizations need hundreds of employees who are not out founding their own companies.

But this, in turn, means all of those employees–and even the companies themselves–are dependent on forces far outside their control, like Chinese monetary policy or the American electoral cycle. And this, in turn, raises demand for some kind of centralized, planned system to protect the workers from economic hardship and ensure that everyone enjoys a minimum standard of living.

Microstates suggest themselves as a way to speed the evolution of economic code by increasing the total number of organisms in the ecosystem.

With eusociality, man already became a political (that is, polis) animal around 10,000 or 40,000 or perhaps 100,000 years ago, largely unable to subsist on his own, absent the tribe. We do not seem to regret this ill-remembered transition very much, but what about the current one? Is the job-man somehow less human, less complete than the tradesman? Do we feel that something essential to the human spirit has been lost in defining and routinizing our daily tasks down to the minute, forcing men to bend to the timetables of factories and international corporations? Or have we, through the benefits of civilization (mostly health improvements), gained something far richer?


When did language evolve?

The smartest non-human primates, like Kanzi the bonobo and Koko the gorilla, understand about 2,000 to 4,000 words. Koko can make about 1,000 signs in sign language and Kanzi can use about 450 lexigrams (pictures that stand for words). Koko can also make some onomatopoetic words–that is, she can make and use imitative sounds in conversation.

A four-year-old human knows about 4,000 words, similar to an exceptional gorilla. An adult knows about 20,000-35,000 words. (Another study puts the upper bound at 42,000.)

Somewhere along our journey from ape-like hominins to Homo sapiens sapiens, our ancestors began talking, but exactly when remains a mystery. The origins of writing have been amusingly easy to discover, because early writers were fond of very durable surfaces, like clay, stone, and bone. Speech, by contrast, evaporates as soon as it is heard–leaving no trace for archaeologists to uncover.

But we can find the things necessary for speech and the things for which speech, in turn, is necessary.

The main reason why chimps and gorillas, even those taught human language, must rely on lexigrams or gestures to communicate is that their voiceboxes, lungs, and throats work differently than ours. Their semi-arboreal lifestyle requires using the ribs as a rigid base for the arm and shoulder muscles while climbing, which in turn requires closing the lungs while climbing to provide support for the ribs.

Full bipedalism released our early ancestors from the constraints on airway design imposed by climbing, freeing us to make a wider variety of vocalizations.

Now is the perfect time to break out my file of relevant human evolution illustrations:

Source: Scientific American, What Makes Humans Special

We humans split from our nearest living ape relatives about 7-8 million years ago, but true bipedalism may not have evolved for a few more million years. Since there are many different named hominins, here is a quick guide:

Source: Macroevolution in and Around the Hominin Clade

Australopithecines (light blue in the graph), such as the famous Lucy, are believed to have been the first fully bipedal hominins, although, based on the shape of their toes, they may have still occasionally retreated into the trees. They lived between 4 and 2 million years ago.

Without delving into the myriad classification debates along the lines of “should we count this set of skulls as a separate species or are they all part of the natural variation within one species,” by the time the genus Homo arises with H. habilis or H. rudolfensis around 2.8 million years ago, humans were much worse at climbing trees.

Interestingly, one direction humans have continued evolving in is up.

Oldowan tool

The reliable production of stone tools represents an enormous leap forward in human cognition. The first known stone tools–Oldowan–are about 2.5-2.6 million years old and were probably made by Homo habilis. These simple tools are typically shaped on only one side.

By the Acheulean–1.75 million to 100,000 years ago–tool making had become much more sophisticated. Not only did knappers shape both sides of their stones, tops and bottoms alike, but they also made tools by first shaping a core stone and then flaking derivative pieces from it.

The first Acheulean tools were fashioned by H. erectus; by 100,000 years ago, H. sapiens had presumably taken over the technology.

Flint knapping is surprisingly difficult, as many an archaeology student has discovered.

These technological advances were accompanied by steadily increasing brain sizes.

I propose that the complexities of the Acheulean tool complex required some form of language to facilitate learning and teaching; this gives us a potential lower bound on language around 1.75 million years ago. Bipedalism gives us an upper bound around 4 million years ago, before which our voice boxes were likely more restricted in the sounds they could make.

A Different View

Even though Homo sapiens has been around for about 300,000 years (or rather, that is where we have chosen to draw the line between our species and the previous one), “behavioral modernity” only emerged around 50,000 years ago (very awkward timing if you know anything about human dispersal).

Everything about behavioral modernity is heavily contested (including when it began), but no matter how and when you date it, compared to the million years or so it took humans to figure out how to knap the back side of a rock, human technological advance has accelerated significantly over the past 100,000 years, even more so over the past 50,000, and still more over the past 10,000.

Fire was another of humanity’s early technologies:

Claims for the earliest definitive evidence of control of fire by a member of Homo range from 1.7 to 0.2 million years ago (Mya).[1] Evidence for the controlled use of fire by Homo erectus, beginning some 600,000 years ago, has wide scholarly support.[2][3] Flint blades burned in fires roughly 300,000 years ago were found near fossils of early but not entirely modern Homo sapiens in Morocco.[4] Evidence of widespread control of fire by anatomically modern humans dates to approximately 125,000 years ago.[5]

What prompted this sudden acceleration? Noam Chomsky suggests that it was triggered by the evolution of our ability to use and understand language:

Noam Chomsky, a prominent proponent of discontinuity theory, argues that a single chance mutation occurred in one individual in the order of 100,000 years ago, installing the language faculty (a component of the mind–brain) in “perfect” or “near-perfect” form.[6]

(Pumpkin Person has more on Chomsky.)

More specifically, we might say that this single chance mutation created the capacity for figurative or symbolic language, as clearly apes already have the capacity for very simple language. It was this ability to convey abstract ideas, then, that allowed humans to begin expressing themselves in other abstract ways, like cave painting.

I disagree with this view on the grounds that human groups were already pretty widely dispersed by 100,000 years ago. For example, Pygmies and Bushmen are descended from groups of humans who had already split off from the rest of us by then, but they still have symbolic language, art, and everything else contained in the behavioral modernity toolkit. Of course, if a trait is particularly useful or otherwise successful, it can spread extremely quickly (think lactose tolerance), and neither Bushmen nor Pygmies were 100% genetically isolated for the past 250,000 years, but I simply think the math here doesn’t work out.

However, that doesn’t mean Chomsky isn’t on to something. For example, Johanna Nichols (another linguist) used statistical models of language differentiation to argue that modern languages split around 100,000 years ago.[31] This coincides neatly with the upper bound on the Out of Africa theory, suggesting that Nichols may actually have found the point when language began differentiating because humans left Africa, or perhaps she found the origin of the linguistic skills necessary to accomplish humanity’s cross-continental trek.

Philip Lieberman and Robert McCarthy looked at the shape of Neanderthal, Homo erectus, early H. sapiens, and modern H. sapiens’ vocal tracts:

In normal adults these two portions of the SVT [supralaryngeal vocal tract] form a right angle to one another and are approximately equal in length—in a 1:1 proportion. Movements of the tongue within this space, at its midpoint, are capable of producing tenfold changes in the diameter of the SVT. These tongue maneuvers produce the abrupt diameter changes needed to produce the formant frequencies of the vowels found most frequently among the world’s languages—the “quantal” vowels [i], [u], and [a] of the words “see,” “do,” and “ma.” In contrast, the vocal tracts of other living primates are physiologically incapable of producing such vowels.

(Since juvenile humans are shaped differently than adults, they pronounce sounds slightly differently until their voiceboxes fully develop.)
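To see why tract proportions matter for vowels at all, it may help to recall the textbook idealization from acoustic phonetics (standard material, not something specific to Lieberman and McCarthy’s paper): a uniform tube of length $L$, closed at the glottis and open at the lips, resonates at

$$F_n = \frac{(2n-1)\,c}{4L}, \qquad n = 1, 2, 3, \dots$$

where $c \approx 350$ m/s is the speed of sound in warm air. For a 17.5 cm adult vocal tract this gives formants near 500, 1,500, and 2,500 Hz–roughly the neutral schwa vowel. The quantal vowels [i], [u], and [a] are made by constricting that tube sharply at particular points, which is the authors’ point about needing equally proportioned horizontal and vertical halves.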

Their results:

…Neanderthal necks were too short and their faces too long to have accommodated equally proportioned SVTs. Although we could not reconstruct the shape of the SVT in the Homo erectus fossil because it does not preserve any cervical vertebrae, it is clear that its face (and underlying horizontal SVT) would have been too long for a 1:1 SVT to fit into its head and neck. Likewise, in order to fit a 1:1 SVT into the reconstructed Neanderthal anatomy, the larynx would have had to be positioned in the Neanderthal’s thorax, behind the sternum and clavicles, much too low for effective swallowing. …

Surprisingly, our reconstruction of the 100,000-year-old specimen from Israel, which is anatomically modern in most respects, also would not have been able to accommodate a SVT with a 1:1 ratio, albeit for a different reason. … Again, like its Neanderthal relatives, this early modern human probably had an SVT with a horizontal dimension longer than its vertical one, translating into an inability to reproduce the full range of today’s human speech.

It was only in our reconstruction of the most recent fossil specimens—the modern humans postdating 50,000 years— that we identified an anatomy that could have accommodated a fully modern, equally proportioned vocal tract.

Just as small children who can’t yet pronounce the letter “r” can nevertheless make and understand language, I don’t think early humans needed to have all of the same sounds as we have in order to communicate with each other. They would have just used fewer sounds.

The change in our voiceboxes may not have triggered the evolution of language, but instead been triggered by language itself. As humans began transmitting more knowledge via language, humans who could make more sounds–and thus utter a greater range of words–perhaps had an edge over their peers: maybe they were seen as particularly clever, or perhaps they had an easier time organizing bands of hunters and warriors.

One of the interesting things about human language is that it is clearly simultaneously cultural–which language you speak is entirely determined by culture–and genetic–only humans can produce language in the way we do. Even the smartest chimps and dolphins cannot match our vocabularies, nor imitate our sounds. Human infants–unless they have some form of brain damage–learn language instinctually, without conscious teaching. (Insert reference to Steven Pinker.)

Some kind of genetic changes were obviously necessary to get from apes to human language use, but exactly what remains unclear.

A variety of genes are associated with language use, e.g., FOXP2. H. sapiens and chimps have different versions of the FOXP2 gene (and Neanderthals had a third, more similar to the H. sapiens version than to the chimp’s), but to my knowledge we have yet to discover exactly when the necessary mutations arose.

Despite their impressive skulls and survival in a harsh, novel climate, Neanderthals seem not to have engaged in much symbolic activity (though to be fair, they were wiped out right about the time Sapiens really got going with its symbolic activity). Homo sapiens and Homo neanderthalensis split around 800,000-400,000 years ago–perhaps the difference in our language genes ultimately gave Sapiens the upper hand.

Just as farming appears to have emerged relatively independently in several different locations around the world at about the same time, so behavioral modernity seems to have taken off in several different groups around the same time. Of course we can’t rule out the possibility that these groups had some form of contact with each other–peaceful or otherwise–but it seems more likely to me that similar behaviors emerged in disparate groups around the same time because the cognitive precursors necessary for those behaviors had already begun developing before those groups split.

Based on genetics, the shape of their larynges, and their cultural toolkits, Neanderthals probably did not have modern speech, but they may have had something similar to it. This suggests that at the time of the Sapiens-Neanderthal split, our common ancestor possessed some primitive speech capacity.

By the time Sapiens and Neanderthals encountered each other again, nearly half a million years later, Sapiens’ language ability had advanced, possibly due to further modification of FOXP2 and other genes like it, plus our newly modified voiceboxes, while Neanderthals’ had lagged. Sapiens achieved behavioral modernity and took over the planet, while Neanderthals disappeared.

 

Anthropology Friday: Numbers and the Making of Us, by Caleb Everett, pt 3

Welcome back to our discussion of Numbers and the Making of Us: Counting and the Course of Human Cultures, by Caleb Everett.

The Pirahã are a small tribe (about 420) of Amazonian hunter-gatherers whose language is nearly unique: it has no numbers, and you can whistle it. Everett spent much of his childhood among the Piraha because his parents were missionaries, which probably makes him one of the world’s foremost non-Piraha experts on the Piraha.

Occasionally as a child I would wake up in the jungle to the cacophony of people sharing their dreams with one another–impromptu monologues followed by spurts of intense feedback. The people in question, a fascinating (to me anyhow) group known as the Piraha, are known to wake up and speak to their immediate neighbors at all hours of the night. … the voices suggested the people in the village were relaxed and completely unconcerned with my own preoccupations. …

The Piraha village my family lived in was reachable via a one-week sinuous trip along a series of Amazonian tributaries, or alternatively by a one-hour flight in a Cessna single-engine airplane.

Piraha culture is, to say the least, very different from ours. Everett cites studies of Piraha counting ability in support of his idea that our ability to count past 3 is a culturally acquired process–that is, we can only count because we grew up in a numeric society where people taught us numbers, and the Piraha can’t count because they grew up in an anumeric society that not only lacks numbers, but lacks various other abstractions necessary for making sense of numbers. Our innate, genetic numerical abilities (the ability to count to three and distinguish between small and large amounts), he insists, are the same.

You see, the Piraha really can’t count. Line up 3 spools of thread and ask them to make an identical line, and they can do it. Line up 4 spools of thread, and they start getting the wrong number of spools. Line up 10 spools of thread, and it’s obvious that they’re just guessing and you’re wasting your time. Put five nuts in a can, then take two out and ask how many nuts are left: you get a response on the order of “some.”*

And this is not for lack of trying. The Piraha know other people have these things called “numbers.” They once asked Everett’s parents, the missionaries, to teach them numbers so they wouldn’t get cheated in trade deals. The missionaries tried for 8 months to teach them to count to ten and add small sums like 1 + 1. It didn’t work and the Piraha gave up.

Despite these difficulties, Everett insists that the Piraha are not dumb. After all, they survive in a very complex and demanding environment. He grew up with them; many of them are his personal friends, and he regards them as mentally normal people with the exact same genetic abilities as everyone else who just lack the culturally-acquired skill of counting.

After all, on a standard IQ scale, someone who cannot even count to 4 would be severely if not profoundly retarded, institutionalized and cared for by others. The Piraha obviously live independently, hunt, raise, and gather their own food, navigate through the rainforest, raise their own children, build houses, etc. They aren’t building aqueducts, but they are surviving perfectly well outside of an institution.

Everett neglects the possibility that the Piraha are otherwise normal people who are innately bad at math.

Normally, yes, different mental abilities correlate because they depend highly on things like “how fast is your brain overall” or “were you neglected as a child?” But people also vary in their mental abilities. I have a friend who is above average in reading and writing abilities, but is almost completely unable to do math. This is despite being raised in a completely numerate culture, going to school, etc.

This is a really obvious and life-impairing problem in a society like ours, where you have to use math to function; my friend has been marked since childhood as “not cognitively normal.” It would be a completely invisible non-problem in a society like the Piraha’s, who use no math at all; in Piraha society, my friend would be “a totally normal guy” (or at least close).

Everett states, explicitly, that not only are the Piraha only constrained by culture, but other people’s abilities are also directly determined by their cultures:

What is probably more remarkable about the relevant studies, though, is that they suggest that climbing any rungs of the arithmetic ladder requires numbers. How high we climb the ladder is not the result of our own inherent intelligence, but a result of the language we speak and of the culture we are born into. (page 136)

This is an absurd claim. Even my own children, raised in identically numerate environments and possessing, on the global scale, nearly identical genetics, vary in math abilities. You are probably not identical in abilities to your relatives, childhood classmates, next door neighbors, spouse, or office mates. We observe variations in mathematical abilities within cultures, families, cities, towns, schools, and virtually any group you choose that isn’t selected for math ability. We can’t all do calculus just because we happen to live in a culture with calculus textbooks.

In fact, there is an extensive literature (which Everett ignores) on the genetics of intelligence:

Various studies have found the heritability of IQ to be between 0.7 and 0.8 in adults and 0.45 in childhood in the United States.[6][18][19] It may seem reasonable to expect that genetic influences on traits like IQ should become less important as one gains experiences with age. However, that the opposite occurs is well documented. Heritability measures in infancy are as low as 0.2, around 0.4 in middle childhood, and as high as 0.8 in adulthood.[7] One proposed explanation is that people with different genes tend to seek out different environments that reinforce the effects of those genes.[6] The brain undergoes morphological changes in development which suggests that age-related physical changes could also contribute to this effect.[20]

A 1994 article in Behavior Genetics based on a study of Swedish monozygotic and dizygotic twins found the heritability of the sample to be as high as 0.80 in general cognitive ability; however, it also varies by trait, with 0.60 for verbal tests, 0.50 for spatial and speed-of-processing tests, and 0.40 for memory tests. In contrast, studies of other populations estimate an average heritability of 0.50 for general cognitive ability.[18]

In 2006, The New York Times Magazine listed about three quarters as a figure held by the majority of studies.[21]

Thanks to Jayman

In plain speak, this means that intelligence in healthy adults is about 70-80% genetic and the rest seems to be random chance (like whether you were dropped on your head as a child or had enough iodine). So far, no one has proven that things like whole language vs. phonics instruction or two parents vs. one in the household have any effect on IQ, though they might affect how happy you are.
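For readers wondering where figures like “0.8” come from in twin studies: the classic estimator (standard behavioral genetics, not specific to any of the studies quoted above) is Falconer’s formula,

$$h^2 \approx 2\,(r_{MZ} - r_{DZ})$$

where $r_{MZ}$ and $r_{DZ}$ are the trait correlations between identical and fraternal twin pairs. The logic: identical twins share roughly twice as many segregating genes as fraternal twins, so doubling the gap between the two correlations isolates the genetic share of the variance.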

(Childhood IQ is much more amenable to environmental changes like “good teachers,” but these effects wear off as soon as children aren’t being forced to go to school every day.)

A full discussion of the scientific literature is beyond our current scope, but if you aren’t convinced about the heritability of IQ–including math abilities–I urge you to go explore the literature yourself–you might want to start with some of Jayman’s relevant FAQs on the subject.

Everett uses experiments done with the Piraha to support his claim that mathematical ability is culturally dependent, but this is dependent on his claim that the Piraha are cognitively identical to the rest of us in innate mathematical ability. Given that normal people are not cognitively identical in innate mathematical abilities, and mathematical abilities vary, on average, between groups (this is why people buy “Singapore Math” books and not “Congolese Math”), there is no particular reason to assume Piraha and non-Piraha are cognitively identical. Further, there’s no reason to assume that any two groups are cognitively identical.

Mathematics only really got started when people invented agriculture, as they needed to keep track of things like “How many goats do I have?” or “Have the peasants paid their taxes?” A world in which mathematical ability is useful will select for mathematical ability; a world where it is useless cannot select for it.

Everett may still be correct that you wouldn’t be able to count if you hadn’t been taught how, but the Piraha can’t prove that one way or another. He would first have to show that Piraha who are raised in numerate cultures (say, by adoption) are just as good at calculus as people from Singapore or Japan, but he cites no adoption studies nor anything else to this end. (And adoption studies don’t even show that for the groups we have studied, like whites, blacks, or Asians.)

Let me offer a cognitive contrast:

The Piraha are an anumeric, illiterate culture. They have encountered both letters and numbers, but not adopted them.

The Cherokee were once illiterate: they had no written language. Around 1809, an illiterate Cherokee man, Sequoyah, observed whites reading and writing letters. In a flash of insight, Sequoyah understood the concept of “use a symbol to encode a sound” even without being taught to read English. He developed his own alphabet (really a syllabary) for writing Cherokee sounds and began teaching it to others. Within 5 years of the syllabary’s completion, a majority of the Cherokee were literate; they soon had their own publishing industry producing Cherokee-language books and newspapers.

The Cherokee, though illiterate, possessed the innate ability to be literate, if only exposed to the cultural idea of letters. Once exposed, literacy spread rapidly–instantly, in human cultural evolution terms.

By contrast, the Piraha, despite their desire to adopt numbers, have not been able to do so.

(Yet. With enough effort, the Piraha probably can learn to count–after all, there are trained parrots who can count to 8. It would be strange if they permanently underperformed parrots. But it’s a difficult journey.)

That all said, I would like to make an anthropological defense of anumeracy: numeracy, as in ascribing exact values to specific items, is more productive in some contexts than others.

Do you keep track of the exact values of things you give your spouse, children, or close friends? If you invite a neighbor over for a meal, do you mark down what it cost to feed them and then expect them to feed you the same amount in return? Do you count the exact value of gifts and give the same value in return?

In Kabloona, de Poncins discusses the quasi-communist nature of the Eskimo economic system. For the Eskimo, hunter-gatherers living in the world’s harshest environment, the unit of exchange isn’t the item, but survival. A man whom you keep alive by giving him fish today is a man who can keep you alive by giving you fish tomorrow. Declaring that you will only give a starving man five fish because he previously gave you five fish will do you no good at all if he starves from not enough fish and can no longer give you some of his fish when he has an excess. The fish have, in this context, no innate, immutable value–they are as valuable as the life they preserve. To think otherwise would kill them.

It’s only when people have goods to trade, regularly, with strangers, that they begin thinking of objects as having defined values that hold steady over different transactions. A chicken is more valuable if I am starving than if I am not, but it has an identical value whether I am trading it for nuts or cows.

So it is not surprising that most agricultural societies have more complicated number systems than most hunter-gatherer societies. As Everett explains:

Led by Patience Epps of the University of Texas, a team of linguists recently documented the complexity of the number systems in many of the world’s languages. In particular, the researchers were concerned with the languages’ upper numerical limit–the highest quantity with a specific name. …

We are fond of coining new names for numbers in English, but the largest commonly used number name is googol (googolplex I define as an operation), though there are bigger ones, like Graham’s number.
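(For concreteness, the standard definitions–textbook material, not Everett’s–are:

$$\text{googol} = 10^{100}, \qquad \text{googolplex} = 10^{\text{googol}} = 10^{10^{100}}.$$

Graham’s number is incomparably larger still: far too big to write even as a tower of exponents, it is usually defined through Knuth’s iterated up-arrow operation, starting from $g_1 = 3\uparrow\uparrow\uparrow\uparrow 3$.)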

The linguistic team in question found the upper numerical limits in 193 languages of hunter-gatherer cultures in Australia, Amazonia, Africa, and North America. Additionally, they examined the upper limits of 204 languages spoken by agriculturalists and pastoralists in these regions. They discovered that the languages of hunter-gatherer groups generally have low upper limits. This is particularly true in Australia and Amazonia, the regions with so-called pure hunter-gatherer subsistence strategies.

In the case of the Australian languages, the study in question observed that more than 80 percent are limited numerically, with the highest quantity represented in such cases being only 3 or 4. Only one Australian language, Gamilaraay, was found to have an upper limit above 10, and its highest number is for 20. … The association [between hunter-gathering and limited numbers] is also robust in South America and Amazonia more specifically. The languages of hunter-gatherer cultures in this region generally have upper limits below ten. Only one surveyed language … Huaorani, has numbers for quantities greater than 20. Approximately two-thirds of the languages of such groups in the region have upper limits of five or less, while one-third have an upper limit of 10. Similarly, about two-thirds of African hunter-gatherer languages have upper limits of 10 or less.

There are a few exceptions–agricultural societies with very few numbers, and hunter-gatherers with relatively large numbers of numbers, but:

…there are no large agricultural states without elaborate number systems, now or in recorded history.

So how did the first people develop numbers? Of course we don’t know, but Everett suggests that at some point we began associating collections of things, like shells, with the cluster of fingers found on our hands. One finger, one shell; five fingers, five shells–easy correspondences. Once we mastered five, we skipped forward to 10 and 20 rather quickly.

Everett proposes that some numeracy was a necessary prerequisite for agriculture, as agricultural people would need to keep track of things like seasons and equinoxes in order to know when to plant and harvest. I question this on the grounds that I myself don’t look at the calendar and say, “Oh look, it’s the equinox, I’d better plant my garden!” but instead look outside and say, “Oh, it’s getting warm and the grass is growing again, I’d better get busy.” The harvest is even more obvious: I harvest when the plants are ripe.

Of course, I live in a society with calendars, so I can’t claim that I don’t look at the calendar. I look at the calendar almost every day to make sure I have the date correct. So perhaps I am using my calendrical knowledge to plan my planting schedule without even realizing it because I am just so used to looking at the calendar.

“What man among you, if he has 100 sheep and has lost 1 of them, does not leave the 99 in the open pasture and go after the one which is lost until he finds it? When he has found it, he lays it on his shoulders, rejoicing.” Luke 15:3-5

Rather than develop numbers and then start planting barley and millet, I propose that humans first domesticated animals, like pigs and goats. At first people were content to have “a few,” “some,” or “many” animals, but soon they were inspired to keep better track of their flocks.

By the time we started planting millet and wheat (a couple thousand years later), we were probably already pretty good at counting sheep.

Our fondness for tracking astronomical cycles, I suspect, began for less utilitarian reasons: they were there. The cycles of the sun, moon, and other planets were obvious and easy to track, and we wanted to figure out what they meant. We put a ton of work into tracking equinoxes and eclipses and the epicycles of Jupiter and Mars (before we figured out heliocentrism). People ascribed all sorts of import to these cycles (“Communicator Mercury is retrograde in outspoken Sagittarius from December 3-22, mixing up messages and disrupting pre-holiday plans.”) that turned out to be completely wrong. Unless you’re a fisherman or sailor, the moon’s phases don’t make any difference in your life; the other planets’ cycles turned out to be completely useless unless you’re trying to send a space probe to visit them. Eclipses are interesting, but don’t have any real effects. For all of the effort we’ve put into astronomy, the most important results have been good calendars to keep track of dates and allow us to plan multiple years into the future.

Speaking of dates, let’s continue this discussion in a week–on the next Anthropology Friday.

*Footnote: Even though I don’t think the Piraha prove as much as Everett thinks they do, that doesn’t mean Everett is completely wrong. Maybe already having number words is (in the vast majority of cases) a necessary precondition for learning to count.

One potentially illuminating case Everett didn’t explore is how young children in numerate cultures acquire numbers. Obviously they grow up in an environment with numbers, but below a certain age they can’t really use them. Can children at these ages duplicate lines of objects or patterns? Or do they master that behavior only after learning to count?

Back in October I commented on Schiller and Peterson’s claim in Count on Math (a book of math curriculum ideas for toddlers and preschoolers) that young children must learn mathematical “foundation” concepts in a particular order, i.e.:

Developmental sequence is fundamental to children’s ability to build conceptual understanding. … The chapters in this book present math in a developmental sequence that provides children a natural transition from one concept to the next, preventing gaps in their understanding. …

When children are allowed to explore many objects, they begin to recognize similarities and differences of objects.

When children can determine similarities and differences, they can classify objects.

When children can classify objects, they can see similarities and difference well enough to recognize patterns.

When children can recognize, copy, extend and create patterns, they can arrange sets in a one-to-one relationship.

When children can match objects one to one, they can compare sets to determine which have more and which have less.

When children can compare sets, they can begin to look at the “manyness” of one set and develop number concepts.

This developmental sequence provides a conceptual framework that serves as a springboard to developing higher level math skills.

The Count on Math curriculum doesn’t even introduce the numbers 1-5 until week 39 for 4-year-olds (3-year-olds are never introduced to numbers), and numbers 6-10 aren’t introduced until week 37 for the 5-year-olds!

Note that Schiller and Everett are arguing diametric opposites–Everett says the ability to count to three and distinguish the “manyness” of sets is instinctual, present even in infants, but that the ability to copy patterns and match items one-to-one comes only after long acquaintance and practice with counting, specifically with number words.

Schiller claims that children only develop the ability to distinguish manyness and count to three after learning to copy patterns and match items one-to-one.

As I said back in October, I think Count on Math’s claim is pure bollocks. If you miss the “comparing sets” day at preschool, you aren’t going to end up unable to multiply. The Piraha may not prove as much as Everett wants them to, but the neuroscience and animal studies he cites aren’t worthless. In general, I distrust anyone who claims that you must introduce this long a set of concepts in this strict an order just to develop a basic competency that the vast majority of people seem to acquire without difficulty.

Of course, Lynne Peterson is a real teacher with a real teacher’s certificate and a BA in … it doesn’t say, and Pam Schiller was Vice President of Professional Development for the Early Childhood Division at McGraw Hill publishers and president of the Southern Early Childhood Association. She has a PhD in … it doesn’t say. Here’s some more on Dr. Schiller’s many awards. So maybe they know better than Everett, who’s just an anthropologist. But Everett has some actual evidence on his side.

But I’m a parent who has watched several children learn to count… and Schiller and Peterson are wrong.

Local Optima, Diversity, and Patchwork

Local optima–or optimums, if you prefer–are an illusion created by distance. A man standing on the hilltop at (approximately) X=2 may see land sloping downward all around himself and think that he is at the highest point on the graph.

But hand him a telescope, and he discovers that the fellow standing on the hilltop at X=4 is even higher than he is. And hand the fellow at X=4 a telescope, and he’ll discover that X=6 is even higher.

A global optimum is the best possible way of doing something; a local optimum can look like a global optimum because all of the other, similar ways of doing the same thing are worse. To get from a local optimum to a global optimum, you might have to endure a significant trough of things going worse before you reach your destination. (Those troughs would be the points X=3.03 and X=5.02 on the graph.) If the troughs are short and shallow enough, people can accidentally power their way through. If long and deep enough, people get stuck.
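To make the mechanism concrete, here is a minimal hill-climbing sketch in Python (my own toy illustration; the landscape function is invented, not taken from any of the texts discussed here). A greedy search that only ever steps uphill halts at whichever peak is nearest, which is exactly why the man at X=2 never finds X=4 without a telescope:

```python
import math

# Toy landscape (hypothetical, for illustration): a low peak near x=2
# and a higher peak near x=4, separated by a trough.
def altitude(x):
    return math.exp(-(x - 2)**2) + 2 * math.exp(-(x - 4)**2)

def hill_climb(x, step=0.1):
    """Greedy local search: keep stepping uphill until no neighbor is higher."""
    while True:
        best = max((x - step, x, x + step), key=altitude)
        if best == x:  # neither neighbor is higher: a peak, maybe not the highest
            return x
        x = best

print(round(hill_climb(1.0), 1))  # ~2.1: stuck on the local optimum
print(round(hill_climb(3.5), 1))  # ~4.0: reaches the higher, global peak
```

Escaping the trough between the two peaks would require temporarily accepting a lower altitude–precisely the kind of move greedy search forbids.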

The introduction of new technology, exposure to another culture’s solutions, or even random chance can expose a local optimum and propel a group to cross that trough.

For example, back in 1400, Europeans were perfectly happy to get their Chinese silks, spices, and porcelains via the overland Silk Road. But with the fall of Constantinople to the Turks in 1453, the Silk Road became more fragmented and difficult (i.e. dangerous, i.e. expensive) to travel. The increased cost of the normal route prompted Europeans to start exploring other, less immediately profitable trade routes–like the possibility of sailing clear around the world, via the ocean, to the other side of China.

Without the eastern trade routes first diminishing in profitability, it wouldn’t have been economically viable to explore and develop the western routes. (The discovery of the Americas along the way was a happy accident.)

West Hunter (Greg Cochran) writes frequently about local optima; here’s an excerpt on plant domestication:

The reason that a few crops account for the great preponderance of modern agriculture is that a bird in the hand – an already-domesticated, already-optimized crop – feeds your family/makes money right now, while a potentially useful yet undomesticated crop doesn’t. One successful domestication tends to inhibit others that could flourish in the same niche. Several crops were domesticated in the eastern United States, but with the advent of maize and beans (from Mesoamerica) most were abandoned. Maybe if those Amerindians had continued to selectively breed sumpweed for a few thousand years, it could have been a contender: but nobody is quite that stubborn.

Teosinte was an unpromising weed: it’s hard to see why anyone bothered to try to domesticate it, and it took a long time to turn it into something like modern maize. If someone had brought wheat to Mexico six thousand years ago, likely the locals would have dropped maize like a hot potato. But maize ultimately had advantages: it’s a C4 plant, while wheat is C3: maize yields can be much higher.

Teosinte is the ancestor of modern corn. Cochran’s point is that in the domestication game, wheat is a local optimum; given the wild ancestors of wheat and corn, you’d develop a better, more nutritious variety of wheat first and probably just abandon the corn. But if you didn’t have wheat and you just had corn, you’d keep at the corn–and in the end, get an even better plant.

(Of course, corn is a success story; plenty of people domesticated plants that actually weren’t very good just because that’s what they happened to have.)

Japan in 1850 was a culturally rich, pre-industrial, feudal society with a strong isolationist stance. In 1853, the Japanese discovered that the rest of the world’s industrial and military technology was now sufficiently advanced to pose a serious threat to Japanese sovereignty. Things immediately degenerated, culminating in the Boshin War (a civil war, 1868-9), but with the Meiji Restoration Japan embarked on an industrialization crash-course. By 1895, Japan had kicked China’s butt in the First Sino-Japanese War, and the Japanese population doubled–after holding steady for centuries–between 1873 and 1935, from 35 to 70 million people. By the 1930s, Japan was one of the world’s most formidable industrial powers, and today it remains an economic and technological powerhouse.

Clearly the Japanese people, in 1850, contained the untapped ability to build a much more complex and advanced society than the one they had, and it did not take much exposure to the outside world to precipitate a total economic and technological revolution.

Sequoyah’s syllabary, showing script and print forms

A similar case occurred in 1821 when Sequoyah, a Cherokee man, invented his own syllabary (syllable-based alphabet) after observing American soldiers reading letters. The Cherokee quickly adopted Sequoyah’s writing system–by 1825, the majority of Cherokee were literate and the Cherokee had their own printing industry. Interestingly, although some of the Cherokee letters look like Latin, Greek, or Cyrillic letters, there is no correspondence in sound, because Sequoyah could not read English. He developed his entire syllabary after simply being exposed to the idea of writing.

The idea of literacy has occurred independently only a few times in human history; the vast majority of peoples picked up alphabets from someone else. Our alphabet comes from the Latins, who got it from the Greeks, who adopted it from the Phoenicians, who got it from some Proto-Canaanite script writers–and even then, literacy spread pretty slowly. The Cherokee, while not as technologically advanced as Europeans at the time, were already a settled agricultural society and clearly possessed the ability to become literate as soon as they were exposed to the idea.

When I walk around our cities, I often think about what their ruins will look like to explorers in a thousand years:

“We also pass a ruin of what once must have been a grand building. The walls are marked with logos from a Belgian University. This must have once been some scientific study centre of sorts.”

By contrast, there are many cases of people being exposed to or given a new technology but completely lacking the ability to functionally adopt, improve, or maintain it. The Democratic Republic of the Congo, for example, is full of ruined colonial-era buildings and roads built by outsiders that the locals haven’t maintained. Without the Belgians, the infrastructure has crumbled.

Likewise, contact between Europeans and groups like the Australian Aborigines did not result in the Aborigines adopting European technology, nor in a new and improved fusion of Aboriginal and European tech, but in total disaster for the Aborigines. While the Japanese consistently top the charts in educational attainment, Aboriginal communities are still struggling with low literacy rates, high dropout rates, and low employment–the modern industrial economy, in short, has not been kind to them.

Along a completely different evolutionary pathway, cephalopods–squids, octopuses, and their tentacled ilk–are the world’s smartest invertebrates. This is pretty amazing, given that their nearest cousins are snails and clams. Yet cephalopod intelligence only goes so far. No one knows (yet) just how smart cephalopods are–squids in particular are difficult to work with in captivity because they are active hunter/swimmers and need a lot more space than the average aquarium can devote–but their brain power appears to be on the order of a dog’s.

After millions of years of evolution, cephalopods may represent the best nature can do–with an invertebrate. Throw in a backbone, and an animal can get a whole lot smarter.

And in chemistry, activation energy is the amount of energy you have to put into a chemical system before a reaction can begin. Stable chemical systems essentially exist at local optima, and it can require the input of quite a lot of energy before you get any action out of them. Among atomic nuclei, iron is the global–should we say universal?–optimum, beyond which fusion becomes endothermic rather than exothermic. In other words, nuclear fusion at the core of the sun ends with iron; elements heavier than iron can only be produced when stars explode.
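For the chemical half of that analogy, the standard quantitative statement is the Arrhenius equation (textbook chemistry, not anything from Auerswald):

$$k = A\,e^{-E_a/RT}$$

where $k$ is the reaction rate, $E_a$ the activation energy, $T$ the absolute temperature, and $R$ the gas constant. The barrier enters exponentially: a system sitting behind a high $E_a$ is, for practical purposes, stuck at its local optimum until something supplies enough energy to push it over.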

So what do local optima have to do with diversity?

The current vogue for diversity (“Diversity is our greatest strength”) suggests that we can reach global optima faster by simply smushing everyone together and letting them compare notes. Scroll back to the Japanese case. Edo Japan had a nice culture, but it was also beset by frequent famines. Meiji Japan doubled its population. Giving everyone, right now, the same technology and culture would bring everyone up to the same level.

But you can’t tell from within if you are at a local or global optimum. That’s how they work. The Indians likely would have never developed corn had they been exposed to wheat early on, and subsequently Europeans would have never gotten to adopt corn, either. Good ideas can take a long time to refine and develop. Cultures can improve rapidly–even dramatically–by adopting each other’s good ideas, but they also need their own space and time to pursue their own paths, so that good but slowly developing ideas aren’t lost.

Which gets us back to Patchwork.

Book on a Friday: Squid Empire: The Rise and Fall of the Cephalopods by Danna Staaf

Danna Staaf’s Squid Empire: The Rise and Fall of the Cephalopods is about the evolution of squids and their relatives–nautiluses, cuttlefish, octopuses, ammonoids, etc. If you are really into squids or would like to learn more about squids, this is the book for you. If you aren’t big on reading about squids but want something that looks nice on your coffee table and matches your Cthulhu, Flying Spaghetti Monster, and 20,000 Leagues Under the Sea decor, this is the book for you. If you aren’t really into squids, you probably won’t enjoy this book.

Squids, octopuses, etc. are members of the class of cephalopods, just as you are a member of the class of mammals. Mammals are in the phylum of chordates; cephalopods are mollusks. It’s a surprising lineage for one of Earth’s smartest creatures–80% of mollusk species are slugs and snails. If you think you’re surrounded by idiots, imagine how squids must feel.

The short story of cephalopodic evolution is that millions upon millions of years ago, most life was still stuck at the bottom of the ocean. There were some giant microbial mats, some slugs, some snails, some worms, and not a whole lot else. One of those snails figured out how to float by removing some of the salt from the water inside its shell, making itself a bit buoyant. Soon after, its foot (all mollusks have a “foot”) split into multiple parts. The now-floating snail drifted over the seafloor, using its new tentacles to catch and eat the less-mobile creatures below it.

From here, cephalopods diversified dramatically, creating the famous ammonoids of fossil-dating lore.

Cross-section of a fossilized ammonite shell, revealing internal chambers and septa

Ammonoids are known primarily from their shells (which fossilize well) rather than their fleshy tentacled parts (which fossilize badly). But shells we have in such abundance that they can easily be used for dating other nearby fossils.

Ammonoids are obviously similar to their cousins, the lovely chambered nautiluses. (Please don’t buy nautilus shells; taking them out of their shells kills them, and no one farms nautiluses, so the shell trade is having a real impact on their numbers. We don’t need their shells, but they do.)

Ammonoids succeeded for millions of years, until the Cretaceous extinction event that also took out the dinosaurs. The nautiluses survived–perhaps, the author speculates, because nautiluses lay large, yolk-rich eggs that develop very slowly, so infant nautiluses were able to wait out the event, while ammonoids, whose tiny, fast-growing eggs depended on feeding immediately after hatching, simply starved in the upheaval.

In the aftermath, modern squids and octopuses proliferated.

How did we get from floating, shelled snails to today’s squishy squids?

The first step was internalization–cephalopods began growing their fleshy mantles over their shells instead of inside of them; in essence, these invertebrates evolved something like a vertebrate’s internal skeleton. Perhaps this was some horrible genetic accident, but it worked out. These internalized shells gradually became smaller and thinner, until they were reduced to a flexible rod called a “pen” that runs the length of a squid’s mantle. (Cuttlefish still retain a more substantial bone, which is frequently collected on beaches and sold for birds to peck at for its calcium.)

With the loss of the buoyant shell, squids had to find another way to float. This they apparently achieved by filling themselves with ammonia salts, which make them less dense than water but also make their decomposition disgusting and render them unfossilizable, because they turn to mush too quickly. Octopuses, by contrast, aren’t full of ammonia and so can fossilize.

Since the book is devoted primarily to cephalopod evolution rather than modern cephalopods, it doesn’t go into much depth on the subject of their intelligence. Out of all the invertebrates, cephalopods are easily the most intelligent (perhaps really the only intelligent invertebrates). Why? If cephalopods didn’t exist, we might easily conclude that invertebrates can’t be intelligent–invertebrateness is somehow inimical to intelligence. After all, most invertebrates are about as intelligent as slugs. But cephalopods do exist, and they’re pretty smart.

The obvious answer is that cephalopods can move and are predatory, which requires bigger brains. But why are they the only invertebrates–apparently–who’ve accomplished the task?

But enough jabber–let’s let Dr. Staaf speak:

I find myself obliged to address the perennial question: “octopuses” or “octopi”? Or, heaven help us, “octopodes”?

Whichever you like best. Seriously. Despite what you may have heard, “octopus” is neither ancient Greek nor Latin. Aristotle called the animal polypous for its “many feet.” The ancient Romans borrowed this word and latinized the spelling to polypus. It was much later that a Renaissance scientist coined and popularized the word “octopus,” using Greek roots for “eight” and “foot” but Latin spelling.

If the word had actually been Greek, it would be spelled octopous and pluralized octopodes. If translated into Latin, it might have become octopes and pluralized octopedes, but more likely the ancient Romans would have simply borrowed the Greek word–as they did with polypus. Those who perhaps wished to appear erudite used the Greek plural polypodes, while others favored a Latin ending and pluralized it polypi.

The latter is a tactic we English speakers emulate when we welcome “octopus” into our own language and pluralize it “octopuses” as I’ve chosen to do.

There. That settles it.

Dinosaurs are the poster children for evolution and extinction writ large…

Of course, not all of them did die. We know now that birds are simply modern dinosaurs, but out of habit we tend to reserve the word “dinosaur” for the huge ancient creatures that went extinct at the end of the Cretaceous. After all, even if they had feathers, they seem so different from today’s finches and robins. For one thing, the first flying feathered dinosaurs all seem to have had four wings. There aren’t any modern birds with four wings.

Well… actually, domestic pigeons can be bred to grow feathers on their legs. Not fuzzy down, but long flight feathers, and along with these feathers their leg bones grow more winglike. The legs are still legs; they can’t be used to fly like wings. They do, however, suggest a clear step along the road from four-winged dinosaurs to two-winged birds. The difference between pigeons with ordinary legs and pigeons with wing-legs is created by control switches in their DNA that alter the expression of two particular genes. These genes are found in all birds, indeed in all vertebrates, and so were most likely present in dinosaurs as well.

…and I’ve just discovered that almost all of my other bookmarks fell out of the book. Um.

So squid brains are shaped like donuts because their eating/jet propulsion tube runs through the middle of their bodies and thus through the middle of their brains. It seems like this could be a problem if the squid eats too much or eats something with sharp bits in it, but squids seem to manage.

Squids can also leap out of the water and fly through the air for some ways. Octopuses can carry water around in their mantles, allowing them to move on dry land for a few minutes without suffocating.

Since cephalopods are unusual among mollusks in their ability to move quickly, they have a lot in common, behaviorally, with vertebrates. In essence, they are the most vertebrate-behaving of the mollusks. Convergent evolution.

The vampire squid, despite its name, is actually more of an octopus.

Let me quote from the chapter on sex and babies:

This is one arena in which cephalopods, both ancient and modern, are actually less alien than many aliens–even other mollusks. Slugs, for instance, are hermaphroditic, and in the course of impregnating each other their penises sometimes get tangled, so they chew them off. Nothing in the rest of this chapter will make you nearly that uncomfortable. …

The lovely argonaut octopus

In one living coleoid species, however, sex is blindingly obvious. Females of the octopus known as an argonaut are five times larger than males. (A killer whale is about five times larger than an average adult human, which in turn is about five times larger than an opossum.)

This enormous size differential caught the attention of paleontologists, who had noticed that many ammonoid species also came in two distinct sizes, which they had dubbed microconch (little shell) and macroconch (big shell). Both were clearly mature, as they had completed the juvenile part of the shell and constructed the final adult living chamber. After an initial flurry of debate, most researchers agreed to model ammonoid sex on modern argonauts, and began to call macroconchs females and microconchs males.

Some fossil nautiloids also come in macroconch and microconch flavors, though it’s more difficult to be certain that both are adults…

However, the shells of modern nautiluses show the opposite pattern–males are somewhat larger than females… Like the nautiloid shift from ten arms to many tens of arms, the pattern could certainly have evolved from a different ancestral condition. If we’re going to make that argument, though, we have to wonder when nautiloids switched from females to males as the larger sex, and why.

In modern species that have larger females, we usually assume the size difference has to do with making or brooding a lot of eggs. Female argonauts take it up a notch and actually secrete a shell-like brood chamber from their arms, using it to cradle numerous batches of eggs over their lifetimes. Meanwhile, each tiny male argonaut gets to mate only once. His hectocotylus is disposable–after being loaded with sperm and inserted into the female, it breaks off. …

By contrast, when males are the bigger sex, we often guess that the purpose is competition. Certainly many species of squid and cuttlefish have large males that battle for female attention on the mating grounds. They display outrageous skin patterns as they push, shove, and bite each other. Females do appear impressed; at least, they mate with the winning males and consent to be guarded by them. Even in these species, though, there are some small males who exhibit a totally different mating strategy. While the big males strut their stuff, these small males quietly sidle up to the females, sometimes disguising themselves with female color patterns. This doesn’t put off the real females, who readily mate with these aptly named “sneaker males.” By their very nature, such obfuscating tactics are virtually impossible to glean from the fossil record…

More on octopus mating habits.

This, of course, reminded me of this graph:

In the majority of countries, women are more likely to be overweight than men (suggesting that our measure of “overweight” is probably flawed.) In some countries women are much more likely to be overweight; in some, men and women are almost equally likely to be overweight; and in just a few–the Czech Republic, Germany, Hungary, Japan, and, barely, France–men are more likely to be overweight.

Is there any rhyme or reason to this pattern? Surely affluence is related, but Japan, for all of its affluence, has very few overweight people at all, while Egypt, which is pretty poor, has far more overweight people. (A greater % of Egyptian women are overweight than American women, but American men are more likely to be overweight than Egyptian men.)

Of course, male humans are still–in every country–larger than females. Even an overweight female doesn’t necessarily weigh more than a regular male. But could the variation in male and female obesity rates have anything to do with historic mating strategies? Or is it completely irrelevant?

Back to the book:

Coleoid eyes are as complex as our own, with a lens for focusing light, a retina to detect it, and an iris to sharpen the image. … Despite their common complexity, though, there are some striking differences [between our and squid eyes]. For example, our retina has a blind spot where a bundle of nerves enters the eyeball before spreading out to connect to the front of every light receptor. By contrast, light receptors in the coleoid retina are innervated from behind, so there’s no “hole” or blind spot. Structural differences like this show that the two groups converged on similar solutions through distinct evolutionary pathways.

Another significant difference is that fish went on to evolve color vision by increasing the variety of light-sensitive proteins in their eyes; coleoids never did and are probably color blind. I say “probably” because the idea of color blindness in such colorful animals has flummoxed generations of scientists…

“I’m really more of a cuddlefish”

Color-blind or not, coleoids can definitely see something we humans are blind to: the polarization of light.

Sunlight normally consists of waves vibrating in all directions. But when these waves are reflected off certain surfaces, like water, they get organized and arrive at the retina vibrating in only one direction. We call this “glare” and we don’t like it, so we invented polarized sunglasses. … That’s pretty much all polarized sunglasses can do–block polarized light. Sadly, they can’t help you decode the secret messages of cuttlefish, which have the ability to perform a sort of double-talk with their skin, making color camouflage for the benefit of polarization-blind predators while flashing polarized displays to their fellow cuttlefish.

That’s amazing. Here’s an article with more on cuttlefish vision and polarization.

Overall, I enjoyed this book. The writing isn’t the most thrilling, but the author has a sense of humor and a deep love for her subject. I recommend it to anyone with a serious hankering to know more about the evolution of squids, or who’d like to learn more about an ancient animal besides dinosaurs.

Are “Nerds” Just a Hollywood Stereotype?

Yes, MIT has a football team.

The other day on Twitter, Nick B. Steves challenged me to find data supporting or refuting his assertion that Nerds vs. Jocks is a false stereotype, invented around 1975. Of course, we HBDers have a saying–“all stereotypes are true,” even the ones about us–but let’s investigate Nick’s claim and see where it leads us.

(NOTE: If you have relevant data, I’d love to see it.)

Unfortunately, terms like “nerd,” “jock,” and “chad” are not all that well defined. Certainly if we define “jock” as “athletic but not smart” and nerd as “smart but not athletic,” then these are clearly separate categories. But what if there’s a much bigger group of people who are smart and athletic?

Or what if we are defining “nerd” and “jock” too narrowly? Wikipedia defines nerd as, “a person seen as overly intellectual, obsessive, or lacking social skills.” I recall a study–which I cannot find right now–which found that nerds had, overall, lower-than-average IQs, but that study included people who were obsessive about things like comic books, not just people who majored in STEM. Similarly, should we define “jock” only as people who are good at sports, or do passionate sports fans count?

For the sake of this post, I will define “nerd” as “people with high math/science abilities” and “jock” as “people with high athletic abilities,” leaving the matter of social skills undefined. (People who merely like video games or watch sports, therefore, do not count.)

Nick is correct on one count: according to Wikipedia, although the word “nerd” has been around since 1951, it was popularized during the 70s by the sitcom Happy Days. However, Wikipedia also notes that:

An alternate spelling,[10] as nurd or gnurd, also began to appear in the mid-1960s or early 1970s.[11] Author Philip K. Dick claimed to have coined the nurd spelling in 1973, but its first recorded use appeared in a 1965 student publication at Rensselaer Polytechnic Institute.[12][13] Oral tradition there holds that the word is derived from knurd (drunk spelled backward), which was used to describe people who studied rather than partied. The term gnurd (spelled with the “g”) was in use at the Massachusetts Institute of Technology by 1965.[14] The term nurd was also in use at the Massachusetts Institute of Technology as early as 1971 but was used in the context for the proper name of a fictional character in a satirical “news” article.[15]

suggesting that the word was already common among nerds themselves before it was picked up by TV.

But we can trace the nerd-jock dichotomy back before the terms were coined: back in 1921, Lewis Terman, a researcher at Stanford University, began a long-term study of exceptionally high-IQ children, the Genetic Studies of Genius aka the Terman Study of the Gifted:

Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded.

This belief was especially popular in a little nation known as Germany, where it inspired both taking schoolchildren on long hikes in the woods to keep them fit and the mass extermination of Jews, who were believed to be muddying the German gene pool with their weak, sickly, high-IQ genes (and nefariously trying to marry strong, healthy Germans in order to replenish their own defective stock.) It didn’t help that German Jews were both high-IQ and beset by a number of illnesses (probably related to high rates of consanguinity,) but then again, the Gypsies are beset by even more debilitating illnesses, and no one blames this on all of the fresh air and exercise afforded by their highly mobile lifestyles.

(Just to be thorough, though, the Nazis also exterminated the Gypsies and Hans Asperger’s subjects, despite Asperger’s insistence that they were very clever children who could probably be of great use to the German war effort via code breaking and the like.)

The results of Terman’s study are strongly in Nick’s favor. According to Psychology Today’s account:

His final group of “Termites” averaged a whopping IQ of 151. Following up his group 35 years later, his gifted group at mid-life definitely seemed to conform to his expectations. They were taller, healthier, physically better developed, and socially adept (dispelling the myth at the time of high-IQ awkward nerds).

According to Wikipedia:

…the first volume of the study reported data on the children’s family,[17] educational progress,[18] special abilities,[19] interests,[20] play,[21] and personality.[22] He also examined the children’s racial and ethnic heritage.[23] Terman was a proponent of eugenics, although not as radical as many of his contemporary social Darwinists, and believed that intelligence testing could be used as a positive tool to shape society.[3]

Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average.[24] A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group.

Of course, we can go back even further than Terman–in the early 1800s, allergies like hay fever were associated with the nobility, who of course did not do much vigorous work in the fields.

My impression, based on studies I’ve seen previously, is that athleticism and IQ are positively correlated. That is, smarter people tend to be more athletic, and more athletic people tend to be smarter. There’s a very obvious reason for this: our brains are part of our bodies, people with healthier bodies therefore also have healthier brains, and healthier brains tend to work better.

At the very bottom of the IQ distribution, mentally retarded people tend to also be clumsy, flaccid, or lacking good muscle tone. The same genes (or environmental conditions) that give children terrible health and developmental problems often also affect their brain growth, and conditions that affect their brains also affect their bodies. As we progress from low to average to above-average IQ, we encounter increasingly healthy people.

In most smart people, high IQ doesn’t seem to be a random fluke, a genetic error, or fitness-reducing: in a genetic study of children with exceptionally high IQs, researchers failed to find many genes that specifically endowed the children with genius, but found instead a fortuitous absence of deleterious genes that knock a few points off the rest of us. The same genes that have a negative effect on the nerves and proteins in your brain probably also have a deleterious effect on the nerves and proteins throughout the rest of your body.

And indeed, there are many studies which show a correlation between intelligence and strength (eg, Longitudinal and Cross-Sectional Assessments of Age Changes in Physical Strength as Related to Sex, Social Class, and Mental Ability) or intelligence and overall health/not dying (eg, Intelligence in young adulthood and cause-specific mortality in the Danish Conscription Database (pdf) and The effects of occupation-based social position on mortality in a large American cohort.)

On the other hand, the evolutionary standard for “fitness” isn’t strength or longevity, but reproduction, and on this scale the high-IQ don’t seem to do as well:

Smart teens don’t have sex (or kiss much either): (h/t Gene Expression)

Controlling for age, physical maturity, and mother’s education, a significant curvilinear relationship between intelligence and coital status was demonstrated; adolescents at the upper and lower ends of the intelligence distribution were less likely to have sex. Higher intelligence was also associated with postponement of the initiation of the full range of partnered sexual activities. … Higher intelligence operates as a protective factor against early sexual activity during adolescence, and lower intelligence, to a point, is a risk factor.

Source

Here we see the issue plainly: males at 120 and 130 IQ are less likely to get laid than clinically retarded men in the 70s and 60s. The right side of the graph is the “nerds”; the left side, the “jocks.” Of course, the high-IQ females are even less likely to get laid than the high-IQ males, but males tend to judge themselves against other men, not women, when it comes to dating success. Since the low-IQ females are much less likely to get laid than the low-IQ males, this implies that most of these “popular” guys are dating girls who are smarter than themselves–a fact not lost on the nerds, who would also like to date those girls.

In 2001, the MIT/Wellesley magazine Counterpoint (Wellesley is MIT’s “sister school,” and the two campuses allow cross-enrollment in each other’s courses) published a sex survey that provides a more detailed picture of nerd virginity:

I’m guessing that computer scientists invented polyamory, and neuroscientists are the chads of STEM. The results are otherwise pretty predictable.

Unfortunately, Counterpoint appears to be defunct due to lack of funding/interest and I can no longer find the original survey, but here is Jason Malloy’s summary from Gene Expression:

By the age of 19, 80% of US males and 75% of women have lost their virginity, and 87% of college students have had sex. But this number appears to be much lower at elite (i.e. more intelligent) colleges. According to the article, only 56% of Princeton undergraduates have had intercourse. At Harvard 59% of the undergraduates are non-virgins, and at MIT, only a slight majority, 51%, have had intercourse. Further, only 65% of MIT graduate students have had sex.

The student surveys at MIT and Wellesley also compared virginity by academic major. The chart for Wellesley displayed below shows that 0% of studio art majors were virgins, but 72% of biology majors were virgins, and 83% of biochem and math majors were virgins! Similarly, at MIT 20% of ‘humanities’ majors were virgins, but 73% of biology majors. (Apparently those most likely to read Darwin are also the least Darwinian!)

College Confidential has one paragraph from the study:

How Rolling Stone-ish are the few lucky souls who are doing the horizontal mambo? Well, not very. Considering all the non-virgins on campus, 41% of Wellesley and 32% of MIT students have only had one partner (figure 5). It seems that many Wellesley and MIT students are comfortingly monogamous. Only 9% of those who have gotten it on at MIT have been with more than 10 people and the number is 7% at Wellesley.

Someone needs to find the original study and PUT IT BACK ON THE INTERNET.

But this lack of early sexual success seems to translate into long-term marital happiness, once nerds find “the one.” Lex Fridman’s Divorce Rates by Profession offers a thorough list. The average divorce rate was 16.35%, with a high of 43% (Dancers) and a low of 0% (“Media and communication equipment workers.”)

I’m not sure exactly what all of these jobs are nor exactly which ones should count as STEM (veterinarian? anthropologists?) nor do I know how many people are employed in each field, but I count 49 STEM professions that have lower than average divorce rates (including computer scientists, economists, mathematical science, statisticians, engineers, biologists, chemists, aerospace engineers, astronomers and physicists, physicians, and nuclear engineers,) and only 23 with higher than average divorce rates (including electricians, water treatment plant operators, radio and telecommunication installers, broadcast engineers, and similar professions.) The purer sciences obviously had lower rates than the more practical applied tech fields.

The big outliers were mathematicians (19.15%), psychologists (19.26%), and sociologists (23.53%), though I’m not sure they count (if so, there were only 22 professions with higher than average divorce rates.)

I’m not sure which professions count as “jock” or “chad,” but athletes had lower than average rates of divorce (14.05%) as did firefighters, soldiers, and farmers. Financial examiners, hunters, and dancers, (presumably an athletic female occupation) however, had very high rates of divorce.

Medical Daily has an article on Who is Most Likely to Cheat? The Top 9 Jobs Unfaithful People Have (according to survey):

According to the survey recently taken by the “infidelity dating website,” Victoria Milan, individuals working in the finance field, such as brokers, bankers, and analysts, are more likely to cheat than those in any other profession. However, following those in finance comes those in the aviation field, healthcare, business, and sports.

With the exception of healthcare and maybe aviation, these are pretty typical Chad occupations, not STEM.

The Mirror has a similar list of jobs where people are most and least likely to be married. Most likely: Dentist, Chief Executive, Sales Engineer, Physician, Podiatrist, Optometrist, Farm product buyer, Precision grinder, Religious worker, Tool and die maker.

Least likely: Paper-hanger, Drilling machine operator, Knitter textile operator, Forge operator, Mail handler, Science technician, Practical nurse, Social welfare clerk, Winding machine operative, Postal clerk.

I struggled to find data on male fertility by profession/education/IQ, but there’s plenty on female fertility, eg the deceptively titled High-Fliers have more Babies:

…American women without any form of high-school diploma have a fertility rate of 2.24 children. Among women with a high-school diploma the fertility rate falls to 2.09 and for women with some form of college education it drops to 1.78.

However, among women with college degrees, the economists found the fertility rate rises to 1.88 and among women with advanced degrees to 1.96. In 1980 women who had studied for 16 years or more had a fertility rate of just 1.2.

As the economists prosaically explain: “The relationship between fertility and women’s education in the US has recently become U-shaped.”

Here is another article about the difference in fertility rates between high and low-IQ women.

But female fertility and male fertility may not be the same–I recall data elsewhere indicating that high-IQ men have more children than low IQ men, which implies those men are having their children with low-IQ women. (For example, while Bill and Hillary seem about matched on IQ, and have only one child, Melania Trump does not seem as intelligent as Trump, who has five children.)

Amusingly, I did find data on fertility rate by father’s profession for 1920, in the Birth Statistics for the Birth Registration Area of the US:

Of the 1,508,874 children born in 1920 in the birth registration area of the United States, occupations of fathers are stated for … 96.9%… The average number of children ever born to the present wives of these occupied fathers is 3.3 and the average number of children living 2.9.

The average number of children ever born ranges from 4.6 for foremen, overseers, and inspectors engaged in the extraction of minerals to 1.8 for soldiers, sailors, and marines. Both of these extreme averages are easily explained, for soldiers, sailors, and marines are usually young, while such foremen, overseers, and inspectors are usually in middle life. For many occupations, however, the ages of the fathers are presumably about the same and differences shown indicate real differences in the size of families. For example, the low figures for dentists (2), architects (2.1), and artists, sculptors, and teachers of art (2.2) are in striking contrast with the figures for mine operatives (4.3), quarry operatives (4.1), bootblacks, and brick and stone masons (each 3.9). …

As a rule the occupations credited with the highest number of children born are also credited with the highest number of children living, the highest number of children living appearing for foremen, overseers, and inspectors engaged in the extraction of minerals (3.9) and for steam and street railroad foremen and overseers (3.8), while if we exclude groups plainly affected by the age of fathers, the highest number of children living appears for mine and quarry operatives (each 3.6).

Obviously the job market was very different in 1920–no one was majoring in computer science. Perhaps some of those folks who became mine and quarry operatives back then would become engineers today–or perhaps not. Here are the average numbers of surviving children for the most obviously STEM professions (remember average for 1920 was 2.9):

Electricians 2.1, electrotypers 2.2, telegraph operators 2.2, actors 1.9, chemists 1.8, inventors 1.8, photographers 2.1, physicians 2.1, technical engineers 1.9, veterinarians 2.2.

I don’t know what paper hangers do, but the Mirror said they were among the least likely to be married, and in 1920, they had an average of 3.1 children–above average.

What about athletes? How smart are they?

“Athletes Show Huge Gaps on SAT Scores” is not a promising title for the “nerds are athletic” crew.

The Journal-Constitution studied 54 public universities, “including the members of the six major Bowl Championship Series conferences and other schools whose teams finished the 2007-08 season ranked among the football or men’s basketball top 25.”…

  • Football players average 220 points lower on the SAT than their classmates. Men’s basketball was 227 points lower.
  • University of Florida won the prize for biggest gap between football players and the student body, with players scoring 346 points lower than their peers.
  • Georgia Tech had the nation’s best average SAT score for football players, 1028 of a possible 1600, and best average high school GPA, 3.39 of a possible 4.0. But because its student body is apparently very smart, Tech’s football players still scored 315 SAT points lower than their classmates.
  • UCLA, which has won more NCAA championships in all sports than any other school, had the biggest gap between the average SAT scores of athletes in all sports and its overall student body, at 247 points.

From the original article, which no longer seems to be up on the Journal-Constitution website:

All 53 schools for which football SAT scores were available had at least an 88-point gap between team members’ average score and the average for the student body. …

Football players performed 115 points worse on the SAT than male athletes in other sports.

The differences between athletes’ and non-athletes’ SAT scores were less than half as big for women (73 points) as for men (170).

Many schools routinely used a special admissions process to admit athletes who did not meet the normal entrance requirements. … At Georgia, for instance, 73.5 percent of athletes were special admits compared with 6.6 percent of the student body as a whole.

On the other hand, as Discover Magazine discusses in “The Brain: Why Athletes are Geniuses,” athletic tasks–like catching a fly ball or slapping a hockey puck–require exceptionally fast and accurate brain signals to trigger the correct muscle movements.

Ryan Stegal studied the GPAs of high school student athletes vs. non-athletes and found that the athletes had higher average GPAs than the non-athletes, but he also notes that the athletes were required to meet certain minimum GPA requirements in order to play.

But within athletics, it looks like the smarter athletes perform better than dumber ones, which is why the NFL uses the Wonderlic Intelligence Test:

NFL draft picks have taken the Wonderlic test for years because team owners need to know if their million dollar player has the cognitive skills to be a star on the field.

What does the NFL know about hiring that most companies don’t? They know that regardless of the position, proof of intelligence plays a profound role in the success of every individual on the team. It’s not enough to have physical ability. The coaches understand that players have to be smart and think quickly to succeed on the field, and the closer they are to the ball the smarter they need to be. That’s why every potential draft pick takes the Wonderlic Personnel Test at the combine to prove he does–or doesn’t–have the brains to win the game. …

The first use of the WPT in the NFL was by Tom Landry of the Dallas Cowboys in the early 70s, who took a scientific approach to finding players. He believed players who could use their minds where it counted had a strategic advantage over the other teams. He was right, and the test has been used at the combine ever since.

For the NFL, years of testing shows that the higher a player scores on the Wonderlic, the more likely he is to be in the starting lineup—for any position. “There is no other reasonable explanation for the difference in test scores between starting players and those that sit on the bench,” Callans says. “Intelligence plays a role in how well they play the game.”

Let’s look at Exercising Intelligence: How Research Shows a Link Between Physical Activity and Smarts:

A large study conducted at the Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden, reveals that young adults who regularly exercise have higher IQ scores and are more likely to go on to university.

The study was published in the Proceedings of the National Academy of Sciences (PNAS), and involved more than 1.2 million Swedish men. The men were performing military service and were born between the years 1950 and 1976. Both their physical and IQ test scores were reviewed by the research team. …

The researchers also looked at data for twins and determined that primarily environmental factors are responsible for the association between IQ and fitness, and not genetic makeup. “We have also shown that those youngsters who improve their physical fitness between the ages of 15 and 18 increase their cognitive performance.”…

I have seen similar studies before, some involving mice and some, IIRC, the elderly. It appears that exercise is probably good for you.

I have a few more studies I’d like to mention quickly before moving on to discussion.

Here’s Grip Strength and Physical Demand of Previous Occupation in a Well-Functioning Cohort of Chinese Older Adults (h/t prius_1995), which found that participants who had previously worked in construction had greater grip strength than former office workers.

Age and Gender-Specific Normative Data of Grip and Pinch Strength in a Healthy Adult Swiss Population (h/t prius_1995).


If the nerds are in the sedentary cohort, then they may be just as athletic, if not more athletic, than all of the other cohorts except the heavy-work group.

However, in Revised normative values for grip strength with the Jamar dynamometer, the authors found no effect of profession on grip strength.

And Isometric muscle strength and anthropometric characteristics of a Chinese sample (h/t prius_1995).

And Pumpkin Person has an interesting post about brain size vs. body size.


Discussion: Are nerds real?

Overall, it looks like smarter people are more athletic, more athletic people are smarter, smarter athletes are better athletes, and exercise may make you smarter. For most people, the nerd/jock dichotomy is wrong.

However, there is very little overlap at the very highest end of the athletic and intelligence curves–most college (and thus professional) athletes are less intelligent than the average college student, and most college students are less athletic than the average college (and professional) athlete.

Additionally, while people with STEM degrees make excellent spouses (except for mathematicians, apparently,) their reproductive success is below average: they have sex later than their peers and, as far as the data I’ve been able to find shows, have fewer children.

Stephen Hawking

Even if there is a large overlap between smart people and athletes, they are still separate categories selecting for different things: a cripple can still be a genius, but can’t play football; a dumb person can play sports, but not do well at math. Stephen Hawking can barely move, but he’s still one of the smartest people in the world. So the set of all smart people will always include more “stereotypical nerds” than the set of all athletes, and the set of all athletes will always include more “stereotypical jocks” than the set of all smart people.

In my experience, nerds aren’t socially awkward (aside from their shyness around women.) The myth that they are stems from the fact that they have different interests and communicate in a different way than non-nerds. Let nerds talk to other nerds, and they are perfectly normal, communicative, socially functional people. Put them in a room full of non-nerds, and suddenly the nerds are “awkward.”

Unfortunately, the vast majority of people are not nerds, so many nerds have to spend the majority of their time in the company of lots of people who are very different than themselves. By contrast, very few people of normal IQ and interests ever have to spend time surrounded by the very small population of nerds. If you did put them in a room full of nerds, however, you’d find that suddenly they don’t fit in. The perception that nerds are socially awkward is therefore just normie bias.

Why did the nerd/jock dichotomy become so popular in the 70s? Probably in part because science and technology were really taking off as fields normal people could aspire to major in: man had just landed on the moon, and the Intel 4004 was released in 1971. Very few people went to college or were employed in the sciences back in 1920; by 1970, colleges were everywhere and science was booming.

And at the same time, colleges and high schools were ramping up their athletics programs. I’d wager that the average school in the 1800s had neither PE nor athletics of any sort. To find those, you’d probably have to attend private academies like Andover or Exeter. By the 70s, though, schools were taking their athletics programs–even athletic recruitment–seriously.

How strong you felt the dichotomy probably depends on the nature of your school. I have attended schools where all of the students were fairly smart and there was no anti-nerd sentiment, and I have attended schools where my classmates were fiercely anti-nerd and made sure I knew it.

But the dichotomy predates the terminology. Take Superman, who first appeared in 1938. His disguise is a pair of glasses, because no one can believe that the bookish, mild-mannered Clark Kent is actually the super-strong Superman. Batman is based on the character of Zorro, created in 1919. Zorro is an effete, weak, foppish nobleman by day and a dashing, sword-fighting hero of the poor by night. Of course these characters are both smart and athletic, but their disguises only work because others do not expect them to be. As fantasies, the characters are powerful because they provide a vehicle for our own desires: for our everyday normal failings to be just a cover for how secretly amazing we are.

But for the most part, most smart people are perfectly fit, healthy, and coordinated–even the ones who like math.


Navigation and the Wealth of Nations

Global Determinants of Navigational Ability, by Coutrot et al:

Using a mobile-based virtual reality navigation task, we measured spatial navigation ability in more than 2.5 million people globally. Using a clustering approach, we find that navigation ability is not smoothly distributed globally but clustered into five distinct yet geographically related groups of countries. Furthermore, the economic wealth of a nation (Gross Domestic Product per capita) was predictive of the average navigation ability of its inhabitants and gender inequality (Gender Gap Index) was predictive of the size of performance difference between males and females. Thus, cognitive abilities, at least for spatial navigation, are clustered according to economic wealth and gender inequalities globally.

This is an incredible study. They got 2.5 million people from all over the world to participate.

If you’ve been following any of the myriad debates about intelligence, IQ, and education, you’re probably familiar with the concept of “multiple intelligences” and the fact that there’s rather little evidence that people actually have “different intelligences” that operate separately from each other. In general, it looks like people who have brains that are good at working out how to do one kind of task tend to be good at working out other sorts of tasks.

I’ve long held navigational ability as a possible exception to this: perhaps people in, say, Polynesian societies depended historically far more on navigational abilities than the rest of us, even though math and literacy were nearly absent.

Unfortunately, it doesn’t look like the authors got enough samples from Polynesia to include it in the study, but they did get data from Indonesia and the Philippines, which I’ll return to in a moment.

Frankly, I don’t see what the authors mean by “five distinct yet geographically related groups of countries.” South Korea is ranked between the UK and Belgium; Russia is next to Malaysia; Indonesia is next to Portugal and Hungary.

GDP per capita appears to be a stronger predictor than geography:

Some people will say these results merely reflect experience playing video games–people in wealthier countries have probably spent more time and money on computers and games. But assuming that the people who are participating in the study in the first place are people who have access to smartphones, computers, video games, etc., the results are not good for the multiple-intelligences hypothesis.

In the GDP per Capita vs. Conditional Modes graph (the conditional mode measures how well a nation scored overall, with lower scores indicating better navigation), countries above the trend line are under-performing relative to their GDPs, and countries below the line are over-performing relative to their GDPs.
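“Over-” and “under-performing relative to GDP” is just the sign of a residual from that trend line. Here’s a minimal sketch of the computation (Python, with invented numbers rather than the study’s actual data):

```python
import numpy as np

# Invented data for illustration: GDP per capita (thousands of USD) and
# conditional mode (lower = better navigation), one pair per country.
gdp = np.array([5, 10, 20, 40, 60])
cm = np.array([0.8, 0.5, 0.1, -0.3, -0.6])

# Fit the trend line predicting navigation score from wealth.
slope, intercept = np.polyfit(gdp, cm, 1)
residuals = cm - (slope * gdp + intercept)

# A negative residual is a better (lower) score than GDP alone predicts:
# the country "over-performs" relative to its wealth.
for g, r in zip(gdp, residuals):
    print(f"GDP {g}k: residual {r:+.3f}")
```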

South Africa, for example, significantly over-performs relative to its GDP, probably due to sampling bias: white South Africans with smartphones and computers were probably more likely to participate in the study than the nation’s 90% black population, but the GDP reflects the entire population. Finland and New Zealand are also under-performing economically, perhaps because Finland is really cold and NZ is isolated.

On the other side of the line, the UAE, Saudi Arabia, and Greece over-perform relative to GDP. Two of these are oil states that would be much poorer if not for geographic chance, and as far as I can tell, the whole Greek economy is being propped up by German loans. (There is also evidence that Greek IQ is falling, though this may be a near universal problem in developed nations.)

Three other nations stand out in the “scoring better than GDP predicts” category: Ukraine, (which suffered under Communism–Communism seems to do bad things to countries,) Indonesia and the Philippines. While we could be looking at selection bias similar to South Africa, these are island nations in which navigational ability surely had some historical effect on people’s ability to survive.

Indonesia and the Philippines still didn’t do as well as first-world nations like Norway and Canada, but they outperformed other nations with similar GDPs like Egypt, India, and Macedonia. This is the best evidence I know of for independent selection for navigational ability in some populations.

The study’s other interesting findings were that women performed consistently worse than men, both across countries and age groups (except for the post-90 cohort, but that might just be an error in the data.) Navigational ability declines steeply for everyone post-23 years old until about 75 years; the authors suggest the subsequent increase in abilities post-70s might be sampling error due to old people who are good at video games being disproportionately likely to seek out video game related challenges.

The authors note that people who drive more (eg, the US and Canada) might do better on navigational tasks than people who use public transportation more (eg, Europeans) but also that Finno-Scandians are among the world’s best navigators despite heavy use of public transport in those countries. The authors write:

We speculate that this specificity may be linked to Nordic countries sharing a culture of participating in a sport related to navigation: orienteering. Invented as an official sport in the late 19th century in Sweden, the first orienteering competition open to the public was held in Norway in 1897. Since then, it has been more popular in Nordic countries than anywhere else in the world, and is taught in many schools [26]. We found that ‘orienteering world championship’ country results significantly correlated with countries’ CM (Pearson’s correlation ρ = .55, p = .01), even after correcting for GDP per capita (see Extended Data Fig. 15). Future targeted research will be required to evaluate the impact of cultural activities on navigation skill.

I suggest a different causal relationship: people make hobbies out of things they’re already good at and enjoy doing, rather than things they’re bad at.


Please note that the study doesn’t look at a big chunk of countries, like most of Africa. Being at the bottom in navigational abilities in this study by no means indicates that a country is at the bottom globally–given the trends already present in the data, it is likely that the poorer countries that weren’t included in the study would do even worse.

Politics are Getting Dumber

You don’t need to watch the video. I haven’t watched the video. I’m only highlighting it because it starts with a moronic question.

Meanwhile, in the social justice warriors vs inanimate objects department:

Kick that statue! Yeah! You show that big chunk of metal who’s boss!

And in inanimate objects vs. inanimate objects:

CNN is impressed by the fact that statues (normally) don’t move.

This one is stupid on several levels–the statue itself, erected by a male-dominated industry to celebrate “female empowerment,” infantilizes women by symbolically depicting them as a small, stupid child who doesn’t know enough to get out of the way of a charging bull.

You know, I could keep posting examples of stupidity all day.

Mob mentality is never good, but it seems like political discourse is getting progressively stupider.

It takes a certain level of intelligence to do two critical things:

  1. Understand and calmly discuss other people’s opinions even when you disagree with them
  2. Realize that cooperating in the prisoner’s dilemma is long-term better than defecting, even if you don’t like the people you’re cooperating with
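Point 2 is easy to demonstrate with a toy iterated prisoner’s dilemma (a minimal Python sketch using the standard textbook payoffs; the strategies are illustrative toys, not a model of actual politics):

```python
# Standard payoffs (points for the row player): C = cooperate, D = defect.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def play(strategy_a, strategy_b, rounds=100):
    """Run an iterated prisoner's dilemma and return both players' totals."""
    history_a, history_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(history_b)  # each strategy sees the opponent's past moves
        move_b = strategy_b(history_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        history_a.append(move_a)
        history_b.append(move_b)
    return score_a, score_b

def always_defect(opponent_history):
    return 'D'

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror whatever the opponent did last round.
    return opponent_history[-1] if opponent_history else 'C'

print(play(tit_for_tat, tit_for_tat))    # (300, 300): steady mutual cooperation
print(play(always_defect, tit_for_tat))  # (104, 99): one exploited round, then mutual punishment
```

Defecting wins the first round (5 points against 0) and loses every round thereafter; over 100 rounds the defector’s 104 points are dwarfed by the cooperators’ 300 apiece.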

Traditional “liberalism”* was a kind of meta-political technology for allowing different groups of people to live in one country without killing each other. Freedom of Religion, for example, became an agreed-upon principle after centuries of religious violence in Europe. If the state is going to promote a particular religion and outlaw others, then it’s in every religious person’s interest to try to take over the state and make sure it enforces their religion. If the state stays (ostensibly) neutral, then no one can commandeer it to murder their religious enemies.

*”Liberal” has in recent years become an almost empty anachronism, but I hope its meaning is clear in the historical context of 1787.

Freedom of Speech, necessary for people to make informed decisions, has recently come under attack for political reasons. Take the thousands of protestors who showed up to an anti-Free Speech rally in Boston on Sat, August 19th.

The Doublespeak is Strong with this One

Of course no one likes letting their enemies speak, but everyone is someone else’s enemy. Virtually every historical atrocity was committed by people convinced that they were right and merely opposing evil, despicable people. Respecting free speech does not require liking other people’s arguments. It requires understanding that if you start punching Nazis, Nazis will punch you back, and soon everyone will be screaming “Nazi!” while punching random people.

Edit: apparently one article I linked to was a hoax. Hard to tell sometimes.

Now, Free Speech has often been honored more as an ideal than a reality. When people are out of power, they tend to defend the ideal rather strongly; when in power, they suddenly seem to lose interest in it. But most people interested in politics still seemed to have some general sense that even if they hated that other guy’s guts, it might be a bad idea to unleash mob violence on him.

In general, principles like free speech and freedom of religion let different people–and different communities of people–run their own lives and communities as they see fit, without coming into direct conflict with each other, while still getting to enjoy the national security and trade benefits of living in a large country. The Amish get to be Amish, Vermonters get to live free or die, and Coloradans get to eat pot brownies.

But that requires being smart enough to understand that to keep a nation of over 300 million people together, you have to live and let live–and occasionally hold your nose and put up with people you hate.

These days, politics just seems like it’s getting a lot dumber:

Cat that nearly died after being attacked by a thug “because he looks like Hitler” has now recovered despite losing an eye.

An attempt to answer some questions on IQ

I recently received a few IQ-related questions. Now, IQ is not my specialty, so I do not feel particularly adequate for the task, but I’ll do my best. I recommend anyone really interested in the subject read Pumpkin Person’s blog, as he really enjoys talking about IQ all the time.

  1. I wanted to ask if you know any IQ test on the internet that is an equivalent to the reliable tests given by psychologists?

I suppose it depends on what you want the test for. Curiosity? Diagnosis? Personally, I suspect that the average person isn’t going to learn very much from an IQ test that they didn’t already know just from living (similarly, I don’t think you’re going to discover that you’re an introvert or extrovert by taking an online quiz if you didn’t know it already from interacting with people,) but there are cases where people might want to take an IQ test, so let’s get searching.

Pumpkin Person speaks highly of the Wechsler Intelligence Scales (it comes in adult and child versions.) According to Wikipedia:

The Wechsler Adult Intelligence Scale (WAIS) is an IQ test designed to measure intelligence and cognitive ability in adults and older adolescents.[1] The original WAIS (Form I) was published in February 1955 by David Wechsler, as a revision of the Wechsler–Bellevue Intelligence Scale, released in 1939.[2] It is currently in its fourth edition (WAIS-IV) released in 2008 by Pearson, and is the most widely used IQ test, for both adults and older adolescents, in the world.

Since IQ tests excite popular interest but no one really wants to pay $1,000 just to take a test, the internet is littered with “free” tests of questionable quality. For example, WeschlerTest.com offers “free sample tests,” but the bottom of the website notes that, “Disclaimer: This is not an official Wechsler test and is only for entertainment purposes. Any scores derived from it may not accurately reflect the score you would attain on an official Wechsler test.” Here is a similar website that offers free Stanford-Binet tests.

I am not personally in a position to judge if these are any good.

It looks like the US military has put its Armed Services Vocational Aptitude Battery online, or at least a practice version. This seems like one of the best free options, because the army is a real organization that’s deeply interested in getting accurate results and the relationship between the ASVAB and other IQ tests is probably well documented. From the website:

The ASVAB is a timed test that measures your skills in a number of different areas. You complete questions that reveal your skills in paragraph comprehension, word knowledge, arithmetic reasoning and mathematics knowledge. These are basic skills that you will need as a member of the U.S. military. The score you receive on the ASVAB is factored into your Armed Forces Qualifying Test (AFQT) score. This score is used to figure out whether you qualify to enlist in the armed services. …

The ASVAB was created in 1968. By 1976, all branches of the military began using this test. In 2002, the test underwent many revisions, but its main goal of gauging a person’s basic skills remained the same. Today, there is a computerized version of the test as well as a written version. The Department of Defense developed this test and it’s taken by students in thousands of schools across the country. It is also given at Military Entrance Processing Stations (MEPS).

Naturally, each branch of the United States armed services wants to enlist the best, most qualified candidates each year. The ASVAB is a tool that helps in the achievement of that purpose. Preparing to take the ASVAB is just one more step in the journey toward your goal of joining the U.S. armed services. …

Disclaimer: The tests on this website are for entertainment purposes only, and may not accurately reflect the scores you would attain on a professionally administered ASVAB test.

The blog Random Critical Analysis gives a thorough rundown of the correlation between ASVAB and IQ scores (they are highly correlated) along with the SAT and ACT.

Additionally, there are a couple of tests linked in Lipscomb’s Intelligence Course Lab: the Classical IQ Test from Psychology Today and IQTest.com.

Drawing a page from Pumpkin Person’s book, I recommend taking several different tests and then comparing results. Use your good judgment about whether a particular test seems reliable–is it covered in ads? Does random guessing get you a score of 148? Did you get a result similar to what you’d expect based on real life experiences?

2. Besides that, I wanted to ask you: how much are social class and IQ correlated?

A fair amount.

Thanks to Tino Sanandaji

With thanks to Pumpkin Person

Really dumb people are too dumb to commit as much crime as mildly dumb people

IQ by country–red = low; purple = high. Source: Wikipedia

I do wonder why he made the graph so much bigger than the relevant part
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article
Oh, there you are, correlation
Lifted gratefully from La Griffe Du Lion’s Smart Fraction II article

When dumb children are born to rich people, they tend to do badly in life and don’t make much money; they subsequently sink in social status. When smart children are born to poor people, they tend to do well in life and rise in social status. Even in societies with strict social classes where moving from class to class is effectively impossible, we should still expect that really dumb people born into wealth will squander it, leading to their impoverishment. Likewise, among the lower classes, we would still expect that smarter low-class people would do better in life than dumber ones.

This is all somewhat built into the entire definition of “IQ” and what people were trying to measure when they created the tests.

3. Basically, do traditional upper classes form separate genetic clusters, as Gregory Clark claims?

I haven’t read Clark’s book, but I’m sure the pathetic amount of research I can do here would be nothing compared to what he’s amassed.

There are a number of studies on assortative mating and IQ, eg: Spouse similarity for IQ and personality and convergence:

A similar pattern of spousal association for IQ scores and personality traits was found in two British samples from Oxford and Cambridge. There was no indirect evidence from either sample to suggest that convergence occurred during marriage. All observed assortative mating might well be due to initial assortment.

Assortative mating for psychiatric disorders and psychological traits:

This article reviews the literature on assortative mating for psychological traits and psychiatric illness. Assortative mating appears to exist for personality traits, but to a lesser degree than that observed for physical traits, sociodemographic traits, intelligence, and attitudes and values. Concordance between spouses for psychiatric illness has also been consistently reported in numerous studies. This article examines alternative explanations for such observed concordance and discusses the effects of assortative mating on population genetics and the social environment.

Do assortative mating patterns for IQ block upward social mobility?

In the Minnesota Twin Family Study, assortative mating for IQ was greater than .3 in both the 11- and 17-year-old cohorts. Recognizing this, genetic variance in IQ independent of SES was greater with higher parental SES in the 11-year-old cohort. This was not true, however, in the 17-year-old cohort. In both cohorts, people of higher IQ were more likely to have ‘married down’ for IQ than people of lower IQ were to have ‘married up’. This assortative mating pattern would create greater genetic diversity for IQ in people of higher IQ than in people of lower IQ. As IQ is associated with SES, the pattern could be one reason for the observation of greater genetic variance in IQ independent of SES with greater parental SES in several samples. If so, it could block upward social mobility among those already in lower-SES groups. I discuss possible involved mechanisms and social implications.

The role of personality and intelligence in assortative mating:

Assortative mating is the individuals’ tendency to mate with those who are similar to them in some variables, at a higher rate than would be expected from random. This study aims to provide empirical evidence of assortative mating through the Big Five model of personality and two measures of intelligence using Spanish samples. The sample consisted of 244 Spanish couples. It was divided into two groups according to relationship time. The effect of age, educational level and socioeconomic status was controlled. The results showed strong assortative mating for intelligence and moderate for personality. The strongest correlations for Personality were found in Openness, Agreeableness and Conscientiousness.

Assortative mating for IQ and personality due to propinquity and personal preference:

The role of personal preference as an active process in mate selection is contrasted with the more passive results of limitations of available mates due to social, educational, and geographical propinquity. The role of personal preference estimated after removing the effects of variables representing propinquity was still significant for IQ and Eysenck’s extraversion-introversion and inconsistency (lie) scales, even though small.
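
For intuition, here is a quick toy simulation of what a spousal IQ correlation of about .3 (the Minnesota figure above) looks like. The blend-plus-noise construction, the sample size, and the standard IQ scaling (mean 100, SD 15) are my own illustrative choices, not anything taken from the studies:

```python
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(0)
R = 0.3  # target spousal correlation, per the Minnesota study quoted above

spouses_a, spouses_b = [], []
for _ in range(100_000):
    a = random.gauss(100, 15)  # first spouse, drawn at random
    # Second spouse: R-weighted blend of the first plus independent noise,
    # scaled so the marginal distribution is still mean 100, SD 15.
    b = 100 + R * (a - 100) + random.gauss(0, 15 * (1 - R**2) ** 0.5)
    spouses_a.append(a)
    spouses_b.append(b)

print(round(statistics.correlation(spouses_a, spouses_b), 2))  # ~0.3
```

Even at .3, individual couples vary wildly; the correlation only shows up in the aggregate.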

Related: Heritability estimates versus large environmental effects: the IQ paradox resolved:

Some argue that the high heritability of IQ renders purely environmental explanations for large IQ differences between groups implausible. Yet, large environmentally induced IQ gains between generations suggest an important role for environment in shaping IQ. The authors present a formal model of the process determining IQ in which people’s IQs are affected by both environment and genes, but in which their environments are matched to their IQs. The authors show how such a model allows very large effects for environment, even incorporating the highest estimates of heritability. Besides resolving the paradox, the authors show that the model can account for a number of other phenomena, some of which are anomalous when viewed from the standard perspective.
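
The "matching" mechanism that abstract describes is easy to illustrate with a toy feedback loop. To be clear, this is my own sketch of the general idea, not the paper's actual model, and every parameter in it is invented:

```python
MATCH = 0.5   # invented: how strongly environment tracks current ability
STEPS = 20    # iterations for the feedback loop to settle

def settled_ability(genetic_edge):
    """Iterate the ability -> environment -> ability feedback loop."""
    ability = genetic_edge
    for _ in range(STEPS):
        environment = MATCH * ability          # environment "matched" to IQ
        ability = genetic_edge + environment   # matched environment feeds back
    return ability

# A 5-point genetic head start settles at ~10 points of ability
# (the fixed point is genetic_edge / (1 - MATCH)): genes set the
# starting point, but half the final gap is environmental work.
print(round(settled_ability(5.0), 2))  # ~10.0
```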


4. Are upper class people genetically more intelligent? Or is there an effect of regression to the mean and all classes have about equal chances to spawn high-IQ people?

Stephen Hsu has a relevant post on the subject: Assortative mating, regression and all that: offspring IQ vs parental midpoint:

…James Lee, a real expert in the field, sent me a current best estimate for the probability distribution of offspring IQ as a function of parental midpoint (average between the parents’ IQs). James is finishing his Ph.D. at Harvard under Steve Pinker — you might have seen his review of R. Nisbett’s book Intelligence and How to Get It: Why Schools and Cultures Count.

The results are stated further below. Once you plug in the numbers, you get (roughly) the following:

Assuming parental midpoint of n SD above the population average, the kids’ IQ will be normally distributed about a mean which is around +.6n with residual SD of about 12 points. (The .6 could actually be anywhere in the range (.5, .7), but the SD doesn’t vary much from choice of empirical inputs.)…

Read Hsu’s post for the rest of the details.

In short, while regression to the mean works for everyone, different people regress to different means depending on how smart their particular ancestors were. For example, if two people of IQ 100 have a kid with an IQ of 140 (Kid A), and two people of IQ 120 have a kid of IQ 140 (Kid B), Kid A’s own kids are likely to regress toward 100, while Kid B’s kids are likely to regress toward 120.
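
To make Hsu's numbers concrete, here is a minimal sketch in Python. The 0.6 coefficient and the 12-point residual SD come straight from the quote above; the population mean of 100 and SD of 15 are standard IQ scaling, and the function itself is just for illustration:

```python
# Offspring-IQ estimate per the numbers quoted from Hsu's post.
POP_MEAN, POP_SD = 100, 15   # standard IQ scaling (assumption)
REGRESSION = 0.6             # Hsu: anywhere in (0.5, 0.7)
RESIDUAL_SD = 12             # roughly constant across empirical inputs

def offspring_iq(parent1_iq, parent2_iq):
    """Return (mean, sd) of the predicted offspring IQ distribution."""
    midpoint = (parent1_iq + parent2_iq) / 2
    n = (midpoint - POP_MEAN) / POP_SD         # midpoint in SD units
    mean = POP_MEAN + REGRESSION * n * POP_SD  # regress ~60% of the way back
    return mean, RESIDUAL_SD

print(offspring_iq(120, 120))  # two IQ-120 parents -> kids centered ~112
print(offspring_iq(100, 100))  # average parents -> kids centered at 100
```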

We can look at the effects of parental SES on SAT scores and the like:

[Graph: SAT scores by race and parental income]

Personally, I know plenty of extremely intelligent people who come from low-SES backgrounds, but few of them ended up low-SES. Overall, I’d expect highly intelligent people to move up in status and less intelligent people to move down over time, with the upper class thus sort of “collecting” high-IQ people, but there are obviously regional and cultural effects that may make it inappropriate to compare across groups.

Hope that has been useful.

What Mental Traits does the Arctic Select for?

Apropos Friday’s conversation about the transition from hunting to pastoralism and the different strategies hunters employ in different environments, I got to thinking about how these different food-production systems could influence the development of different “intelligences,” or at least mental processes that underlie intelligence.

Ingold explains that in warm climes, hunter-gatherers have many food resources they can exploit, and if one resource starts running low, they can fairly easily switch to another. If there aren’t enough yams around, you can eat melons; if not enough melons, squirrels; if no squirrels, eggs. I recall a study of Australian Aborigines who agreed to go back to hunter-gathering for a while after living in town for several decades. Among other things (like increased health,) scientists noted that the Aborigines increased the number of different kinds of foods they consumed from, IIRC, about 40 per week to 100.

By contrast, hunters in the arctic are highly dependent on exploiting only a few resources–fish, seals, reindeer, and perhaps a few polar bears and foxes. Ingold claims that there are (were) tribes that depended largely on only a few major hunts of migrating animals (netting hundreds of kills) to supply themselves for the whole year.

If those migrating animals change their course by even a few miles, it’s easy to see how the hunters could miss the herds entirely and, with no other major species around to exploit, starve over the winter.

Let’s consider temperate agriculture as well: the agriculturalist can store food better than the arctic hunter (seal meat does not do good things in the summer,) but lacks the tropical hunter-gatherer’s flexibility; he must stick to his fields and keep working, day in and day out, for a good nine months in a row. Agricultural work is more flexible than assembly line work, where your every minute is dictated by the needs of the factory, but a farmer can’t just wander away from his crops to go hunt for a month just because he feels like it, nor can he hope to make up for a bad wheat harvest by wandering into his neighbor’s fields and picking their potatoes.

Which got me thinking: clearly different people are going to do better at different systems.

But first, what is intelligence? Obviously we could define it in a variety of ways, but let’s stick to reasonable definitions, e.g., the ability to use your brain to achieve success, or the ability to get good grades on your report card.

A variety of mental traits contribute to “intelligence,” such as:

  1. The ability to learn lots of information. Information is really useful, both in life and on tests, and smarter brains tend to be better at storing lots and lots of data.
  2. Flexible thinking. This is the ability to draw connections between different things you’ve learned, to be creative, to think up new ideas, etc.
  3. Some form of Drive, Self Will, or long-term planning–that is, the ability to plan for your future and then push yourself to accomplish your goals. (These might more properly be two different traits, but we’ll keep them together for now.)

Your stereotypical autistic, capable of memorizing large quantities of data but not doing much with them, has trait #1 but not 2 or 3.

Artists and musicians tend to have a lot of trait #2, but not necessarily 1 or 3 (though successful artists obviously have a ton of #3).

And an average kid who’s not that bright but works really hard, puts in extra hours of effort on their homework, does extra credit assignments, etc., has a surfeit of #3 but not much 2 or 1.

Anyway, it seems to me like the tropical hunting/gathering environment, with many different species to exploit, would select for flexible thinking–if one food isn’t working out, look for a different one. This may also apply to people from tropical farming/horticulturalist societies.

By contrast, temperate farming seems more likely to select for planning–you can’t just wander off or try to grow something new in time for winter if your first crop doesn’t work out.

Many people have noted that America’s traditionally tropical population (African Americans) seems to be particularly good at flexible thinking, leading to much innovation in arts and music. They are not as talented, though, at Drive, leading to particularly high high school dropout rates.

America’s traditionally rice-farming population (Asians,) by contrast, has been noted for over a century for its particularly high drive and ability to plan for the future, but not so much for contributions to the arts. East Asian people are noted for their particularly high IQ/SAT/PISA scores, despite the fact that China lags behind the West in GDP and quality-of-life terms. (Japan, of course, is a fully developed country.) One potential explanation for this is that the Chinese, while very good at working extremely hard, aren’t as good at the flexible thinking that would help spur innovation. (I note that the Japanese seem to do just fine at flexible thinking, but you know, the Japanese aren’t Chinese and Japan isn’t China.)

(I know I’m not really stating anything novel.) But the real question is:

What kind of mental traits might pastoralism, arctic pastoralism, or arctic hunting select for?