Evolution is slow–until it’s fast: Genetic Load and the Future of Humanity

Source: Priceonomics

A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.

But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with English moths, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees they lived on. To survive, these moths need to avoid being eaten by birds, so any moth that stands out against the tree trunks tends to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten. Against dark-colored trees, light-colored moths stood out instead.

This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.

The time it takes for evolution to occur depends on two simple things: (A) the frequency of a trait in the population, and (B) how strongly you are selecting for (or against) it.

Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.

(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)

Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)

You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).

Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.

Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.

No matter what kind of selection you have, or what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.

Source: SUPS.org
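
For the quantitatively inclined, here’s a toy sketch of my own (made-up numbers, not real moth data) showing how fast a rare dark type takes over once light moths leave, say, only half as many surviving offspring:

```python
# A toy sketch (my own illustration, assumed numbers): if light moths leave only
# half as many surviving offspring as dark moths, a dark type starting at 1% of
# the population takes over within a couple dozen generations.
def next_generation(p_dark, w_dark=1.0, w_light=0.5):
    """One generation of selection; w_* are assumed relative reproductive successes."""
    mean_fitness = p_dark * w_dark + (1 - p_dark) * w_light
    return p_dark * w_dark / mean_fitness

p = 0.01                        # dark moths rare, as around 1810
for generation in range(1, 21):
    p = next_generation(p)
print(f"after 20 generations: {p:.0%} of moths are dark")
```

The exact numbers are assumptions; the point is that a two-to-one difference in reproductive success turns a 1% trait into a nearly universal one in a couple dozen generations, which is why the 1810-to-1895 flip is no mystery.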

So what happens when you completely remove selective pressures from a population?

Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.

Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.
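
For perspective, a quick back-of-the-envelope conversion of my own, assuming steady exponential growth between counts: a 55-day doubling time corresponds to a per-capita growth rate of ln(2)/55 ≈ 1.3% per day, while a 145-day doubling time corresponds to ln(2)/145 ≈ 0.5% per day–the colony’s per-capita growth rate had already fallen by a factor of roughly 2.6.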

The birth rate continued to decline. As births and deaths reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.


As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.

It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?

Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.

What is genetic load?

As I mentioned before, every time a cell replicates, a certain number of errors–mutations–occur. Occasionally these mutations are useful, but the vast majority of them are not. About 30-50% of pregnancies end in miscarriage (the percent of miscarriages people recognize is lower because embryos often miscarry before causing any overt signs of pregnancy,) and the majority of those miscarriages are caused by genetic errors.

Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.

But only the worst genetic problems never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.

While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:

Differences in human height is 60–80% heritable, according to several twin studies[19] and has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height.[20] The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.[21]

Obviously most of these genes each play only a small role in determining overall height (and this is of course holding environmental factors constant.) There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.

The situation with IQ is similar:

Intelligence in the normal range is a polygenic trait, meaning it’s influenced by more than one gene.[3][4]

The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults.[5][6] In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores;[8] however, poor prenatal environment, malnutrition and disease can have deleterious effects.[9][10]

And from a recent article published in Nature Genetics, Genome-wide association meta-analysis of 78,308 individuals identifies new loci and genes influencing human intelligence:

Despite intelligence having substantial heritability [2] (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered [3, 4, 5]. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10^−8) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10^−6), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10^−6). Despite the well-known difference in twin-based heritability [2] for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (r_g = 0.89, LD score regression P = 5.4 × 10^−29). These findings provide new insight into the genetic architecture of intelligence.

The greater the number of genes that influence a trait, the harder they are to identify without extremely large studies, because any small group of people might not even have the same set of relevant genes.
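
To see why sample size matters so much, here is a rough power sketch of my own (the per-SNP effect size is an assumption, roughly in line with the sort of hits reported for height and IQ): a single SNP explaining 0.04% of trait variance is essentially invisible with 10,000 people and near-certain to be found with 250,000.

```python
# A rough power calculation (my own illustration, assumed effect size): how large
# a GWAS sample is needed to detect one SNP explaining 0.04% of trait variance
# at the genome-wide significance threshold.
from scipy.stats import chi2, ncx2

alpha = 5e-8                           # conventional genome-wide significance level
r2 = 0.0004                            # assumed variance explained by a single SNP
threshold = chi2.ppf(1 - alpha, df=1)  # chi-square cutoff for the 1-df association test

for n in (10_000, 50_000, 100_000, 250_000):
    ncp = n * r2 / (1 - r2)            # non-centrality parameter of the test
    power = ncx2.sf(threshold, df=1, nc=ncp)
    print(f"N = {n:>7,}: power ≈ {power:.2f}")
```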

High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.

Like height, most of the genetic variation in IQ is the combined result of many genes. We’ve definitely found some mutations that result in abnormally low IQ, but so far we have yet (AFAIK) to find any genes that produce the IQ equivalent of gigantism. In other words, low (genetic) IQ is caused by genetic load. From Small Yet Important Genetic Differences Between Highly Intelligent People and General Population:

The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.

The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.

Maternal mortality rates over time, UK data

Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ are also associated with terrible conditions, like autism, Tay-Sachs, and torsion dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventional high IQ has a cost: increased maternal mortality (larger heads).

Wikipedia defines genetic load as:

the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall.[5] Most mutations are deleterious, and occur at a high rate.

There’s math, if you want it.
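
For reference, the textbook formula (standard notation, not the author’s): load = (w_max − w_mean) / w_max, where w_max is the fitness of the best (or theoretically optimal) genotype and w_mean is the population’s average fitness. Load is simply the fraction by which average fitness falls short of the reference.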

Normally, genetic mutations are removed from the population at a rate determined by how bad they are. The worst mutations kill the embryo outright, so their carriers are never born. Slightly less bad mutations might let you survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, yet leave you with slightly fewer children than your neighbors; over many generations, such a mutation will eventually disappear.
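
To put rough numbers on “eventually disappear”: the standard mutation–selection balance result (a textbook approximation, not the author’s math) says that a mutation arising at rate u per generation, and costing its carriers a fraction s of their fitness, settles at an equilibrium frequency of roughly u/s if a single copy carries the full cost, and roughly √(u/s) if the mutation is fully recessive. With u = 10^−5 and s = 0.01, that works out to about 0.1% versus about 3%–the milder and more hidden the effect, the more common the mutation remains.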

(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)

Jakubany is a town in the Carpathian Mountains

Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:

We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.

The graph’s creator states:

When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118 year period, of the 13306 children who were born, 2958 died (~22%) before reaching the age of one.

Historical infant mortality rates can be difficult to calculate, in part because they were so high that people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:

Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.[12]

Priceonomics: Why life expectancy is misleading

In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)

Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).

Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates of infant mortality rates between 30 and 50%. I recall a study on Anasazi nutrition, which I sadly can’t locate right now, that found 100% malnutrition rates among adults (based on enamel hypoplasias,) and 50% infant mortality.

As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60-year-old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60-year-old could expect to live to about 80.
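
A toy illustration of my own, with made-up numbers: if 30% of children die around birth and the survivors live to an average of 60, then life expectancy at birth is 0.3 × 0 + 0.7 × 60 = 42 years–even though hardly anyone actually dies in their early forties. Cutting infant mortality alone drags the headline number way up without adding a single day to adult lifespans.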

Back in his post on Mousetopia, Charlton writes:

Michael A Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …

The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …

Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.

However, in Mouse Utopia, there is no predation and all the other causes of mortality (eg. Starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.

Historically speaking, another selective factor operated on humans: while about 67% of women reproduced, only 33% of men did. By contrast, according to Psychology Today, a majority of today’s men have or will have children.

Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.


Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data.) Today, infant mortality is almost non-existent, but voluntary childlessness has now inverted this phenomenon:

Yes, the percent of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with postgraduate degrees have 4 or more children, 26% of those who never graduated from high school have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.

In other words, the strongest remover of genetic load–infant mortality–has all but disappeared; populations with higher load (lower IQ) are having more children than populations with lower load; and everyone is having children later, which also increases genetic load.

Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.

Infant mortality claims half their children, taking the least healthy. They are left with 4 bright children and 2 moderately intelligent children. The four brightest children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the two moderately intelligent do okay and end up with a couple of children each.

On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.

Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.

Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and has three children by three different partners.

Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”

People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because we all accumulate random copying errors when our DNA replicates.

I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.

Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.

(Remember, all mental traits are heritable–so genetic load influences everything, not just controversial ones like IQ.)
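
Here is a toy simulation of my own (the parameters are assumptions, and inheritance is simplified to asexual copying) of the basic argument: when survival no longer depends on how many mildly deleterious mutations you carry, the average load per individual just keeps climbing.

```python
# A toy simulation (my own sketch, assumed parameters): average number of mildly
# deleterious mutations per individual after 100 generations, with and without
# selection on survival. Inheritance is simplified to copying one parent.
import numpy as np

rng = np.random.default_rng(0)
U = 1.0      # assumed new deleterious mutations per birth
S = 0.02     # assumed fitness cost per mutation (multiplicative)
N = 1000     # population size
GENS = 100

def mean_load(selection: bool) -> float:
    load = np.zeros(N)                               # mutations carried by each individual
    for _ in range(GENS):
        if selection:
            alive = rng.random(N) < (1 - S) ** load  # more mutations, lower odds of surviving
            parents = load[alive] if alive.any() else load
        else:
            parents = load                           # everyone survives to reproduce
        # each offspring copies a random parent's load and adds new mutations
        load = rng.choice(parents, size=N) + rng.poisson(U, size=N)
    return load.mean()

print(f"with selection:    ~{mean_load(True):.0f} mutations per individual")
print(f"without selection: ~{mean_load(False):.0f} mutations per individual")
```

With selection the load accumulates far more slowly; without it, the count simply grows by about U per generation–the Mouse Utopia argument in miniature.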

If all of the above is correct, then I see only 4 ways out:

  1. Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
  2. Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
  3. Abortion of high load fetuses
  4. Genetic engineering

#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #3, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.


Cannibalism, Abortion, and R/K Selection.

Reindeer herder, from “Quarter of a Million Reindeers to be Butched… after Anthrax Outbreak”: “Serbian officials have demanded a huge cull of a 250,000 reindeers by Christmas over the risk of an anthrax outbreak. Currently 730,000 animals are being kept in the Yamal Peninsula and the rest of the Yamalo-Nenets region.”

In Hunters, Pastoralists, and Ranchers: Reindeer Economies and their Transformations [PDF], Ingold describes the social distribution of food among hunter-gatherers. In normal times, when food is neither super-abundant nor scarce, each family basically consumes what it brings in, without feeling any particular compulsion to share with their neighbors. In times of super-abundance, food is distributed throughout the tribe, often quite freely:

Since harvested animals, unlike a plant crop, will not reproduce, the multiplicative accumulation of material wealth is not possible within the framework of hunting relations of production. Indeed, what is most characteristic of hunting societies everywhere is the emphasis not on accumulation but on its obverse: the sharing of the kill, to varying degrees, amongst all those associated with the hunter. …

The fortunate hunter, when he returns to camp with his kill, is expected to play host to the rest of the community, in bouts of extravagant consumption.

The other two ethnographies I have read of hunter-gatherers (The Harmless People, about the Bushmen of the Kalahari, and Kabloona, about the Eskimo aka Inuit) both support this: large kills are communal feasts. Hunter gatherers often have quite strict rules about how exactly a kill is to be divided, but the most important thing is that everyone gets some.

And this is eminently sensible–you try eating an entire giraffe by yourself, in the desert, before it rots.

Even in the arctic, where men can (in part of the year) freeze food for the future, your neighbor’s belly is as good as a freezer, because the neighbor you feed today will feed you tomorrow. Hunting is an activity that can be wildly successful one day and fail completely the next, so if hunters did not share with each other, soon each one would starve.

Whilst the successful hunter is required to distribute his spoils freely amongst his camp fellows, he does so with the assurance that in any future eventuality, when through bad luck he fails to find game, or through illness or old age he can no longer provide for himself and his family, he will receive in his turn. Were each hunter to produce only for his own domestic needs, everyone would eventually perish from hunger (Jochelson 1926:124). Thus, through its contribution to the survival and reproduction of potential producers, sharing ensures the perpetuation of society as a whole. …

Yet he is also concerned to set aside stocks of food to see his household through at least a part of the coming winter. The meat that remains after the obligatory festive redistribution is therefore placed in the household’s cache, on which the housewife can draw specifically for the provision of her own domestic group (Spencer 1959:149). After the herds have passed by, domestic autonomy is re-established and [each household] draws on its own reserves of stored food.

But what happens at the opposite extreme, not under conditions of abundance, but when everyone’s stocks run out? Ingold claims that in times of famine, the obligation to share what little food one has with one’s neighbors is also invoked:

We find, therefore, that the incidence of generalized reciprocity tends to peak towards the two extremes of scarcity and abundance… The communal feast that follows a successful hunting drive involves the same heightening of band solidarity, and calls into play the same functions of leadership in the apportionment of food, as does the consumption of famine rations.

I am reminded here of a scene in The Harmless People in which there was not enough food to go around, but the rules of distribution were still followed, each person just cutting their piece smaller. Thomas described one of the small children, hungry, trying to grab the food bowl–not the food itself–to stop their mother from giving away their food to the next person in the chain of obligation.

Here Ingold pauses to discuss a claim by Sahlins that such social order will (or should) break down under conditions of extreme hunger:

Probably every primitive organization has its breaking-point, or at least its turning-point. Every one might see the time when co-operation is overwhelmed by the scale of disaster and chicanery becomes the order of the day. The range of assistance contracts progressively to the family level; perhaps even these bonds dissolve and, washed away, reveal an inhuman, yet most human, self-interest. Moreover, by the same measure that the circle of charity is compressed, that of ‘negative reciprocity’ is potentially expanded. People who helped each other in normal times and through the first stages of disaster display now an indifference to each others’ plight, if they do not exacerbate a mutual downfall by guile, haggle and theft.

Ingold responds:

I can find no evidence, either in my reading of circumpolar ethnography, or in the material cited by Sahlins, for the existence of such a ‘turning-point’ in hunting societies. On the contrary, as the crisis deepens, generalized reciprocity proceeds to the point of dissolution of domestic group boundaries. ‘Negative reciprocity’, rather than closing in from beyond the frontiers of the household, will be expelled altogether from the wider social field, only to make its appearance within the heart of the domestic group itself.

Thus the women of the household, who are allowed to eat only after the appetites of their menfolk have been satisfied, may be left in times of want with the merest scraps of food. Among the Chipewyan, ‘when real distress approaches, many of them are permitted to starve, when the males are amply provided for’…

In situations of economic collapse, negative reciprocity afflicts not only the domestic relations between husband and wife, but those between mother and child, and between parent and grandparent. If the suckling of children is the purest expression of generalized reciprocity, in the form of a sustained one-way flow, then infanticide must surely represent the negative extreme. Likewise, old or sick members of the household will be the first to be abandoned when provisions run short. Even in normal times, individuals who are past labour have to scavenge the left-overs of food and skins (Hearne 1911:326). In the most dire circumstances of all, men will consume their starving wives and children before turning upon one another.

Drawing on Eskimo material, Hoebel derives the following precepts of cannibal conduct: Not unusually . . . parents kill their own children to be eaten. This act is no different from infanticide. A man may kill and eat his wife; it is his privilege. Killing and eating a relative will produce no legal consequences. It is to be presumed, however, that killing a non-relative for food is murder. (1941:672, cited in Eidlitz 1969:132)

In short, the ‘circle of charity’ is not compressed but inverted: as the threat of starvation becomes a reality, the legitimacy of killing increases towards the centre. The act is ‘inhuman’ since it strips the humanity of the victim to its organic, corporeal substance. If altruism is an index of sociability, then its absolute negation annuls the sodality of the recipient: persons, be they human or animal, become things.

This is gruesome, but let us assume it is true (I have not read the accounts Ingold cites, so I must trust him, and while I do not always trust him, for now we will.)

The cold, hard logic of infanticide is that a mother can produce more children if she loses one, but a child who has lost its mother will likely die as well, along with all of its siblings. One of my great-great grandmothers suffered the loss of half her children in infancy and still managed to raise 5+ to adulthood. Look around: even with abortion and birth control widely available, humanity is not suffering a lack of children. ETA: As BaruchK correctly noted, today’s children are largely coming from people who don’t use birth control or have legal access to abortion; fertility rates are below replacement throughout the West, with the one exception AFAIK of Israel.

Furthermore, children starve faster and are easier to kill than parents; women are easier to kill than men; people who live with you are easier to kill than people who don’t.

Before we condemn these people, let us remember that famine is a truly awful, torturous way to die, and that people who are on the brink of starving to death are not right in their minds. As “They’re not human”: How 19th-century Inuit coped with a real-life invasion of the Walking Dead recounts:

“Finally, as the footsteps stopped just outside the igloo, it was the old man who went out to investigate.

“He emerged to see a disoriented figure seemingly unaware of his presence. The being was touching the outside of the igloo with curiosity, and raised no protest when the old man reached his hand out to touch its cheek.

“His skin was cold. …

The figures, of course, were the last survivors of the Franklin Expedition. They had buried their captain. They had seen their ship entombed by ice. They had eaten the dead to survive. …

Inuit nomads had come across streams of men that “didn’t seem to be right.” Maddened by scurvy, botulism or desperation, they were raving in a language the Inuit couldn’t understand. In one case, hunters came across two Franklin Expedition survivors who had been sleeping for days in the hollowed-out corpses of seals. …

The figures were too weak to be dangerous, so Inuit women tried to comfort the strangers by inviting them into their igloo. …

The men spit out pieces of cooked seal offered to them. They rejected offers of soup. They grabbed jealous hold of their belongings when the Inuit offered to trade.

When the Inuit men returned to the camp from their hunt, they constructed an igloo for the strangers, built them a fire and even outfitted the shelter with three whole seals. …

When a small party went back to the camp to retrieve [some items], they found an igloo filled with corpses.

The seals were untouched. Instead, the men had eaten each other. …

In 1854, Rae had just come back from a return trip to the Arctic, where he had been horrified to discover that many of his original Inuit sources had fallen to the same fates they had witnessed in the Franklin Expedition.

An outbreak of influenza had swept the area, likely sparked by the wave of Franklin searchers combing the Arctic. As social mores broke down, food ran short.

Inuit men that Rae had known personally had chosen suicide over watching the slow death of their children. Families had starved for days before eating their dog teams. Some women, who had seen their families die around them, had needed to turn to the “last resource” to survive the winter.

Infanticide, cannibalism, and human sacrifice were far more common prior to 1980 or so than we like to think; God forbid we should ever know such fates.

According to Wikipedia:

“Many Neolithic groups routinely resorted to infanticide … Joseph Birdsell believed that infanticide rates in prehistoric times were between 15% and 50% of the total number of births,[10] while Laila Williamson estimated a lower rate ranging from 15% to 20%.[6]:66 Comparative anthropologists have calculated that 50% of female newborn babies were killed by their parents during the Paleolithic era.[12] Decapitated skeletons of hominid children have been found with evidence of cannibalism.[13]

“Three thousand bones of young children, with evidence of sacrificial rituals, have been found in Sardinia. Pelasgians offered a sacrifice of every tenth child during difficult times. Syrians sacrificed children to Jupiter and Juno. Many remains of children have been found in Gezer excavations with signs of sacrifice. Child skeletons with the marks of sacrifice have been found also in Egypt dating 950-720 BCE. In Carthage “[child] sacrifice in the ancient world reached its infamous zenith.”[11]:324  …

“According to Shelby Brown, Carthaginians, descendants of the Phoenicians, sacrificed infants to their gods.[25] Charred bones of hundreds of infants have been found in Carthaginian archaeological sites. One such area harbored as many as 20,000 burial urns.[25]

Plutarch (c. 46–120 AD) mentions the practice, as do Tertullian, Orosius, Diodorus Siculus and Philo. The Hebrew Bible also mentions what appears to be child sacrifice practiced at a place called the Tophet (from the Hebrew taph or toph, to burn) by the Canaanites. Writing in the 3rd century BCE, Kleitarchos, one of the historians of Alexander the Great, described that the infants rolled into the flaming pit. Diodorus Siculus wrote that babies were roasted to death inside the burning pit of the god Baal Hamon, a bronze statue.

“… the exposure of newborns was widely practiced in ancient Greece, it was even advocated by Aristotle in the case of congenital deformity — “As to the exposure of children, let there be a law that no deformed child shall live.”[30]

“The practice was prevalent in ancient Rome, as well. … A letter from a Roman citizen to his sister, or a pregnant wife from her husband,[35] dating from 1 BC, demonstrates the casual nature with which infanticide was often viewed:

“I am still in Alexandria. … I beg and plead with you to take care of our little child, and as soon as we receive wages, I will send them to you. In the meantime, if (good fortune to you!) you give birth, if it is a boy, let it live; if it is a girl, expose it.” [36][37]

“In some periods of Roman history it was traditional for a newborn to be brought to the pater familias, the family patriarch, who would then decide whether the child was to be kept and raised, or left to die by exposure.[39] The Twelve Tables of Roman law obliged him to put to death a child that was visibly deformed. …

“According to William L. Langer, exposure in the Middle Ages “was practiced on gigantic scale with absolute impunity, noticed by writers with most frigid indifference”.[47]:355–356 At the end of the 12th century, notes Richard Trexler, Roman women threw their newborns into the Tiber river in daylight.[48]” …

“Philosopher Han Fei Tzu, a member of the ruling aristocracy of the 3rd century BC, who developed a school of law, wrote: “As to children, a father and mother when they produce a boy congratulate one another, but when they produce a girl they put it to death.”[63]

“Buddhist belief in transmigration allowed poor residents of the country to kill their newborn children if they felt unable to care for them, hoping that they would be reborn in better circumstances. Furthermore, some Chinese did not consider newborn children fully “human”, and saw “life” beginning at some point after the sixth month after birth.[65]

“Contemporary writers from the Song dynasty note that, in Hubei and Fujian provinces, residents would only keep three sons and two daughters (among poor farmers, two sons and one daughter), and kill all babies beyond that number at birth.[66]”

Sex Ratio at birth in the People’s Republic of China

“It was not uncommon that parents threw a child to the sharks in the Ganges River as a sacrificial offering. The British colonists were unable to outlaw the custom until the beginnings of the 19th century.[82]:78

“According to social activists, female infanticide has remained a problem in India into the 21st century, with both NGOs and the government conducting awareness campaigns to combat it.[83] …

“In the Eastern Shoshone there was a scarcity of Indian women as a result of female infanticide.[100] For the Maidu Native Americans twins were so dangerous that they not only killed them, but the mother as well.[101] In the region known today as southern Texas, the Mariame Indians practiced infanticide of females on a large scale. Wives had to be obtained from neighboring groups.[102]

Meanwhile in the Americas:

In 2005 a mass grave of one- to two-year-old sacrificed children was found in the Maya region of Comalcalco. The sacrifices were apparently performed for consecration purposes when building temples at the Comalcalco acropolis.[2] …

Archaeologists have found the remains of 42 children sacrificed to Tlaloc (and a few to Ehecátl Quetzalcóatl) in the offerings of the Great Pyramid of Tenochtitlan. In every case, the 42 children, mostly males aged around six, were suffering from serious cavities, abscesses or bone infections that would have been painful enough to make them cry continually. Tlaloc required the tears of the young so their tears would wet the earth. As a result, if children did not cry, the priests would sometimes tear off the children’s nails before the ritual sacrifice.[7]

And don’t get me started on cannibalism.

James Cook witnessing human sacrifice in Tahiti

It is perhaps more profitable to ask which cultures didn’t practice some form of infanticide/infant sacrifice/cannibalism than which ones did. The major cases Wikipedia notes are Ancient Egypt, Judaism, Christianity, and Islam (we may note that Judaism in many ways derived from ancient Egypt, and Christianity and Islam from Judaism.) Ancient Egypt stands out as unique among the major pre-modern, pre-monotheistic societies in showing no signs of regular infanticide–and even in the most infamous case, where the Egyptian pharaoh went so far as to order the shocking act, we find direct disobedience in his own household:

3 And when she [Jochebed] could not longer hide him [the baby], she took for him an ark of bulrushes, and daubed it with slime and with pitch, and put the child therein; and she laid it in the flags by the river’s brink. 4 And his sister stood afar off, to wit what would be done to him.

5 And the daughter of Pharaoh came down to wash herself at the river; and her maidens walked along by the river’s side; and when she saw the ark among the flags, she sent her maid to fetch it.

6 And when she had opened it, she saw the child: and, behold, the babe wept. And she had compassion on him, and said, “This is one of the Hebrews’ children.”

7 Then said his sister to Pharaoh’s daughter, “Shall I go and call to thee a nurse of the Hebrew women, that she may nurse the child for thee?”

8 And Pharaoh’s daughter said to her, “Go.” And the maid went and called the child’s mother.

9 And Pharaoh’s daughter said unto her, “Take this child away, and nurse it for me, and I will give thee thy wages.” And the women took the child, and nursed it.

10 And the child grew, and she brought him unto Pharaoh’s daughter, and he became her son. And she called his name Moses: and she said, “Because I drew him out of the water.”

–Exodus 2:3-10

I don’t know the actual infanticide numbers in modern Muslim countries (le wik notes that poverty in places like Pakistan still drives infanticide) but it is officially forbidden by Islam.

According to Abortions in America:
• Black women are five times more likely to abort than white women.
• 69% of pregnancies among Blacks are unintended, while that number is 54% among Hispanics and 40% of pregnancies among Whites.
• Planned Parenthood, … has located 80% of its abortion clinics in minority neighborhoods

Today, between the spread of Abrahamic religions, Western Values, and general prosperity, the infanticide rate has been cut and human sacrifice and cannibalism have been all but eliminated. Abortion, though, is legal–if highly controversial–throughout the West and Israel.

According to the CDC, the abortion rate for 2013 was 200 abortions per 1,000 live births–that is, roughly one abortion for every six pregnancies ending in either a live birth or an abortion. (The CDC also notes that the abortion rate has been falling since at least 2004.) Of these, “91.6% of abortions were performed at ≤13 weeks’ gestation; … In 2013, 22.2% of all abortions were early medical abortions.”

To what can we attribute this anti-infanticide sentiment of modern monotheistic societies? Is it just a cultural accident, a result of inheritance from ancient Egypt, or perhaps the lucky effects of some random early theologian? Or as the religious would suggest, due to God’s divine decree? Or is it an effect of the efforts parents must expend on their few children in societies where children must attend years of school in order to succeed?

According to Wikipedia:

In ecology, r/K selection theory relates to the selection of combinations of traits in an organism that trade off between quantity and quality of offspring. The focus upon either increased quantity of offspring at the expense of individual parental investment of r-strategists, or reduced quantity of offspring with a corresponding increased parental investment of K-strategists, varies widely, seemingly to promote success in particular environments. …

In r/K selection theory, selective pressures are hypothesised to drive evolution in one of two generalized directions: r– or K-selection.[1] These terms, r and K, are drawn from standard ecological algebra as illustrated in the simplified Verhulst model of population dynamics:[7]

dN/dt = rN(1 − N/K)

where r is the maximum growth rate of the population (N), K is the carrying capacity of its local environmental setting, and the notation dN/dt stands for the derivative of N with respect to t (time). Thus, the equation relates the rate of change of the population N to the current population size and expresses the effect of the two parameters. …

As the name implies, r-selected species are those that place an emphasis on a high growth rate, and, typically exploit less-crowded ecological niches, and produce many offspring, each of which has a relatively low probability of surviving to adulthood (i.e., high r, low K).[8] A typical r species is the dandelion Taraxacum genus.

In unstable or unpredictable environments, r-selection predominates due to the ability to reproduce quickly. There is little advantage in adaptations that permit successful competition with other organisms, because the environment is likely to change again. Among the traits that are thought to characterize r-selection are high fecundity, small body size, early maturity onset, short generation time, and the ability to disperse offspring widely. …

By contrast, K-selected species display traits associated with living at densities close to carrying capacity, and typically are strong competitors in such crowded niches that invest more heavily in fewer offspring, each of which has a relatively high probability of surviving to adulthood (i.e., low r, high K). In scientific literature, r-selected species are occasionally referred to as “opportunistic” whereas K-selected species are described as “equilibrium”.[8]

In stable or predictable environments, K-selection predominates as the ability to compete successfully for limited resources is crucial and populations of K-selected organisms typically are very constant in number and close to the maximum that the environment can bear (unlike r-selected populations, where population sizes can change much more rapidly).

Traits that are thought to be characteristic of K-selection include large body size, long life expectancy, and the production of fewer offspring, which often require extensive parental care until they mature.
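
Since the r and K in the quoted equation can feel a bit abstract, here is a minimal numerical sketch of my own (made-up parameters) showing what they do: r sets how fast the population shoots up, while K sets the ceiling both curves eventually hit.

```python
# A minimal sketch (my own, illustrative parameters): Euler integration of the
# Verhulst equation dN/dt = r*N*(1 - N/K) for a fast-growing and a slow-growing
# population sharing the same carrying capacity.
def logistic_trajectory(n0, r, K, dt=0.1, steps=1000):
    n, out = float(n0), [float(n0)]
    for _ in range(steps):
        n += r * n * (1 - n / K) * dt   # one Euler step of the Verhulst model
        out.append(n)
    return out

fast = logistic_trajectory(n0=10, r=0.5, K=1000)   # high r: rapid early growth
slow = logistic_trajectory(n0=10, r=0.05, K=1000)  # low r: slow growth, same ceiling
print(f"after 100 time units: high-r ≈ {fast[-1]:.0f}, low-r ≈ {slow[-1]:.0f}")
```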

Of course you are probably already aware of Rushton’s R/K theory of human cultures:

Rushton’s book Race, Evolution, and Behavior (1995) uses r/K selection theory to explain how East Asians consistently average high, blacks low, and whites in the middle on an evolutionary scale of characteristics indicative of nurturing behavior. He first published this theory in 1984. Rushton argues that East Asians and their descendants average a larger brain size, greater intelligence, more sexual restraint, slower rates of maturation, and greater law abidingness and social organization than do Europeans and their descendants, who average higher scores on these dimensions than Africans and their descendants. He theorizes that r/K selection theory explains these differences.

I’d be remiss if I didn’t also mention that the article states, “Rushton’s application of r/K selection theory to explain differences among racial groups has been widely criticised. One of his many critics is the evolutionary biologist Joseph L. Graves, who has done extensive testing of the r/K selection theory with species of Drosophila flies. …”

Whether the cause is genetics or culture, in dense human societies people must devote a great deal of energy to the small number of children they can successfully raise, leading to the notion that parents are morally required to put this effort into their children. But this system is at odds with the fact that, without some form of intervention, the average married couple will produce far more than two offspring.

Ultimately, I don’t have answers, only theories.

Source: CDC data, I believe

Open Thread: Scotus

(Who’s a good shark?)

I don’t really do domestic politics, but what do you guys think of Trump’s SCOTUS pick?

I’ve been working on some material about instincts and thought this was an interesting graph, relevant to my theory that pregnancy/childbirth have a physical (chemical/hormonal) effect on women, causing their mothering instincts to kick in (mothering isn’t nearly as useful before you have kids as after,) which in turn causes a change in attitudes toward abortion.

Obviously the graph proves little, because people who don’t approve of abortion are predisposed to have more children than people who do. What I really want is a time-based graph, measuring attitudes before and after pregnancy. But I’m having trouble finding that.

Pension obligations are doomed (CA specific?)

And here’s a graph of pension obligations. Pensions: they’ve got serious issues.

On to comments of the week. First I’m reposting Unknown128’s question because I can’t answer it, but maybe one of you folks can:

I wanted to ask what the German Nazis view on IQ and IQ testing was. From what I know they didnt realy percieve intelligence as a very valuable trait in the first place, prefering physical strenght, endurance and “nordic racial traits”. They bred warriors not thinkers. Also one does hear that they banned IQ testing or at least strongly disliked it.

Do you know anything specific?

This has been a relatively low comment week, but awards go to Leonard:

Antifa is American as apple pie. The use of violence to achieve political objectives is nothing new to communists, and America is a communist country. The original communist country. Don’t forget the civil war. Antifa is just updated antisla.

The reason why you’re seeing more antifa stuff now is that the communists have lost control of the Potemkin government. It’s not that they didn’t exist; it’s that rioting did not serve much political purpose. Pressuring Obama from the left was easy enough without violence. So using violence was fairly senseless.

and With the thoughts you’d be thinkin:

My understanding is Antifas evolved or are a subset of anarchist and socialist groups mainly focussed on combating fringe nationalist parties like the National Front (UK) or the Front National (FRA), whatever their equivalents were in other Western Europe countries etc, in the 70s-90s. Honestly they’re just black bloc dickheads who want to pretend they’re fighting Hitler.

The general dressing in black and breaking stuff style of leftwing activisim is known as black blocs. Basically just dress up in black cover your face and break things. I think it was popularised as a tactic originally against groups like the IMF and WTO, in the US at least it was probably due to “The battle for Seattle”, that was opposition to the WTO. You get routine riots and mischief by groups who perform the same behaviour for various left wing causes all around the world. I think Oakland has a lot of those guys behaving like dicks semi-consistently. …

That’s all for now. What are you thinking about?

In Defense of Planned Parenthood

Abortion and birth control are important tools in the ultimate human thriving toolkit.

Unless you want to eliminate all the robots and go back to agricultural labor (which is not going to get you an interstellar society,) you will have to deal, somehow, with all of the humans who don’t have the chops to survive in society. Letting people starve in the streets is inhumane and inspires people to fund large social welfare states, which may have negative long-term effects.

Historically, death rates were very high, especially infant mortality. My great-great grandparents lost over half of their 16 children before the age of five; such was normal.

The effects of declining infant mortality are happy parents, of course, but also long-term degradation of the gene pool, overpopulation, and eventual systemic collapse as we burn through the Earth’s resources. We’re already seeing this, both in increasing reaction times (it looks like Whites are getting dumber, and Ashkenazi IQ is probably plummeting, relatively speaking,) and the flooding of high-breeding peoples out of their exhausted biomes into fresh territory to consume (to the detriment of those trying to maintain a non-degraded biome.)

As I believe I have mentioned before, there is nothing like a parenting forum to convince you that parents are idiots. Unfortunately, a very large percentage of people become parents because they are too dumb not to.

I recently had a conversation with a friend who tried to reassure me that this was not a problem. “Don’t worry,” they said. “Dumb people have always had more kids than smart people.”

“No,” I said. “No no no. Dumb people did not historically have more kids than smart people.” History was brutal; 20-50% infant mortality was the norm, and people who did a better job taking care of and providing for their children had more children who made it to adulthood than those who didn’t.

No one in their right mind wants to simply eliminate all maternal and childhood medical care (and hygiene) so we can return to the age of high infant mortality. There are far better solutions than giving everyone Smallpox and seeing who makes it. But you also do not want a situation where the primary barrier to reproduction is actually intelligence.

The obvious solution is free IUDs for everyone. Globally. The long-term planners will get theirs removed when they’re ready to have children, and the short term planners will be able to go about their business without making “oopsies.” People who want 18 children will still be able to have 18 children, but people who don’t have the resources to support children don’t have to have any.

Abortion also plays an important role in the maintenance of modern society. Ideally, free abundant birth control would eliminate most of the need for abortion, but there will always be mistakes, medical complications, and non-viable fetuses of various sorts. Eliminate these earlier, not later.

These are not the children of intelligent, healthy, well-adjusted people who have some weird phobia of childbirth. These are fetuses with health problems and fetuses whose parents don’t have the resources, mentally or physically, to take care of them. The apple does not fall far from the tree, and genetically, those children will inherit their parents’ traits. If you are not volunteering to raise those fetuses (and their fetuses) yourself, then I think you should give some serious thought to who you think will.

After all is said and done, I don’t care what Planned Parenthood does with aborted fetuses, so long as they’re disposed of hygienically. They’re already dead, for goodness sakes.

Of Course your Enemies are Organized

Organization is a spontaneous feature of all human societies. Heck, bees are organized.

There is a myth, routinely proposed by those who know no better, that other people are acting completely independently. Independent action, flowing spontaneously from one’s sense of morality or injustice, prompting sudden, wide-spread social change.

Then people discover that, contrary to this ignorant assumption, other humans actually put in a bunch of effort to organize, choosing carefully when and where they should act or at least helping each other out, and they are shocked, just shocked.

Somehow, spontaneous action is regarded as purely motivated, whereas organized action–the natural result of all human socializing–is impure, some kind of dastardly conspiracy.

Don’t be dumb. Of course your enemies are organized–whoever your enemies happen to be.

The Rosa Parks story comes immediately to mind, as analyzed by Herbert Kohl in Should we Burn Babar? Kohl, I should be clear, does not regard Mrs. Parks as an enemy–he wants her story to be told accurately. Kohl examines school textbook accounts of Mrs. Parks’s story, finding, IIRC, that most were inaccurate.

One of the most common inaccuracies Kohl found in the textbooks was the description of Mrs. Parks as a totally normal person, (just like you and me!) not involved in any political movements, who was just really tired one day and didn’t want to move.

This is wrong, of course. Rosa Parks was an active member of a civil rights organization. Her refusal to move was not the spontaneous result of being tired one day, but a planned protest against segregation. Mrs. Parks was not the first black person who refused to give up her seat on a bus, but civil rights leaders chose to publicize her case and not earlier cases because they thought the American public would find Mrs. Parks a more sympathetic character–you see, the other lady who did the exact same courageous act as Mrs. Parks but hasn’t received any credit for it was divorced.

If you find this surprising, please ask yourself Why? Do you really think the March on Washington or Montgomery Bus Boycott happened without organization? At the very least, people had to get to work.

Kohl offers unsatisfying explanations for the inaccuracies in textbook accounts of Mrs. Parks’s story, mostly because he doesn’t quite understand the writers’ motivations. Here’s what I think happened:

In Mrs. Parks’s time, liberals tended to be on the side of the Civil Rights movement, and conservatives tended to be against it. Both sides knew darn well that the Civil Rights movement was organized.

Today, even conservatives are generally in favor of Civil Rights–conservatives I know wax rhapsodic, frequently, about how much they love Dr. King. I shit you not, I know white, southern conservatives who actually see themselves in solidarity with Blacks and Hispanics against evil white liberals.

But these rather conventional, mainstream conservatives do not believe in radical, organized protest against the state. That is a liberal thing. Mainstream conservatives like the status quo and generally seek to protect it (and supporting Civil Rights is now status quo, so they do,) but organized protest is anti-status quo, so they don’t like it.

To make Mrs. Parks an appropriate conservative hero, capable of appearing in children’s history texts without inspiring parental protest that the texts are teaching politics instead of history, Mrs. Parks’s associations with organized liberalism have been scrubbed. A conservative hero like Mrs. Parks couldn’t possibly do something so controversially liberal as join a social organization in favor of her own self-interest.

 

The recent riots in Baltimore and elsewhere are another example. People report, most conspiratorially, that these riots were organized. Of course they were organized! We call them campus organizations for a reason. Heck, I know people who can’t get together with their friends to play video games without setting up a treasury and by-laws, but people are surprised to learn that folks who’ve been in Social Justice organizations for years might donate money to help protesters with legal fees or organize transportation together.

It’s one thing for people to just spontaneously decide to protest. It’s a totally different matter to carpool.

 

Conservatives organize too, btw. Evangelical churches, talk radio, and various conservative think tanks have been organizing conservatives for decades. Conservatives are quite good at organizing, as they tend, even more than liberals, to like being part of an organized social hierarchy. These organizations have had a pretty big effect on the American political scene, because they are very effective at getting their members to vote in lock-step for whatever policies and politicians they support. Thus, four decades after Roe v. Wade, conservatives still pose a real threat to legal abortion. (Evolutionist X believes abortion should be free and easily available.)

 

Whoever your enemies are, of course they are organized. Organization is a basic feature of all human societies. Stop acting surprised.

It’s all or Nothin’

I posit that it is difficult for humans to adequately respond to things that they regard as merely somewhat problematic. Getting just about anything done requires a ton more work than sitting around doing nothing, so people who are motivated to change things are generally people who are convinced that things are really, really bad.

If you don’t think things are really, really bad, you’ll probably end up self-justifying that things are really good, so you don’t need to spend a bunch of time trying to change them, so you can comfortably hang out and relax.

If you do want to change things, you’ll probably have to spend a lot of time convincing yourself that things are truly dire in order to keep up the emotional energy necessary to get the work done.

Either way, you’re probably lying to yourself (or others), but I’m not sure if humans are really capable of saying, “this system is mostly good and mostly beneficial to the people in it, but it has really bad effects on a few people.”

Your opinions about a system are probably going to be particularly skewed one way or another if you have no direct or second-hand experience with that system, because you’re most likely hearing reports from the people who care enough to put in the effort to talk about it.

Likewise, the people who care the most about political issues tend to have more extreme views; moderates tend not to be terribly vocal.

This makes an impassioned defense of moderation kind of anomalous.

A good example of this effect is religion. If you’ve ever listened to American atheists talk about religion, you’ve probably gotten the impression that, as far as they’re concerned, religion is super duper evil.

By contrast, if you’ve ever talked to a religious person, you know they tend to think religion is totally awesome.

About 80% of Americans claim to be religious (though in typical me-fashion, I suspect some of them are lying because how could so many people possibly be religious?) We’ll call that 75%, because some people are just going along with the crowd. Since religion is voluntary and most religious people seem to like their religions, we’ll conclude that religion is more or less a positive in 75% of people’s lives.

Only about 40% of people actually attend religious services weekly–we’ll call these our devoted, hard-core believers. These people tend to really love their religion, though even non-attenders can get some sort of comfort out of their beliefs.

It’s difficult to determine exactly what % of Americans believe in particular forms of Christianity, but about 30% profess to be some form of “Evangelical”; Fundamentalists are a much smaller but often overlapping %, probably somewhere between 10 and 25%.

So let’s just stick with “about 75% like their religion, and about 40% have some beliefs that may be really problematic for other people” (after all, it’s not Unitarians and Neo-Pagans people are complaining about.)

For what % of people is religion really problematic? LGBT folks have it hard due to some popular religious beliefs–we can estimate them at 5%, according to Wikipedia.

People who need or want abortions are another big category. Estimates vary, but let’s go with 1/3 of women being interested in abortion at some point in their lives, with (I think) 12% citing health reasons. One-third is a pretty big %, but since abortion is currently basically legal, religion is currently more of a potential problem than a real problem for most of these women.

A third category is non-Christians who face discrimination in various aspects of life, and kids/teens who have to put up with super-controlling parents. I have no idea what the stats are on them, but the logic of encounters suggests that the 30% or so of non-Christians are going to have trouble with the 40% or so of problematic-belief Christians, mediated by the fact that non-Christians are concentrated in certain parts of the country. So let’s go with 15% of people having significant issues at some point, though these are unlikely to be life-long issues (and some % of these people overlap with the previous two groups.)

So, let’s say 70% like religion; 40% have problematic beliefs; 20% suffer some sort of discrimination in their lives, and about 5% suffer significantly.

In short, most of the time, religion is actually a really positive thing for the vast majority of people, and a really bad thing for a small % of people.
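To put the back-of-envelope numbers in one place, here is a minimal sketch that just restates the rough guesses above; nothing in it is survey data, and the groups overlap, so treat it as illustration only.

```python
# Rough restatement of the guesses above; the percentages are the post's own
# estimates, not survey data, and the groups overlap.

estimates = {
    "like their religion": 0.70,
    "hold beliefs that are problematic for others": 0.40,
    "suffer some religious discrimination": 0.20,
    "suffer significantly": 0.05,
}

for group, share in estimates.items():
    print(f"~{share:.0%} of Americans {group}")

# Ratio of people for whom religion is roughly a plus to people it seriously hurts:
print(f"beneficiaries per seriously-harmed person: ~{0.70 / 0.05:.0f}")
```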

But most people who have an interest in religion don’t say, “Religion is basically good but occasionally bad.” Most people say, either, “Religion is totally awesome,” or “Religion totally sucks.” And that has a lot to do with whether you and your friends are primarily people for whom it is good or bad. The moderate position gets lost.

Sociopaths within, sociopaths without

A few posts back, I made a comment to the effect that liberals tend to be “good people” (or at least well-intentioned people) who are concerned about sociopaths.

I feel like this comment deserves some explanation, because it comes across as harsher than intended toward conservatives.

Conservatives would kill themselves to save the people they love. My mother would literally give me her good kidney if I needed it; she has stated on many occasions that she would die for her grandchildren.

Conservatives are disproportionately employed in the riskiest fields, the ones that require risking your life to save or protect others–firefighting, police work, the military. They also take on shitty, dangerous jobs simply to feed their families, like crab fishing and coal mining.

The flipside to that extreme level of altruism is that you simply cannot extend it to everyone. You cannot die for just anyone.

Suicidal altruism can only exist if it makes the individual’s genes more likely to persist into the future.

If I die to save my children’s lives, then my genes will continue to exist, because they (each) carry half of my genes, and in their genes they carry some altruistic sentiment. Not sacrificing myself to save my children means that my genetic line ends with me, and with it dies my lack of altruism.

But if I die to save the life of a stranger, orphaning my own children, someone else’s genetic line is more likely to continue, while mine is more likely to end as my orphans starve. If a stranger cannot reciprocate my altruism, then being altruistic to them lessens the chance of altruistic genes in the future population.
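This is basically Hamilton’s rule: a self-sacrificing trait can spread when relatedness times benefit exceeds cost. Here is a minimal sketch of that arithmetic; the numbers are made up for illustration, not measured from anything.

```python
# Hamilton's rule, roughly: sacrifice spreads when r * B > C, where
# r = relatedness to the beneficiaries, B = reproductive benefit to them,
# C = reproductive cost to you. All numbers below are made up.

def altruism_pays(relatedness: float, benefit: float, cost: float) -> bool:
    """True if the gene's-eye benefit of the sacrifice outweighs its cost."""
    return relatedness * benefit > cost

# Dying (cost = your whole future reproduction, call it 1) to save three
# children, each carrying half your genes and each gaining a full lifetime:
print(altruism_pays(relatedness=0.5, benefit=3.0, cost=1.0))   # True

# Dying to save one unrelated stranger (relatedness ~ 0):
print(altruism_pays(relatedness=0.0, benefit=1.0, cost=1.0))   # False
```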

The amount of charity (altruism, help,) people are willing to extend to each other therefore has a lot to do with how much they can afford to risk the other person not reciprocating. If you can guarantee that the other person will reciprocate (“cooperate,” in Prisoner’s Dilemma terms) your kindness, then you are likely to be kind to them. If the other person can defect without consequences, then you would be a fool to help them.
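The same arithmetic applies to reciprocity: helping a non-relative is a bet that the favor will eventually be returned, and the bigger the cost relative to the possible payoff, the surer you need to be of the other person before the bet is worth making. Another minimal sketch, again with made-up numbers:

```python
# Reciprocal altruism, roughly: helping costs c now and returns b later only
# if the other person reciprocates. Helping breaks even when the chance of
# reciprocation equals c / b; below that, it is a losing bet. Numbers are made up.

def breakeven_reciprocation(benefit: float, cost: float) -> float:
    """Minimum probability of reciprocation at which helping pays for itself."""
    return cost / benefit

# A small favor with a big payoff if returned: worth extending even to near-strangers.
print(breakeven_reciprocation(benefit=10.0, cost=1.0))   # 0.1

# Risking your life for a modest return: only worth it for people you trust deeply.
print(breakeven_reciprocation(benefit=1.2, cost=1.0))    # ~0.83
```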

Liberals and conservatives show different patterns of altruism, suggesting that they perceive different patterns of cooperation/defection and are possibly genetically distinct from each other.

Conservatives display very high levels of altruism toward their kin, their friends, and the groups they identify with. They display comparatively low levels of altruism toward strangers, whom they will readily kill in order to save their loved ones.

Liberals display low levels of altruism to a much wider range of people. They are much less willing to risk their lives to save anyone (few liberal firefighters or marines,) but they are also less willing to kill random Iraqis on the off chance that one of them might be a potential terrorist.

The two groups perceive threat differently–conservatives see strangers as basically threatening, while liberals assume that strangers have no particular reason to cause them any harm. Conservatives would rather kill ’em all and let god sort ’em out, whereas liberals do not believe in god and would rather just make friends.

I recently posed a moral dilemma to several of my relatives: A man’s wife is dying of cancer. A doctor has invented a miracle drug that will cure the cancer, but he’s charging a million dollars a bottle and the man simply cannot afford it. Without the medicine, his wife will die.

Should he steal the medicine?

Now, my sample size is very small (N=6), but the pattern has been amusingly consistent. The conservatives answer automatically–of course they would steal the medication. (One person launched into a discussion of the importance of properly casing the joint, so that you don’t get caught and go to jail, but I’m counting that as “would steal.” Another person objected that I must have the question wrong, because there was absolutely no way anyone would ever answer “no”.) The liberals, by contrast, equivocated. The question made them uncomfortable. I got responses like, “He should work/appeal to charities to save up/raise enough money,” and general refusals to fully answer the question.

To the liberal, conservative behavior toward the strangers looks sociopathic. To conservatives, liberal behavior toward their loved ones looks sociopathic. (Liberals see themselves as merely trying to be nice to everyone, of course, whereas conservatives see no real point in being nice to people who might try to kill them.)

Now, I feel I should stop for a moment and note that there are plenty of strangers toward whom conservatives are not openly hostile. Conservatives do a lot of charity work. There are many parts of the world where religious groups are pretty much the only people trying to help people and make their lives less desperately poor. They also adopt more kids than liberals. But the flipside of that greater willingness to cooperate is coming down much harder on defectors.

Liberal and conservative philosophical approaches to the world and political positions make a lot of sense in this light. Conservatives emphasize the importance of personal sacrifice and duty, that is, reciprocating to those who have shown you kindness in the past. For example, a conservative would argue that you should make personal sacrifices to help a parent in need, even if that parent is kind of an ass, because they are your parent and they used to wipe shit off your butt. For the conservative, group memberships and strong relationships with others are of prime importance, and trying to change all that is not tolerated.

Liberals tend toward anomie. They believe that relationships between people should be voluntary and mutually beneficial (fun, a virus-value,) and that you don’t “owe” people for past kindnesses that you didn’t necessarily want or even ask for, or that may have been delivered under some form of duress that made you unable to say no (being a child who can’t wipe their own behind counts as a form of duress.) Liberals believe that it is acceptable to sever relationships that do not benefit the individual, and are more likely to see others as individuals, rather than as members of some group.

To the conservative, a mother has a duty to her unborn child (and the child, a future duty to their mother.) To the liberal, there is no such duty.

I hope it is obvious that both views, if taken to extremes, cause problems. Society functions best when people have some flexibility to determine their duties and obligations, rather than having everything dictated to them at birth; it also requires that people have some confidence that others will reciprocate altruism, otherwise everything falls apart.