Apparently Most People Live in A Strange Time Warp Where Neither Past nor Future Actually Exist

Forget the Pirahã. It appears that most Americans are only vaguely aware of these things called “past” and “future”:

Source: CNN poll conducted by SSRS

A majority of people now report that George W. Bush, who they once thought was a colossal failure of a president, whose approval ratings bottomed out at 33% when he left office, was actually good. By what measure? He broke the economy, destabilized the Middle East, spent trillions of dollars, and got thousands of Americans and Iraqis killed.

Apparently the logic here is “Sure, Bush might have murdered Iraqi children and tortured prisoners, but at least he didn’t call Haiti a shithole.” We Americans have standards, you know.

He’s just a huggable guy.

I’d be more forgiving if Bush’s good numbers all came from 18-year-olds who were 10 when he left office and so weren’t actually paying attention at the time. I’d also be more forgiving if Bush had some really stupid scandals, like Bill Clinton–I can understand why someone might have given Clinton a bad rating in the midst of the Monica Lewinsky scandal, but, looking back a decade later, might reflect that Monica didn’t matter that much and that, as far as presidents go, Clinton was fine.

But if you thought invading Iraq was a bad idea back in 2008 then you ought to STILL think it is a bad idea right now.

Note: If you thought it was a good idea at the time, then it’s sensible to think it is still a good idea.

This post isn’t really about Bush. It’s about our human inability to perceive the flow of time and accurately remember the past and prepare for the future.

I recently texted a fellow mom: Would your kid like to come play with my kid? She texted back: My kid is down for a nap.


What about when the nap is over? I didn’t specify a time in the original text; tomorrow or next week would have been fine.

I don’t think these folks are trying to avoid me. They’re just really bad at scheduling.

People are especially bad at projecting current trends into the future. In a conversation with a liberal friend, he dismissed the idea that there could be any problems with demographic trends or immigration with, “That won’t happen for a hundred years. I’ll be dead then. I don’t care.”

An anthropologist working with the Bushmen noticed that they had to walk a long way each day between the watering hole, where the only water was, and the nut trees, where the food was. “Why don’t you just plant a nut tree near the watering hole?” asked the anthropologist.

“Why bother?” replied a Bushman. “By the time the tree was grown, I’d be dead.”

Of course, the tree would probably only take a decade to start producing, which is within even a Bushman’s lifetime, but even if it didn’t, plenty of people build up wealth, businesses, or otherwise make provisions to provide for their children–or grandchildren–after their deaths.

Likewise, current demographic trends in the West will have major effects within our lifetimes. Between the 1990 and 2010 censuses (twenty years), the number of Hispanics in the US more than doubled, from 22.4 million to 50.5 million. As a percent of the overall population, they went from 9% to 16%–making them America’s largest minority group, as blacks constitute only 12.6%.
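If you want to check the arithmetic yourself, here is a minimal back-of-the-envelope sketch in Python using only the census figures quoted above. This is naive compound-growth arithmetic for illustration, not a demographic forecast.

```python
# Back-of-the-envelope arithmetic on the census figures quoted above.
# Illustrative only: naive compound growth, not a forecast.
import math

hispanic_1990 = 22.4e6   # 1990 census
hispanic_2010 = 50.5e6   # 2010 census
years = 20

# Implied compound annual growth rate over those two decades
annual_growth = (hispanic_2010 / hispanic_1990) ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.1%}")   # roughly 4% per year

# Doubling time at that rate -- well within a single lifetime
doubling_time = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time: {doubling_time:.0f} years")     # roughly 17 years
```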

If you’re a Boomer, then Hispanics were only 2-3% of the country during your childhood.

The idea that demographic changes will take a hundred years and therefore don’t matter makes as much sense as saying a tree that takes ten years to grow won’t produce within your lifetime and therefore isn’t worth planting.

Society can implement long-term plans–dams are built with hundred-year storms and floods in mind; building codes are written with hundred-year earthquake risks in mind–but most people seem to exist in a strange time warp in which neither the past nor future really exist. What they do know about the past is oddly compressed–anything from a decade to a century ago is mushed into a vague sense of “before now.” Take this article from the Atlantic on how Michael Brown (born in 1996,) was shot in 2014 because of the FHA’s redlining policies back in 1943.

I feel like I’m beating a dead horse at this point, but one of the world’s most successful ethnic groups was getting herded into gas chambers in 1943. Somehow the Jews managed to go from being worked to death in the mines below Buchenwald (slave labor dug the tunnels where von Braun’s rockets were developed) to not getting shot by the police on the streets of Ferguson in 2014, 71 years later. It’s a mystery.

And in another absurd case, “Artist reverses gender roles in 50s ads to ‘give men a taste of their own sexist poison’,” because clearly advertisements from over half a century ago are a pressing issue, relevant to the opinions of modern men.

I’m focusing here on political matters because they make the news, but I suspect this is a true psychological trait for most people–the past blurs fuzzily together, and the future is only vaguely knowable.

Politically, there is a tendency to simultaneously assume the past–which continued until last Tuesday–was a long, dark, morass of bigotry and unpleasantness, and that the current state of enlightened beauty will of course continue into the indefinite future without any unpleasant expenditures of effort.

In reality, our species is, more or less, 300,000 years old. Agriculture is only 10,000 years old.

100 years ago, the last great bubonic plague epidemic (Yersinia pestis) was still going on. 10 million people died, including 119 Californians. 75 years ago, millions of people were dying in WWII. Sixty years ago, polio was still crippling children (my father caught it, suffering permanent nerve damage.)

In the 1800s, Germany’s infant mortality rate was 50%; in 1950, Europe’s rate was over 10%; today, infant mortality in the developed world is below 0.5%; globally, it’s 4.3%. The death of a child has gone from a universal hardship to an almost unknown suffering.

100 years ago, only one city in the US–Jersey City–routinely disinfected its drinking water. (Before disinfection and sewers, drinking water was routinely contaminated with deadly pathogens like cholera.) I’m still looking for data on the spread of running water, but chances are good your grandparents did not have an indoor toilet when they were children. (I have folks in my extended family who still have trouble when the water table drops and their well dries up.)

Hunger, famines, disease, death… I could continue enumerating, but my point is simple: the prosperity we enjoy is not only unprecedented in the course of human history, but it hasn’t even existed for one full human lifetime.

Rome was once an empire. In the year one hundred, the Eternal City had over 1,500,000 citizens. By 500, it had fewer than 50,000. It would not recover for over a thousand years.

Everything we have can be wiped away in another human lifetime if we refuse to admit that the future exists.


The Facsimile of Meaning

Most of the activities our ancestors spent the majority of their time on have been automated or largely replaced by technology. Chances are good that the majority of your great-great-grandparents were farmers, but few of us today hunt, gather, plant, harvest, or otherwise spend our days physically producing food; few of us will ever build our own houses or even sew our own clothes.

Evolution has (probably) equipped us with neurofeedback loops that reward us for doing the sorts of things we need to do to survive, like hunt down prey or build shelters (even chimps build nests to sleep in,) but these are precisely the activities that we have largely automated and replaced. The closest analogues to these activities are now shopping, cooking, exercising, working on cars, and arts and crafts. (Even warfare has been largely replaced with professional sports fandom.)

Society has invented vicarious thrills: Books, movies, video games, even roller coasters. Our ability to administer vicarious emotions appears to be getting better and better.

And yet, it’s all kind of fake.

Exercising, for example, is in many ways a pointless activity–people literally buy machines so they can run in place. But if you have a job that requires you to be sedentary for most of the day and don’t fancy jogging around your neighborhood after dark, running in place inside your own home may be the best option you have for getting the endorphin hit that evolution designed you to crave after running down prey.

A sedentary lifestyle with supermarkets and restaurants deprives us of that successful-hunting endorphin hit and offers us no logical reason to go out and get it. But without that exercise, not only our physical health, but our mental health appears to suffer. According to the Mayo Clinic, exercise effectively decreases depression and anxiety–in other words, depression and anxiety may be caused in part by lack of exercise.

So what do we do? We have to make up some excuse and substitute faux exercise for the active farming/gardening/hunting/gathering lifestyles our ancestors lived.

By the way, about 20% of Americans are on psychiatric medications of some sort, [warning PDF] of which anti-depressants are one of the most commonly prescribed:

Overall, the number of Americans on medications used to treat psychological and behavioral disorders has substantially increased since 2001; more than one-in-five adults was on at least one of these medications in 2010, up 22 percent from ten years earlier. Women are far more likely to take a drug to treat a mental health condition than men, with more than a quarter of the adult female population on these drugs in 2010 as compared to 15 percent of men.

Women ages 45 and older showed the highest use of these drugs overall. …

The trends among children are opposite those of adults: boys are the higher utilizers of these medications overall but girls’ use has been increasing at a faster rate.

This is mind-boggling. 1 in 5 of us is mentally ill, (supposedly,) and the percent for young women in the “prime of their life” years is even higher. (The rates for Native Americans are astronomical.)

Lack of exercise isn’t the only problem, but I wager a decent chunk of it is that our lives have changed so radically over the past 100 years that we are critically lacking various activities that used to make us happy and provide meaning.

Take the rise of atheism. Irrespective of whether God exists or not, many functions–community events, socializing, charity, morality lessons, etc–have historically been done by religious groups. Atheists are working on replacements, but developing a full system that works without the compulsion of religious belief may take a long while.

Sports and video games replace war and personal competition. TV sitcoms replace friendship. Twitter replaces real life conversation. Politics replace friendship, conversation, and religion.

There’s something silly about most of these activities, and yet they seem to make us happy. I don’t think there’s anything wrong with enjoying knitting, even if you’re making toy octopuses instead of sweaters. Nor does there seem to be anything wrong with enjoying a movie or a game. The problem comes when people get addicted to these activities, which may be increasingly likely as our ability to make fake activities–like hyper-realistic special effects in movies–increases.

Given modernity, should we indulge? Or can we develop something better?

Gay marriage didn’t win; traditional marriage lost

From the evolutionist point of view, the point of marriage is the production of children.

Let’s quickly analogize to food. Humans have a tremendous variety of customs, habits, traditions, and taboos surrounding foods. Foods enjoyed in one culture, like pork, crickets, and dog, are regarded as disgusting, immoral, or forbidden in another. Cheese is, at heart, rotten vomit–the enzyme (rennet) used to make cheese coagulate is traditionally extracted from a calf’s stomach lining–and yet the average American eats it eagerly.

Food can remind you of your childhood, the best day of your life, the worst day of your life. It can comfort the sick and the mourning, and it accompanies our biggest celebrations of life.


We eat comfort food, holiday food, even sacrificial food. We have decadent luxuries and everyday staples. Some people, like vegans and ascetics, avoid large classes of food generally eaten by their own society for moral reasons.

People enjoy soda because it has water and calories, but some of us purposefully trick our taste buds by drinking Diet Coke, which delivers the sensation of drinking calories without the calories themselves. We enjoy the taste of calories even when we don’t need any more.

But the evolutionary purpose of eating is to get enough calories and nutrients to survive. If tomorrow we all stopped needing to eat–say, we were all hooked into a Matrix-style click-farm in which all nutrients were delivered automatically via IV–all of the symbolic and emotional content attached to food would wither away.

The extended helplessness of human infants is unique in the animal kingdom. Even elephants, who gestate for an incredible two years and become mature at 18, can stand and begin walking around shortly after birth. Baby elephants are not raised solely by their mothers, as baby rats are, but by an entire herd of related female elephants.

Elephants are remarkable animals, clever, communicative, and caring, who mourn their dead and create art.

But from the evolutionist point of view, the point of elephants’ family systems is still the production of elephant children.

Love is a wonderful, sweet, many-splendored thing, but the purpose of marriage, in all its myriad forms–polygamy, monogamy, polyandry, serial monogamy–is still the production of children.

There are a few societies where marriage as we know it is not really practiced because people depend on alternative kin networks or women can largely provide for themselves. For example, 70% of African American children are born out of wedlock; and among the avuncular Apache:

In the Southwest United States, the Apache tribe practices a form of this, where the uncle is responsible for teaching the children social values and proper behavior while inheritance and ancestry is reckoned through the mother’s family alone. (Modern day influences have somewhat but not completely erased this tradition.)

source: BBC News

Despite the long public argument over the validity of gay marriage, very few gay people actually want to get married. Gallup reports that after the Obergefell v. Hodges ruling, the percent of married gay people jumped quickly from 7.9% to 9.5%, but then leveled off, rising to only 9.6% by June 2016.

In contrast, 46% of US adults are married.

Even this number, though, is in sharp decline: in 1960, 72% of adults were married; by 2010, only 51% were.

The situation is similar throughout the Western world. Only 51% of Brits are married. In Italy, the crude marriage rate (the number of new marriages per 1,000 people), has fallen from 7.35 in 1970 to only 4.21 in 2007. Only 58.9% of Japanese are married.

Declining marriage rates across the developed world have been accompanied by declining fertility rates and rising illegitimacy rates:

Graph: children per woman, 1960–2009, in the USA, China, India, Germany, and Russia.
H/T: Share of Births to Unmarried Mothers by Race

As Wikipedia notes:

Only 2% of [Japanese] births occur outside of marriage[35] (compared to 30-60% in Europe and North America) due to social taboos, legal pressure, and financial hurdles.[32] Half of Japan’s single mothers live below the poverty line, among the highest for OECD countries.[36][37][38][39]

In other words, the Japanese welfare state, while generous, does not encourage single motherhood. Wikipedia also provides a discussion of the causes of declining Japanese marriage rates:

The annual number of marriages has dropped since the early 1970s, while divorces have shown a general upward trend.[29] …

The decline of marriage in Japan, as fewer people marry and do so later in life, is a widely cited explanation for the plummeting birth rate.[29][30][31][32] Although the total fertility rate has dropped since the 1970s (to 1.43 in 2013[33]), birth statistics for married women have remained fairly constant (at around 2.1) and most married couples have two or more children. Economic factors, such as the cost of raising a child, work-family conflicts, and insufficient housing, are the most common reasons for young mothers (under 34) to have fewer children than desired. …

Between 1990 and 2010, the percentage of 50-year-old people who had never married roughly quadrupled for men to 20.1% and doubled for women to 10.6%.[41][42] The Welfare Ministry predicts these numbers to rise to 29% of men and 19.2% of women by 2035.[43] The government’s population institute estimated in 2014 that women in their early 20s had a one-in-four chance of never marrying, and a two-in-five chance of remaining childless.[44]

Recent media coverage has sensationalized surveys from the Japan Family Planning Association and the Cabinet Office that show a declining interest in dating and sexual relationships among young people, especially among men.[44][45][46] However, changes in sexuality and fertility are more likely an outcome of the decline in family formation than its cause.[47][48] Since the usual purpose of dating in Japan is marriage, the reluctance to marry often translates to a reluctance to engage in more casual relationships.[30]

In other words, marriage is functionally about providing a supportive way of raising children. In societies where birth control did not exist, children born out of wedlock tended not to survive, and people could easily get jobs to support their families, people tended to get married and have children. In a society where people do not want children, cannot afford them, are purposefully delaying childbearing as long as possible, or have found ways to provide for them without getting married, people simply see no need for marriage.

“Marriage” ceases to mean what it once did, becoming something reserved for old-fashioned romantics and the few lucky enough to afford it.

Mass acceptance of gay marriage did change how people think of marriage, but it’s downstream from what the massive, society-wide decrease in child-bearing and increase in illegitimacy have done to our ideas about marriage.

Musical Mystery

Singer Tom Jones, famous recipient of ladies’ panties

There are three categories of superstars who seem to attract excessive female interest. The first is actors, who of course are selected for being abnormally attractive and put into romantic and exciting narratives that our brains subconsciously interpret as real. The second is sports stars and other athletes, whose ritualized combat and displays of strength obviously indicate their genetic “fitness” for siring and providing for children.

The third and strangest category is professional musicians, especially rock stars.

I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?

There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display musical tastes on dating profiles in order to meet musically-like-minded people.

Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as, “Anything but rap,” or “Anything but country”?

Mick Jagger and Chuck Berry

At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids and the aforementioned rap and country fans.

Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along with other slow jazz fans better than fans of Indian classical music? Or is this all compounded by different ethnic groups identifying with specific musical styles?

Obviously country correlates with Amerikaner ancestry; rap with African American. I’m not sure which ancestry makes up the biggest fans of Die Antwoord. Heavy Metal is popular in Fennoscandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.

While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.

Here are two excerpts on the subject of music and dance in African societies:

source: A Voyage to Senegal: The Isle of Gorée, and the River Gambia by Michel Adanson, Correspondent of the Royal Academy of Sciences


source: Africana: The Encyclopedia of the African and African American Experience, Vol. 1 (Aardvark–Catholic)
Elvis’s pelvis, considered too sexy for TV

Both of these h/t HBD Chick and my apologies in advance if I got the sources reversed.

One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.

If traditional African dancing involves more variety in rhythmic expression than traditional European, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people chose their own mates, it seems like they display a preference for athletic or musically talented mates (“sexy” mates;) when parents chose mates, they seem to prefer hard-working, devout, “good providers.”

Natasha Rostova and Andrei Bolkonsky, from War and Peace by Tolstoy

Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.

It is undoubtedly true that music stirs the soul and speaks to the heart, but why?


Is Racism an Instinct?

Everyone is a little bit racist–Hillary Clinton

If everyone in the world exhibits a particular behavior, chances are it’s innate. But I have been informed–by Harvard-educated people, no less–that humans do not have instincts. We are so smart, you see, that we don’t need instincts anymore.

This is nonsense, of course.

One amusing and well-documented human instinct is the nesting instinct, experienced by pregnant women shortly before going into labor. (As my father put it, “When she starts rearranging the furniture, get ready to head to the hospital.”) Having personally experienced this sudden, overwhelming urge to CLEAN ALL THE THINGS multiple times, I can testify that it is a real phenomenon.

Humans have other instincts–babies will not only pick up and try to eat pretty much anything they run across, to every parent’s consternation, but they will also crawl right up to puddles and attempt to drink out of them.

But we’re getting ahead of ourselves: What, exactly, is an instinct? According to Wikipedia:

Instinct or innate behavior is the inherent inclination of a living organism towards a particular complex behavior. The simplest example of an instinctive behavior is a fixed action pattern (FAP), in which a very short to medium length sequence of actions, without variation, are carried out in response to a clearly defined stimulus.

Any behavior is instinctive if it is performed without being based upon prior experience (that is, in the absence of learning), and is therefore an expression of innate biological factors. …

Instincts are inborn complex patterns of behavior that exist in most members of the species, and should be distinguished from reflexes, which are simple responses of an organism to a specific stimulus, such as the contraction of the pupil in response to bright light or the spasmodic movement of the lower leg when the knee is tapped.

The go-to example of an instinct is the gosling’s imprinting instinct. Typically, goslings imprint on their mothers, but a baby gosling doesn’t actually know what its mother is supposed to look like, and can accidentally imprint on other random objects, provided they are moving slowly around the nest around the time the gosling hatches.

Stray dog nursing kittens

Here we come to something I think may be useful for distinguishing an instinct from other behaviors: an instinct, once triggered, tends to keep going even if it has been accidentally or incorrectly triggered. Goslings look like they have an instinct to follow their mothers, but they actually have an instinct to imprint on the first large, slowly moving object near their nest when they hatch.

So if you find people strangely compelled to do something that makes no sense but which everyone else seems to think makes perfect sense, you may be dealing with an instinct. For example, women enjoy celebrity gossip because humans have an instinct to keep track of social ranks and dynamics within their own tribe; men enjoy watching other men play sports because it conveys the vicarious feeling of defeating a neighboring tribe at war.

So what about racism? Is it an instinct?

Strictly speaking–and I know I have to define racism, just a moment–I don’t see how we could have evolved such an instinct. Races exist because major human groups were geographically separated for thousands of years–prior to 1492, the average person never even met a person of another race in their entire life. So how could we evolve an instinct in response to something our ancestors never encountered?

Unfortunately, “racism” is a chimera, always changing whenever we attempt to pin it down, but the Urban Dictionary gives a reasonable definition:

An irrational bias towards members of a racial background. The bias can be positive (e.g. one race can prefer the company of its own race or even another) or it can be negative (e.g. one race can hate another). To qualify as racism, the bias must be irrational. That is, it cannot have a factual basis for preference.

Of course, instincts exist because they ensured our ancestors’ survival, so if racism is an instinct, it can’t exactly be “irrational.” We might call a gosling who follows a scientist instead of its mother “irrational,” but this is a misunderstanding of the gosling’s motivation. Since “racist” is a term of moral judgment, people are prone to defending their actions/beliefs towards others on the grounds that it can’t possibly be immoral to believe something that is actually true.

The claim that people are “racist” against members of other races implies, in converse, that they exhibit no similar behaviors toward members of their own race. But even the most perfunctory overview of history reveals people acting in extremely “racist” ways toward members of their own race. During the Anglo-Boer wars, the English committed genocide against the Dutch South Africans (Afrikaners.) During WWII, Germans allied with the Japanese and slaughtered their neighbors, Poles and Jews. (Ashkenazim are genetically Caucasian and half Italian.) If Hitler were really racist, he’d have teamed up with Stalin and Einstein–his fellow whites–and dropped atomic bombs on Hiroshima. (And for their part, the Japanese would have allied with the Chinese against the Germans.)

Some quotes from the New Scientist article:

The murder victim, a West African chimpanzee called Foudouko, had been beaten with rocks and sticks, stomped on and then cannibalised by his own community. …

“When you reverse that and have almost two males per every female — that really intensifies the competition for reproduction. That seems to be a key factor here,” says Wilson.

Jill Pruetz at Iowa State University, who has been studying this group of chimpanzees in south-eastern Senegal since 2001, agrees. She suggests that human influence may have caused this skewed gender ratio that is likely to have been behind this attack. In Senegal, female chimpanzees are poached to provide infants for the pet trade. …

Early one morning, Pruetz and her team heard loud screams and hoots from the chimps’ nearby sleep nest. At dawn, they found Foudouko dead, bleeding profusely from a bite to his right foot. He also had a large gash in his back and a ripped anus. Later he was found to have cracked ribs. Pruetz says Foudouko probably died of internal injuries or bled out from his foot wound.

Foudouko also had wounds on his fingers. These were likely to have been caused by chimps clamping them in their teeth to stretch his arms out and hold him down during the attack, says Pruetz.

After his death, the gang continued to abuse Foudouko’s body, throwing rocks and poking it with sticks, breaking its limbs, biting it and eventually eating some of the flesh.

“It was striking. The female that cannibalised the body the most, she’s the mother of the top two high-ranking males. Her sons were the only ones that really didn’t attack the body aggressively,” Pruetz says …

Historically, the vast majority of wars and genocides were waged by one group of people against their neighbors–people they were likely to be closely related to in the grand scheme of things–not against distant peoples they’d never met. If you’re a chimp, the chimp most likely to steal your banana is the one standing right in front of you, not some strange chimp you’ve never met before who lives in another forest.

Indeed, in Jane Goodall’s account of the Gombe Chimpanzee War, the combatants were not members of two unrelated communities that had recently encountered each other, but members of a single community that had split in two. Chimps who had formerly lived peacefully together, groomed each other, shared bananas, etc., now bashed each other’s brains out and cannibalized their young. Poor Jane was traumatized.

I think there is an instinct to form in-groups and out-groups. People often have multiple defined in-groups (“I am a progressive, a Christian, a baker, and a Swede,”) but one of these identities generally trumps the others in importance. Ethnicity and gender are major groups most people seem to have, but I don’t see a lot of evidence suggesting that the grouping of “race” is uniquely special, globally, in people’s ideas of in- and out-.

For example, as I am writing today, people are concerned that Donald Trump is enacting racist policies toward Muslims, even though “Muslim” is not a race and most of the countries targeted by Trump’s travel/immigration ban are filled with fellow Caucasians, not Sub-Saharan Africans or Asians.

Race is a largely American obsession, because our nation (like the other North and South American nations,) has always had whites, blacks, and Asians (Native Americans). But many countries don’t have this arrangement. Certainly Ireland didn’t have an historical black community, nor Japan a white one. Irish identity was formed in contrast to English identity; Japanese in contrast to Chinese and Korean.

Only in the context where different races live in close proximity to each other does it seem that people develop strong racial identities; otherwise people don’t think much about race.

Napoleon Chagnon, a white man, has spent years living among the Yanomamo, one of the world’s most murderous tribes, folks who go and slaughter their neighbors and neighbors’ children all the time, and they still haven’t murdered him.

Why do people insist on claiming that Trump’s “Muslim ban” is racist when Muslims aren’t a race? Because Islam is an identity group that appears to function similarly to race, even though Muslims come in white, black, and Asian.

If you’ve read any of the comments on my old post about Turkic DNA, Turkey: Not very Turkic, you’ll have noted that Turks are quite passionate about their Turkic identity, even though “Turkic” clearly doesn’t correspond to any particular ethnic group. (It’s even more mixed up than Jewish identity, and that’s a pretty mixed-up one after thousands of years of inter-breeding with non-Jews.)

Group identities are fluid. When threatened, groups merge. When resources are abundant and times are good, groups split.

What about evidence that infants identify–stare longer at–faces of people of different races than their parents? This may be true, but all it really tells us is that babies are attuned to novelty. It certainly doesn’t tell us that babies are racist just because they find people interesting who look different from the people they’re used to.

What happens when people encounter others of a different race for the first time?

We have many accounts of “first contacts” between different races during the Age of Exploration. For example, when escaped English convict William Buckley wandered into an uncontacted Aborigine tribe, they assumed he was a ghost, adopted him, taught him to survive, and protected him for 30 years. By contrast, the last guy who landed on North Sentinel Island and tried to chat with the natives there got a spear to the chest and a shallow grave for his efforts. (But I am not certain the North Sentinelese haven’t encountered outsiders at some point.)

But what about the lunchroom seating habits of the wild American teenager?

If people have an instinct to form in-groups and out-groups, then races (or religions?) may represent the furthest bounds of this, at least until we encounter aliens. All else held equal, perhaps we are most inclined to like the people most like ourselves, and least inclined to like the people least like ourselves–racism would thus be the strongest manifestation of this broader instinct. But what about people who have a great dislike for one race, but seem just fine with another, e.g., a white person who likes Asians but not blacks, or a black person who likes Asians but not whites? And can we say–per our definition above–that these preferences are irrational, or are they born of some lived experience of positive or negative interactions?

Again, we are only likely to have strong opinions about members of other races if we are in direct conflict or competition with them. Most of the time, people are in competition with their neighbors, not people on the other side of the world. I certainly don’t sit here thinking negative thoughts about Pygmies or Aborigines, even though we are very genetically distant from each other, and I doubt they spend their free time thinking negatively about me.

Just because flamingos prefer to flock with other flamingos doesn’t mean they dislike horses; for the most part, I think people are largely indifferent to folks outside their own lives.

Capitalism Wins

A recent article in Stanford Magazine highlighted the work of psychologist Richard LaPiere. Back in 1931, LaPiere, a Chinese student of his, and his student’s Chinese wife drove cross-country, visiting 250 hotels and restaurants.

One business refused them service, presumably because of race.

Then LaPiere sent surveys to the businesses they’d visited (plus controls) asking if they served Chinese people. The businesses responded:

235 said NO,

18 said maybe,

and only 2 said YES.

Basically the complete opposite of reality.

Social signalling is cheap; losing actual customers on the ground is expensive.

People today still say whatever they think will gain them approval, though our politics have changed a lot since 1931. For example, 89% of people these days report being willing to marry someone of another race:


but of marriages conducted in 2013, only 12% actually were. By contrast, a similar number of people said they would be unhappy about a cross-political marriage in their family:


but about 30% of (all) married people (in the 30 states that track party affiliation) are in a cross-ideological marriage.

Likewise, recall that much of the poll data coming out before the 2016 Election showed Hillary Clinton winning and Donald Trump losing.

Open Thread, Comment of the Week, etc.

Hello my friends. How has your week been?

I thought this was really interesting:


Since I don’t watch much TV that doesn’t involve Thomas the Tank Engine, I’ve never seen John Oliver and don’t really know who he is, but I am generally aware of Colbert and the Daily Show and such.

Comments of the week go to SFC Ton and Rhetocrates:

“I have seen a lot of failed nation states, up close and personal over the years. They always break down over tribal/racial lines. …

It is inevitable that the usa will go through some version of Yugoslavia. The question is when and to what degree.” — SFC Ton

“I don’t think we’re going to turn into Yugoslavia. I think we’re going to turn into Syria.

The main difference is that Yugoslavia was already mostly segregated when the violence broke out. …

The US (speaking here of the largest segments of the population…) are not ethnically segregated.

… our dissolution – if not stopped – will look like Syria.” — Rhetocrates



The liberal solution to ethnic breakdown is “Stop being racist.” The conservative solution is “Avoid people you dislike.” Both solutions kind of work–until they don’t.

Hrm. Any interesting articles this week? How about a somewhat speculative but still very interesting reconstruction of an ancient Greek warrior’s face, plus a discussion of his grave goods?

One of my relatives died this week, so I’m going to go be sad, now. Please, if you have any fights with your relatives, try to make up if you can before they die. Sometimes people die a lot younger than you think they will.

And don’t let all of this election bullshittery drive you apart. Just don’t.


Sticky Brains and Forgiveness

(Warning: this post is based on personal, entirely anecdotal observations of other humans.)

I interact, on a fairly regular basis, with people from a wide range of backgrounds: folks who’ve spent decades living on the streets; emotionally disabled folks and folks who were emotionally traumatized but recovered; working, middle, and upper class folks.

“Functionality” may not be the easiest term to define, but you know it when you see it: people who manage to pick up the pieces when bad shit happens and continue on with their lives. Non-functionality does not automatically make you poor, (nor does functionality make you rich,) but it is often a major contributing factor.

I’m not going to claim that we all go through equal amounts of trauma; certainly some of us, like infants who were dropped on their heads, have truly shitty lives. Still, almost all of us endure at least some trauma, and there is great variation in our responses to the tragedies we endure.

Among the people I know personally, I’ve noticed that the less-functional tend to have “sticky brains.” When trauma happens, they glom onto it and get stuck. Years, sometimes decades later, you hear these people still talking about things other people did to them.

For example: two people I know (we’ll call them Foxtrot and Golf, following my alias convention,) had rough childhoods. Foxtrot is still quite bitter over things that happened over 50 years ago, committed by relatives who are long dead. He is also bitter about things that happened recently; I often hear about very minor conflicts that normal people would just be angry about for a day or two, but that Foxtrot is still losing sleep over a month later. Unsurprisingly, he is an unstable emotional wreck with no job, a string of divorces, and virtually no contact with his family.

Golf’s childhood was, by all objective measures, far worse than Foxtrot’s. But Golf doesn’t talk much about his childhood and is today a functional person. When bad things happen to Golf, he deals with them, he might get angry, and then he finishes with them and puts them aside. He has his bad spells–times when things are going badly and he gets really depressed. He also has his good times. But he has managed to keep himself together well enough, even through these bad times, to stay married and employed (to the same person and at the same job, for decades); he is in contact with most of his family and enjoys a decent reputation in the community.

The homeless people I interact with also have “sticky brains.” When bad things happen to them (and, yes, being homeless is like a permanent bad thing happening to you,) they get really focused on that bad thing. For example, one homeless woman I know has worried for decades about a possible indiscretion she might have committed back in high school–it is a very minor thing, of less importance than copying a few answers on a math test, but she is still worried that she is a cheater and a dishonest member of society. Another is fixated on a bad interaction with an aid worker that happened over a year ago. Most people would say, “yeah, that guy was a jerk,” and then stop worrying about it after a week or so; in this case, the hurt is reviewed and re-felt almost every day.

And, of course, I have many personal friends who’ve endured or dealt with traumas in their own more or less useful ways. (Not to mention the various ups and downs of my own life.)

Because trauma is common–some, like the death of a loved one, strike almost everyone who makes it to adulthood–societies tend to adopt guidelines for trauma response, such as a funeral for the dead followed by a six-month mourning period for widows, official days of mourning or remembrance for people who died in wars, therapy and anti-depressants, confession and forgiveness, head-hunting (among head-hunters), or sympathy cards among the less violently inclined. My own family has a tradition of visiting the graveyard where many of our older relatives are buried once a year and cleaning the gravestones. (The children have a tradition of pretending to be zombies.)

Anthropologists like to call these things “rituals” and “customs.” Different societies have different customs, but all of the ones listed exist for the purpose of helping people cope with trauma and grief. (Or at least, that’s what the head-hunters claimed.)

Watching people attempt to cope with life has made me appreciate (most of) these customs. “Six months of mourning,” may seem arbitrary, but it is also pretty useful: it dictates that yes, it is very normal to feel terrible for a while and everyone will be understanding of that, but now the time has passed and it is time to get on with life.

Christianity and Judaism (and probably other religions) command forgiveness:

Do not seek revenge or bear a grudge against anyone among your people, but love your neighbor as yourself. I am the LORD. — Leviticus 19: 18

Then Peter came to Jesus and asked, “Lord, how many times shall I forgive my brother when he sins against me? Up to seven times?” Jesus answered, “I tell you, not seven times, but seventy-seven times.” — Matthew 18: 21-22

This is ostensibly for practical reasons:

For if you forgive men when they sin against you, your heavenly Father will also forgive you. But if you do not forgive men their sins, your Father will not forgive your sins. — Matthew 6:14-15

On Yom Kippur, Jews observe a tradition of forgiving others and asking forgiveness for themselves. (It is not surprising that forgiveness should be handled similarly in two religions that share much of their scriptures; Christianity seems to differ primarily in making the institution of forgiveness a more personal matter rather than an annual ritual.)

I’m pretty sure forgiveness is a big deal in Buddhism, as well, but I don’t know much about Hinduism and other belief systems, so I can’t comment on them.

But why should God require forgiveness? It seems rather unfair to say to someone who was raped as a child and has done nothing worse than tell a few lies in their life, “If you don’t forgive your rapist, God won’t forgive you for lying.”

But this assumes that forgiveness exists for the forgiven. In some cases, of course, it does. But forgiveness also serves a function for the forgiver. I shall leave the concept of spiritual purity to the spiritual; as a practical matter, forgiveness allows the hurt party to stop focusing on their pain and resume life. Most people do this fairly naturally, but some of us need a bit of encouragement–and perhaps ritual focus and faith–to heal.

Blessed are the poor in spirit,
    for theirs is the kingdom of heaven.
Blessed are those who mourn,
    for they will be comforted.
Blessed are the meek,
    for they will inherit the earth.
Blessed are those who hunger and thirst for righteousness,
    for they will be filled.
Blessed are the merciful,
    for they will be shown mercy.
Blessed are the pure in heart,
    for they will see God.
Blessed are the peacemakers,
    for they will be called children of God.
Blessed are those who are persecuted because of righteousness,
    for theirs is the kingdom of heaven.

“Blessed are you when people insult you, persecute you and falsely say all kinds of evil against you because of me. Rejoice and be glad, because great is your reward in heaven, for in the same way they persecuted the prophets who were before you. — Matthew 5: 3-12

I don’t think the point of this is that it is morally superior to be insulted or hurt or poor, but to reassure and comfort those who have been.

Weight, Taste, and Politics: A Theory of Republican Over-Indulgence

So I was thinking about taste (flavor) and disgust (emotion.)

As I mentioned about a month ago, 25% of people are “supertasters,” that is, better at tasting than the other 75% of people. Supertasters experience flavors more intensely than ordinary tasters, resulting in a preference for “bland” food (food with too much flavor is “overwhelming” to them.) They also have a more difficult time getting used to new foods.

One of my work acquaintances of many years–we’ll call her Echo–is obese, constantly on a diet, and constantly eats sweets. She knows she should eat vegetables and tries to do so, but finds them bitter and unpleasant, and so the general outcome is as you expect: she doesn’t eat them.

Since I find most vegetables quite tasty, I find this attitude very strange–but I am willing to admit that I may be the one with unusual attitudes toward food.

Echo is also quite conservative.

This got me thinking about vegetarians vs. people who think vegetarians are crazy. Why (aside from novelty of the idea) should vegetarians be liberals? Why aren’t vegetarians just people who happen to really like vegetables?

What if there were something in preference for vegetables themselves that correlated with political ideology?

Certainly we can theorize that “supertaster” => “vegetables taste bitter” => “dislike of vegetables” => “thinks vegetarians are crazy.” (Some supertasters might think meat tastes bad, but anecdotal evidence doesn’t support this.) See also Wikipedia, where supertasting is clearly associated with responses to plants:

Any evolutionary advantage to supertasting is unclear. In some environments, heightened taste response, particularly to bitterness, would represent an important advantage in avoiding potentially toxic plant alkaloids. In other environments, increased response to bitterness may have limited the range of palatable foods. …

Although individual food preference for supertasters cannot be typified, documented examples for either lessened preference or consumption include:

Mushrooms? Echo was just complaining about mushrooms.

Let’s talk about disgust. Disgust is an important reaction to things that might infect or poison you, triggering reactions from scrunching up your face to vomiting (i.e., expelling the poison.) We process disgust in our amygdalas, and some people appear to have bigger or smaller amygdalas than others, with the result that the folks with bigger amygdalas feel more disgust.

Humans also route a variety of social situations through their amygdalas, resulting in the feeling of “disgust” in response to things that are not rotten food, like other people’s sexual behaviors, criminals, or particularly unattractive people. People with larger amygdalas also tend to find more human behaviors disgusting, and this disgust correlates with social conservatism.

To what extent are “taste” and “disgust” independent of each other? I don’t know; perhaps they are intimately linked into a single feedback system, where disgust and taste sensitivity cause each other, or perhaps they are relatively independent, so that a few unlucky people are both super-sensitive to taste and easily disgusted.

People who find other people’s behavior disgusting and off-putting may also be people who find flavors overwhelming, prefer bland or sweet foods over bitter ones, think vegetables are icky, vegetarians are crazy, and struggle to stay on diets.

What’s that, you say, I’ve just constructed a just-so story?

Well, this is the part where I go looking for evidence. It turns out that obesity and political orientation do correlate:

Michael Shin and William McCarthy, researchers from UCLA, have found an association between counties with higher levels of support for the 2012 Republican presidential candidate and higher levels of obesity in those counties.

Shin and McCarthy’s map of obesity vs. political orientation

Looks like the Mormons and Southern blacks are outliers.

(I don’t really like maps like this for displaying data; I would much prefer a simple graph showing orientation on one axis and obesity on the other, with each county as a datapoint.)
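For what it’s worth, here is a minimal Python sketch of the kind of graph I mean. The file name and column names (county_data.csv, gop_vote_share, obesity_rate) are hypothetical placeholders, since Shin and McCarthy’s county-level data isn’t something I have on hand; this is just what the plot would look like in code.

```python
# Hypothetical sketch of the county-level scatter plot described above.
# "county_data.csv" and its column names are placeholders, not a real dataset.
import pandas as pd
import matplotlib.pyplot as plt

counties = pd.read_csv("county_data.csv")  # one row per county (hypothetical file)

plt.scatter(counties["gop_vote_share"],    # e.g. 2012 Republican vote share, 0-1
            counties["obesity_rate"],      # e.g. adult obesity rate, 0-1
            s=5, alpha=0.4)
plt.xlabel("Republican vote share (2012)")
plt.ylabel("Adult obesity rate")
plt.title("Obesity vs. political orientation, one point per county")
plt.show()
```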

(Unsurprisingly, the first 49 hits I got when searching for correlations between political orientation and obesity were almost all about what other people think of fat people, not what fat people think. This is probably because researchers tend to be skinny people who want to fight “fat phobia” but aren’t actually interested in the opinions of fat people.)

The 15 most caffeinated cities, from I love Coffee–note that Phoenix is #7, not #1.

Disgust also correlates with political belief, but we already knew that.

A not entirely scientific survey also indicates that liberals seem to like vegetables better than conservatives:

  • Liberals are 28 percent more likely than conservatives to eat fresh fruit daily, and 17 percent more likely to eat toast or a bagel in the morning, while conservatives are 20 percent more likely to skip breakfast.
  • Ten percent of liberals surveyed indicated they are vegetarians, compared with 3 percent of conservatives.
  • Liberals are 28 percent more likely than conservatives to enjoy beer, with 60 percent of liberals indicating they like beer.

(See above where Wikipedia noted that supertasters dislike beer.) I will also note that coffee, which supertasters tend to dislike because it is too bitter, is very popular in the ultra-liberal cities of Portland and Seattle, whereas heavily sweetened iced tea is practically the official beverage of the South.

The only remaining question is if supertasters are conservative. That may take some research.

Update: I have not found, to my disappointment, a simple study that just looks at correlation between ideology and supertasting (or nontasting.) However, I have found a couple of useful items.

In Verbal priming and taste sensitivity make moral transgressions gross, Herz writes:

Standard tests of disgust sensitivity, a questionnaire developed for this research assessing different types of moral transgressions (nonvisceral, implied-visceral, visceral) with the terms “angry” and “grossed-out,” and a taste sensitivity test of 6-n-propylthiouracil (PROP) were administered to 102 participants. [PROP is commonly used to test for “supertasters.”] Results confirmed past findings that the more sensitive to PROP a participant was the more disgusted they were by visceral, but not moral, disgust elicitors. Importantly, the findings newly revealed that taste sensitivity had no bearing on evaluations of moral transgressions, regardless of their visceral nature, when “angry” was the emotion primed. However, when “grossed-out” was primed for evaluating moral violations, the more intense PROP tasted to a participant the more “grossed-out” they were by all transgressions. Women were generally more disgust sensitive and morally condemning than men, … The present findings support the proposition that moral and visceral disgust do not share a common oral origin, but show that linguistic priming can transform a moral transgression into a viscerally repulsive event and that susceptibility to this priming varies as a function of an individual’s sensitivity to the origins of visceral disgust—bitter taste. [bold mine.]

In other words, supertasters are more easily disgusted, and with verbal priming will transfer that disgust to moral transgressions. (And easily disgusted people tend to be conservatives.)

The Effect of Calorie Information on Consumers’ Food Choice: Sources of Observed Gender Heterogeneity, by Heiman and Lowengart, states:

While previous studies found that inherited taste-blindness to bitter compounds such as PROP may be a risk factor for obesity, this literature has been hotly disputed (Keller et al. 2010).

(Always remember, of course, that a great many social-science studies ultimately do not replicate.)

I’ll let you know if I find anything else.

Quick thoughts on the “replication crisis” and calls to make the field more mathematically rigorous

If you aren’t familiar with the “replication crisis” in social psychology, start here, here, and here.

I consider the courses I took in college on quantitative and qualitative methods the most important of my undergraduate years. I learned thereby a great many important things about how not to conduct an experiment and how to think about experimental methodology (not to mention statistics.)

If I were putting together a list of “general education” requirements I wanted all students to take in order to declare them well-educated and ready to go out into the world, it’d include a course on Quantitative and Qualitative Methods. (Much like current “gen ed” and “distribution requirements,” the level of mathematical ability required would likely vary by field, though no one should be obtaining a college degree without some degree of numerical competence.)

But the real problem with the social science fields is not lack of rigorous statistical background, but overwhelming ideological conformity, enforced by the elders of the fields–advisers, hiring committees, textbook writers, journal editors, etc., who all believe in the same ideology and so have come to see their field as “proving” their ideology.

Ideology drives both the publication biases and the wishful thinking that underlie this crisis. For example, everyone in “Women’s studies” is a feminist who believes that “science” proves that women are oppressed because everyone they know has done studies “proving” it. You’re not going to find a lot of Women’s Studies professors aiming for tenure on the basis of their successful publication of a bunch of studies that failed to find any evidence of bias against women. Findings like that => no publication => no tenure. And besides, feminist professors see it as their moral duty to prove that discrimination exists, not to waste their time on studies that just happened not to be good enough to find the effect.
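To make that publication-filter mechanism concrete, here is a toy Monte Carlo sketch in Python. All the numbers are made up for illustration: many labs test an effect that does not actually exist, only the studies that “find” it at roughly p < .05 get written up, and the resulting literature reports a sizeable effect anyway–which is exactly the sort of finding that later fails to replicate.

```python
# Toy simulation of publication bias: the true effect is zero, but only
# studies that "find" the hypothesized effect at roughly p < .05 get published.
# All numbers here are made up for illustration.
import random
import statistics

random.seed(0)
N_STUDIES, N_PER_GROUP = 500, 20
published_effects = []

for _ in range(N_STUDIES):
    control = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    treated = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]  # no real difference
    diff = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(control) / N_PER_GROUP
          + statistics.variance(treated) / N_PER_GROUP) ** 0.5
    if diff / se > 1.96:                # "significant" and in the hoped-for direction
        published_effects.append(diff)  # this study gets written up and published

print(f"{len(published_effects)} of {N_STUDIES} studies published")
print(f"mean published effect size: {statistics.mean(published_effects):.2f}")
# The published literature reports a nonzero effect even though the true effect
# is zero, and attempts to replicate any one published study will usually fail.
```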

In the Social Sciences more generally, we get this “post modern” mish-mash of everything from Marxists to Freudians to folks who like Foucault and Said, where the goal is to mush up long-winded descriptions of otherwise simple phenomena into endless Chomsky Sentences.

(Just reading the Wikipedia pages on a variety of Social Science oriented topics reveals how very little real research or knowledge is generated in these fields, and how much is based on individual theorists’ personal views. It is often obvious that virtually anyone not long steeped in the academic literature of these fields would not come up with these theories, but with something far more mundane and sensible. Economists, for all their political bias, at least provide a counterpoint to many of these theories.)

Obviously different fields study different aspects of phenomena, but entire fields should not become reduced to trying to prove one political ideology or another. If they are, they should label themselves explicitly, rather than make a pretense of neutrality.

When ideology rather than correctness becomes the standard for publication (not to mention hiring and tenure,) the natural result is incorrectness.

More statistical knowledge is not, by itself, going to resolve the problem. The fields must first recognize that they have an ideological bias problem, and then work to remedy it by letting in and publishing work by researchers outside the social science ideological mainstream. It is very easy to think your ideas sound rigorous when you are only debating with people who already agree with you; it is much more difficult to defend your views against people who disagree, or come from very different intellectual backgrounds.

They could start with–hahahaha–letting in a Republican.