Musical Mystery

Singer Tom Jones, famous recipient of ladies’ panties

There are three categories of superstars who seem to attract excessive female interest. The first is actors, who of course are selected for being abnormally attractive and put into romantic and exciting narratives that our brains subconsciously interpret as real. The second is sports stars and other athletes, whose ritualized combat and displays of strength obviously indicate their genetic “fitness” for siring and providing for children.

The third and strangest category is professional musicians, especially rock stars.

I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?

There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display their musical tastes on dating profiles in order to meet musically like-minded people.

Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as, “Anything but rap,” or “Anything but country”?

Mick Jagger and Chuck Berry

At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids and the aforementioned rap and country fans.

Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along with other slow jazz fans better than with fans of Indian classical music? Or is this all confounded by different ethnic groups identifying with specific musical styles?

Obviously country correlates with Amerikaner ancestry; rap with African American. I’m not sure which ancestry makes up the biggest fans of Die Antwoord. Heavy Metal is popular in Finno-Scandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.

While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.

Here are two excerpts on the subject of music and dance in African societies:

source: A Voyage to Senegal: The Isle of Goreé, and the River Gambia by Michel Adanson, Correspondent of the Royal Academy of Sciences

and:

source: Africana: The Encyclopedia of the African and African American Experience Aardvark-Catholic. Vol. 1

Elvis’s pelvis, considered too sexy for TV

Both of these h/t HBD Chick and my apologies in advance if I got the sources reversed.

One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.

If traditional African dancing involves more variety in rhythmic expression than traditional European, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people choose their own mates, they seem to display a preference for athletic or musically talented (“sexy”) mates; where parents choose the mates, they seem to prefer hard-working, devout “good providers.”

Natasha Rostova and Andrei Bolkonsky, from War and Peace by Tolstoy

Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.

It is undoubtedly true that music stirs the soul and speaks to the heart, but why?

 


On Socialization

As a parent, I spend much of my day attempting to “socialize” my kids–“Don’t hit your brother! Stop jumping on the couch! For the umpteenth time, ‘yeah, right!’ is sarcasm.”

There are a lot of things that don’t come naturally to little kids. Many of them struggle to understand that these wiggly lines on paper can turn into words or that tiny, invisible things on their hands can make them sick.

“Yes, you have to brush your teeth and go to bed, no, I’m not explaining why again.”

And they definitely don’t understand why I won’t let them have ice cream for dinner.

“Don’t ride your bike down the hill and into the street like that! You could get hit by a car and DIE!”

Despite all of the effort I have devoted to transforming this wiggly bunch of feral children into respectable adults (someday, I hope,) I have never found myself concerned with the task of teaching them about gender. As a practical matter, whether the children behave like “girls” or “boys” makes little difference to the running of the household, because we have both–by contrast, whether the children put their dishes away after meals and do their homework without me having to threaten or cajole them makes a big difference.

Honestly, I can’t convince them not to pick their noses in public or that broccoli is tasty, but I’m supposed to somehow subtly convince them that they’ve got to play Minecraft because they’re boys (even while explicitly saying, “Hey, you’ve been playing that for two hours, go ride your bike,”) or that they’re supposed to be walking doormats because they’re girls (even while saying, “Next time he pushes you, push him back!”)

And yet the boys still act like boys, the girls like girls–statistically speaking.

“Ah,” I hear some of you saying, “But you are just one parent! How do you know there aren’t legions of other parents who are out there doing everything they can to ensure that their sons succeed and daughters fail in life?”

This is, if you will excuse me, a very strange objection. What parent desires failure from their children?

I read a book and it’s Friday: Homicide, by Daly and Wilson

Today’s selection, Homicide, is ev psych with a side of anthropology; I am excerpting the chapter on people-who-murder-children. (You are officially forewarned.)

Way back in middle school, I happened across (I forget how) my first university-level textbook, on historical European families and family law. I got through the chapter on infanticide before giving up, horrified that enough Germans were smushing their infants under mattresses or tossing them into the family hearth that the Holy Roman Empire needed laws specifically on the subject.

It was a disillusioning moment.

Daly and Wilson’s Homicide, 1988, contributes some (slightly) more recent data to the subject (though of course it would be nice to have even more recent data).

[Charts omitted: numbers of homicides per year by the victim’s age.]

(I think some of the oddities in # of incidents per year may be due to ages being estimated when the child’s true age isn’t known, eg, “headless torso of a boy about 6 years old found floating in the Thames.”)

We begin with a conversation on the subject of which child parents would favor in an emergency:

If parental motives are such as to promote the parent’s own fitness, then we should expect that parents will often be inclined to act so that neither sibling’s interests prevail completely. Typically, parental imposition of equity will involve supporting the younger, weaker competitor, even when the parent would favor the older if forced to choose between the two. It is this latter sort of situation–“Which do you save when one must be sacrificed?”–in which parents’ differential valuation of their children really comes to the fore. Recall that there were 11 societies in the ethnographic review of Chapter 3 for which it was reported that a newborn might be killed if the birth interval were too short or the brood too numerous. It should come as no surprise that there were no societies in which the prescribed solution to such a dilemma was said to be the death of an older child. … this reaction merely illustrates that one takes for granted the phenomenon under discussion, namely the gradual deepening of parental commitment and love.

*Thinks about question for a while* *flails* “BUT MY CHILDREN ARE ALL WONDERFUL HOW COULD I CHOOSE?” *flails some more*

That said, I think there’s an alternative possibility besides just affection growing over time: the eldest child has already proven their ability to survive; an infant has not. The harsher the conditions of life (and thus, the more likelihood of actually facing a real situation in which you genuinely don’t have enough food for all of your children,) the higher the infant mortality rate. The eldest children have already run the infant mortality gauntlet and so are reasonably likely to make it to adulthood; the infants still stand a high chance of dying. Sacrificing the child you know is healthy and strong for the one with a high chance of dying is just stupid.

Infant mortality, however, is not one of my personal concerns.

Figure 4.4 shows that the risk of parental homicide is indeed a declining function of the child’s age. As we would anticipate, the most dramatic decrease occurs between infants and 1-year-old children. One reason for expecting this is that the lion’s share of the prepubertal increase in reproductive value in natural environments occurs within the first year.

(I think “prepubertal increase in reproductive value” means “decreased likelihood of dying.”)

Moreover, if parental disinclination reflects any sort of assessment of the child’s quality or the mother’s situation, then an evolved assessment mechanism should be such as to terminate any hopeless reproductive episode as early as possible, rather than to squander parental effort in an enterprise that will eventually be abandoned. … Mothers killed 61 in the first 6 months compared to just 27 in the second 6 months. For fathers, the corresponding numbers are 24 vs. 14. [See figure 4.4] … This pattern of victimization contrasts dramatically with the risk of homicide at the hands of nonrelatives (Figure 4.5)…

I would like to propose an alternative possibility: just as a child who attempts to drive a car is much more likely to crash immediately than to successfully navigate onto the highway and then crash, so a murderous person who gets their hands onto a child is more likely to kill it immediately than to wait a few years.

A similar mechanism may be at play in the apparent increase and then decrease in homicides of children by nonrelatives during toddlerhood. Without knowing anything about these cases, I can only speculate, but 1-4 are the ages when children are most commonly put into daycares or left with sitters while their moms return to work. The homicidally-minded among these caretakers, then, are likely to kill their charges sooner rather than later. (School-aged children, by contrast, are both better at running away from attackers and highly unlikely to be killed by their teachers.)

Teenagers are highly conflictual creatures, and the rate at which nonrelatives kill them explodes after puberty. When we consider the conspicuous, tempestuous conflicts that occur between teenagers and their parents–conflicts that apparently dwarf those of the preadolescent period–it is all the more remarkable that the risk of parental homicide continues its relentless decline to near zero.

… When mothers killed infants, the victims had been born to them at a mean age of 22.7 years, whereas older victims had been born at a mean maternal age of 24.5. This is a significant difference, but both means are significantly below the 25.8 years that was the average age of all new Canadian mothers during the same period, according to Canadian Vital Statistics.

In other words, impulsive fuckups who get accidentally pregnant are likely to be violent impulsive fuckups.

We find a similar result with respect to marital status: Mothers who killed older children are again intermediate between infanticidal women and the population-at-large. Whereas 51% of mothers committing infanticide were unmarried, the same was true of just 34% of those killing older children. This is still substantially above the 12% of Canadian births in which the new mother was unmarried …

Killing of an older child is often associated with maternal depression. Of the 95 mothers who killed a child beyond its infancy, 15.8% also committed suicide. … By contrast, only 2 of 88 infanticidal mothers committed suicide (and even this meager 2.3% probably overestimates the association of infanticide with suicide, since infanticides are the only category of homicides in which a significant incidence of undetected cases is likely.) … one of these 2 killed three older children as well.

Anyone else thinking of Andrea Yates and her idiot husband?

In the Canadian data, it is also noteworthy that 35% of maternal infanticides were attributed by the investigating police force … [as] “mentally ill or mentally retarded (insane),” versus 58% of maternal homicides of older children. Here and elsewhere, it seems that the sorts of cases that are simultaneously rare and seemingly contrary to the actor’s interests–in both the Darwinian and the commonsense meaning of interest–also happen to be the sorts of cases most likely to be attributed to some sort of mental incompetence. … We identify as mad those people who lack a species-typical nepotistic perception of their interests or who no longer care to pursue them. …

Violent people go ahead and kill their kids; people who go crazy later kill theirs later.

We do at least know the ages of the 38 men who killed their infant children: the mean was 26.3 years. Moreover, we know that fathers averaged 4 years older than mothers for that substantial majority of Canadian births that occurred within marriages… . Since the mean age for all new Canadian mothers during the relevant period… was 25.8, it seems clear that infanticidal fathers are indeed relatively young. And as was the case with mothers, infanticidal fathers were significantly younger than those fathers who killed older offspring. (mean age at the victim’s birth = 29.2 years). …

As with mothers, fathers who killed older children killed themselves as well significantly more often (43.6% of 101) than did those who killed their infant children (10.5% of 38). Also like mothers is the fact that those infanticidal fathers who did commit suicide were significantly older (mean age = 30.5 years) than those who did not (mean = 25.8). Likewise, the paternal age at which older victims had been born was also significantly greater for suicidal (mean = 31.1 years; N = 71) than for nonsuicidal (mean =27.5; N = 67) homicidal fathers. And men who killed their older children were a little more likely to be deemed mentally incompetent (20.8%) than those who killed their infants (15.8%). …

Fathers, however, were significantly less likely to commit suicide after killing an adult offspring (19% of 21 men) than a child (50% of 80 men.) … 20 of the 22 adult victims of their father were sons… three of the four adult victims of mothers were daughters. … There is no hint of such a same-sex bias in the killings of either infants… or older children. …

An infrequent but regular variety of homicide is that in which a man destroys his wife and children. A corresponding act of familicide by the wife is almost unheard of. …

No big surprises in this section.

Perhaps the most obvious prediction from a Darwinian view of parental motives is this: Substitute parents will generally tend to care less profoundly for their children than natural parents, with the result that children reared by people other than their natural parents will be more often exploited and otherwise at risk. Parental investment is a precious resource, and selection must favor those parental psyches that do not squander it on nonrelatives.

Disclaimer: obviously there are good stepparents who care deeply for their stepchildren. I’ve known quite a few. But I’ve also met some horrible stepparents. Given the inherent vulnerability of children, I find distasteful our society’s pushing of stepparenting as normal without cautions against its dangers. In most cases, remarriage seems to be undertaken to satisfy the parent, not the child.

In an interview study of stepparents in Cleveland, Ohio, for example–a study of a predominantly middle-class group suffering no particular distress or dysfunction–Loise Duberman (1975) found that only 53% of stepfathers and 25% of stepmothers could claim to have “parental feeling” toward their stepchildren, and still fewer to “love” them.

Some of this may be influenced by the kinds of people who are likely to become stepparents–people with strong family instincts probably have better luck getting married to people like themselves and staying that way than people who are bad at relationships.

In an observational study of Trinidadian villagers, Mark Flinn (1988) found that stepfathers interacted less with “their” children than did natural fathers; that interactions were more likely to be aggressive within steprelationships than within the corresponding natural relationships; and that stepchildren left home at an earlier age.

Pop psychology and how-to manuals for stepfamilies have become a growth industry. Serious study of “reconstituted” families is also burgeoning. Virtually all of this literature is dominated by a single theme: coping with the antagonisms…

Here the authors stop to differentiate between stepparenting and adoption, which they suspect is more functional due to adoptive parents actually wanting to be parents in the first place. However,

such children have sometimes been found to suffer when natural children are subsequently born to the adopting couple, a result that has led some professionals to counsel against adoption by childless couples until infertility is definitely established. …

Continuing on with stepparents:

The negative characterization of stepparents is by no means peculiar to our culture. … From Eskimos to Indonesians, through dozens of tales, the stepparent is the villain of every piece. … We have already encountered the Tikopia or Yanomamo husband who demands the death of his new wife’s prior children. Other solutions have included leaving the children with postmenopausal matrilineal relatives, and the levirate, a wide-spread custom by which a widow and her children are inherited by the dead man’s brother or other near relative. …

Social scientists have turned this scenario on its head. The difficulties attending steprelationships–insofar as they are acknowledged at all–are presumed to be caused by the “myth of the cruel stepparent” and the child’s fears.

See: Freud.

Why this bizarre counterintuitive view is the conventional wisdom would be a topic for a longer book than this; suffice to say that the answer surely has more to do with ideology than with evidence. In any event, social scientists have staunchly ignored the question of the factual basis for the negative “stereotyping” of stepparents.

Under Freud’s logic, all sorts of people who’d been genuinely hurt by others were summarily dismissed, told that they were the ones who actually harbored ill-will against others and were just “projecting” their emotions onto their desired victims.

Freudianism is a crock of shit, but in this case, it helped social “reformers” (who of course don’t believe in silly ideas like evolution) discredit people’s perfectly reasonable fears in order to push the notion that “family” doesn’t need to follow traditional (ie, biological) forms, but can be reinvented in all sorts of novel ways.

So are children at risk in stepparent homes in contemporary North America? [see Figures 4.7 and 4.8.] … There is … no appreciable statistical confounding between steprelationships and poverty in North America. … Stepparenthood per se remains the single most powerful risk factor for child abuse that has yet been identified. (here and throughout this discussion “stepparents” include both legal and common-law spouses of the natural parent.) …

Speaking of Figures 4.7 and 4.8, I must say that the kinds of people who get divorced (or were never married) and remarried within a year of their kid’s birth are likely to be unstable people who tend to pick particularly bad partners, and the kinds of people willing to enter into a relationship with someone who has a newborn are also likely to be, well, unusual. Apparently homicidal.

By contrast, the people who are willing to marry someone who already has, say, a ten year old, may be relatively normal folks.

Just how great an elevation of risk are we talking about? Our efforts to answer that question have been bedeviled by a lack of good information in the living arrangements of children in the general population. … there are no official statistics [as of when this was written] on the numbers of children of each age who live in each household type. There is no question that the 43% of murdered American child abuse victims who dwelt with substitute parents is far more than would be expected by chance, but estimates of that expected percentage can only be derived from surveys that were designed to answer other questions. For a random sample of American children in 1976, … the best available national survey… indicates that only about 1% or fewer would be expected to have dwelt with a substitute parent. An American child living with one or more substitute parents in 1976 was therefore approximately 100 times as likely to be fatally abused as a child living with natural parents only…
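To make the arithmetic behind that “approximately 100 times” figure explicit, here is a rough relative-risk sketch using only the two quoted percentages. This is my own back-of-the-envelope reconstruction, not the authors’ calculation; the baseline is quoted only as “about 1% or fewer,” so the result is approximate:

```python
# Rough sketch (my own, not Daly & Wilson's) of the relative-risk arithmetic
# behind the "approximately 100 times" figure quoted above.
# Quoted inputs: ~43% of fatally abused American children (1976) lived with a
# substitute parent, while only "about 1% or fewer" of all children did.

victim_share_step = 0.43   # share of fatal-abuse victims living with a substitute parent
child_share_step = 0.01    # share of all children living with a substitute parent (upper bound)

# Risk ratio = (victims per child in substitute-parent homes) /
#              (victims per child in natural-parent homes);
# the total number of victims cancels out, so the shares are enough.
risk_ratio = (victim_share_step / child_share_step) / (
    (1 - victim_share_step) / (1 - child_share_step)
)

print(round(risk_ratio))  # ~75 with a 1% baseline; nearer 100 if the true share is ~0.75%
```

The huge ratio is driven almost entirely by how rare substitute-parent households were among young children at the time, which is why the estimate is so sensitive to that baseline survey figure.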

Results for Canada are similar. In Hamilton, Ontario in 1983, for example, 16% of child abuse victims under 5 years of age lived with a natural parent and a stepparent… Since small children very rarely have stepparents–less than 1% of preschoolers in Hamilton in 1983, for example–that 16% represents forty times the abuse rate for children of the same age living with natural parents. … 147 Canadian children between the ages of 1 and 4 were killed by someone in loco parentis between 1974 and 1983; 37 of those children (25.2%) were the victims of their stepparents, and another 5 (3.4%) were killed by unrelated foster parents.

…The survey shows, for example, that 0.4% of 2,852 Canadian children, aged 1-4 in 1984, lived with a stepparent. … For the youngest age group in Figure 4.9, those 2 years of age and younger, the risk from a stepparent is approximately 70 times that from a natural parent (even though the latter category includes all infanticides by natural mothers.)
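The same arithmetic, applied to the Hamilton figures quoted above (16% of abuse victims under 5 living with a stepparent, versus “less than 1%” of preschoolers), shows how sensitive the “forty times” estimate is to exactly where that under-1% baseline sits. Again, this is my own illustrative sketch, not the book’s calculation:

```python
# Same back-of-the-envelope risk ratio, using the quoted Hamilton, Ontario figures:
# 16% of abuse victims under 5 lived with a stepparent, versus "less than 1%" of
# preschoolers overall. The exact baseline isn't quoted, so try a few plausible values.

victim_share_step = 0.16

for child_share_step in (0.010, 0.005, 0.004):
    ratio = (victim_share_step / child_share_step) / (
        (1 - victim_share_step) / (1 - child_share_step)
    )
    print(f"baseline {child_share_step:.1%}: ~{ratio:.0f}x")

# baseline 1.0%: ~19x
# baseline 0.5%: ~38x   <- roughly the "forty times" cited
# baseline 0.4%: ~47x
```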

Now we need updated data. I wonder if abortion has had any effect on the rates of infanticide and if increased public acceptance of stepfamilies has led to more abused children or higher quality people being willing to become stepparents.

Why do people watch so much TV?

Honestly, left to my own devices, I wouldn’t own a TV. (With Mythbusters canceled, what’s the point anymore?)

Don’t get me wrong. I have watched (and even enjoyed) the occasional sitcom. I’ve even tried watching football. I like comedies. They’re funny. But after they end, I get that creeping feeling of emptiness inside, like when you’ve eaten a bowl of leftover Halloween candy instead of lunch. There is no “meat” to these programs–or vegan-friendly vegetable protein, if you prefer.

I do enjoy documentaries, though I often end up fast-forwarding through large chunks of them because they are full of filler shots of rotating galaxies or astronomers parking their telescopes or people… taalkiiing… sooo… sloooowwwwlllly… And sadly, if you’ve seen one documentary about ancient Egypt, you’ve seen them all.

Ultimately, time is a big factor: I am always running short. Once I’m done with the non-negotiables (like “take care of the kids” and “pay the bills,”) there’s only so much time left, and time spent watching TV is time not spent writing. Since becoming a competent writer is one of my personal goals, TV gets punted to the bottom of the list, slightly below doing the dishes.

Obviously not everyone writes, but I have a dozen other backup projects for when I’m not writing, everything from “read more books” to “volunteer” to “exercise.”

I think it is a common fallacy to default to assuming that other people are like oneself. I default to assuming that other people are time-crunched, running on 8 shots of espresso and trying to cram in a little time to read Tolstoy and get the tomatoes planted before they fall asleep. (And I’m not even one of those Type-A people.)

Obviously everyone isn’t like me. They come home from work, take care of their kids, make dinner, and flip on the TV.

Why?

An acquaintance recently made a sad but illuminating comment regarding their favorite TV shows, “I know they’re not real, but it feels like they are. It’s like they’re my friends.”

I think the simple answer is that we process the pictures on the TV as though they were real. TV people look like people and sound like people, so who cares if they don’t smell like people? Under normal (pre-TV) circumstances, if you hung out with some friendly, laughing people every day in your living room, they were your family. You liked them, they liked you, and you were happy together.

Today, in our atomized world of single parents, only children, spinsters and eternal bachelors, what families do we have? Sure, we see endless quantities of people on our way to work, but we barely speak, nod, or glance at each other, encapsulated within our own cars or occupied with checking Facebook on our cellphones while the train rumbles on.

As our connections to other people have withered away, we’ve replaced them with fake ones.

Google “America’s Favorite Family“:

OZZIE & HARRIET: The Adventures of America’s Favorite Family

The Adventures of Ozzie and Harriet was the first and longest-running family situational comedy in television history. The Nelsons came to represent the idealized American family of the 1950s – where mom was a content homemaker, dad’s biggest decision was whether to give his sons the keys to the car, and the boys’ biggest problem was getting a date to the high school prom. …When it premiered, Ozzie & Harriet: The Adventures of America’s Favorite Family was the highest-rated documentary in A&E’s history.

(According to Wikipedia, Ozzie and Harriet started on the radio back in the 30s, got a comedy show (still on radio) in 1944, and were on TV from 1952-1966.) It was, to some extent, about a real family–the actors in the show were an actual husband and wife + their kids, but the show itself was fictionalized.

It even makes sense to people to ask them, “Who is your favorite TV personality?”–to which the most common answer isn’t Adam Savage or Jamie Hyneman, but Mark Harmon, who plays some made-up guy named Leroy Jethro Gibbs.

The rise of “reality TV” only makes the “people want to think of the TV people as real people they’re actually hanging out with” phenomenon all the more palpable–and then there’s the incessant newsstand harping of celebrity gossip. The only thing I want out of a movie star (besides talent) is that I not recognize them; it appears that the only thing everyone else wants is that they do recognize them.

According to The Way of the Blockbuster: In entertainment, big bets on likely winners rule:

in Blockbusters: Hit-Making, Risk-Taking, and the Big Business of Entertainment, the new book by Anita Elberse, Filene professor of business administration. Elberse (el-BER-see) spent 10 years interviewing and observing film, television, publishing, and sports executives to distill the most profitable strategy for these high-profile, unpredictable marketplaces. … The most profitable business strategy, she says, is not the “long tail,” but its converse: blockbusters like Star Wars, Avatar, Friends, the Harry Potter series, and sports superstars like Tom Brady.

Strategically, the blockbuster approach involves “making disproportionately big investments in a few products designed to appeal to mass audiences,” … “Production value” means star actors and special effects. … a studio can afford only a few “event movies” per year. But Horn’s big bets for Warner Brothers—the Harry Potter series, The Dark Knight, The Hangover and its sequel, Ocean’s Eleven and its two sequels, Sherlock Holmes—drew huge audiences. By 2011, Warner became the first movie studio to surpass $1 billion in domestic box-office receipts for 11 consecutive years. …

Jeff Zucker ’86 put a contrasting plan into place as CEO at NBC Universal. In 2007 he led a push to cut the television network’s programming costs: … Silverman began cutting back on expensive dramatic content, instead acquiring rights to more reasonably priced properties; eschewing star actors and prominent TV producers, who commanded hefty fees; and authorizing fewer costly pilots for new series. The result was that by 2010, NBC was no longer the top-rated TV network, but had fallen to fourth place behind ABC, CBS, and Fox, and “was farther behind on all the metrics that mattered,” writes Elberse, “including, by all accounts, the profit margins Zucker and Silverman had sought most.” Zucker was asked to leave his job in 2010. …

From a business perspective, “bankable” movie stars like Julia Roberts, Johnny Depp, or George Clooney function in much the way Harry Potter and Superman do: providing a known, well-liked persona.

So people like seeing familiar faces in their movies (except Oprah Winfrey, who is apparently not a draw:

the 1998 film Beloved, starring Oprah Winfrey, based on Nobel Prize-winner Toni Morrison’s eponymous 1987 novel and directed by Oscar-winner Jonathan Demme … flopped resoundingly: produced for $80 million, it sold only $23 million in tickets.

Or maybe Beloved just isn’t the kind of feel-good action flick that drives movie audiences the way Batman is.)

But what about sports?

Here I am on even shakier ground than sitcoms. I can understand playing sports–they’re live action versions of video games, after all. You get to move around, exercise, have fun with your friends, and triumphantly beat them at something. (Or if you’re me, lose.) I can understand cheering for your kids and being proud of them as they get better and better at some athletic skill (or at least try hard at it.)

I don’t understand caring about strangers playing a game.

I have no friends on the Yankees or the Mets, the Phillies or the Marlins. I’ve never met a member of the Alabama Crimson Tide or the Clemson Tigers, and I harbor no illusions that my children will ever play on such teams. I feel no loyalty to the athletes-drawn-from-all-over-the-country who play on my “hometown” team, and I consider athlete salaries vaguely obscene.

I find televised sports about as interesting as watching someone do math. If the point of the game is to win, then why not just watch a 5-minute summary at the end of the day of all the teams’ wins and losses?

But according to The Way of the Blockbuster:

Perhaps no entertainment realm takes greater care in building a brand name than professional sports: fan loyalty reliably builds repeat business. “The NFL is blockbuster content,” Elberse says. “It’s the most sought-after content we have in this country. Four of the five highest-rated television shows [in the United States] ever are Super Bowls. NFL fans spend an average of 9.5 hours per week on games and related content. That gives the league enormous power when it comes to negotiating contracts with television networks.”

Holy shit. No wonder Borders went under.

Elberse has studied American football and basketball and European soccer, and found that selling pro sports has much in common with selling movies, TV shows, or books. Look at the Real Madrid soccer club—the world’s richest, with annual revenues of $693 million and a valuation of $3.3 billion. Like Hollywood studios, Real Madrid attracts fan interest by engaging superstars—such as Cristiano Ronaldo, the Portuguese forward the club acquired from Manchester United for a record $131.6 million in 2009. “We think of ourselves as content producers,” a Real Madrid executive told Elberse, “and we think of our product—the match—as a movie.” As she puts it: “It might not have Tom Cruise in it, but they do have Cristiano Ronaldo starring.”

In America, sports stars are famous enough that even I know some of their names, like Peyton Manning, Serena Williams, and Michael Jordan.

I think the basic drive behind people’s love of TV sports is the same as their love of sitcoms (and dramas): they process it as real. And not just real, but as people they know: their family, their tribe. Those are their boys out there, battling for glory and victory against that other tribe’s boys. It’s vicarious warfare with pseudo-armies, a domesticated expression of the tribal urge to slaughter your enemies, drive off their cattle, and abduct their women. So what if the army isn’t “real,” if the heroes aren’t your brother or cousin but paid gladiators shipped in from thousands of miles away to perform for the masses? Your brain still interprets it as though it were; you still enjoy it.

Football is man-fiction.

Is Capitalism the only reason to care about Intelligence? pt 2

Continuing with yesterday’s discussion (in response to a reader’s question):

  1. Why are people snobs about intelligence?
  2. Is math ability better than verbal?
  3. Do people only care about intelligence in the context of making money?

1. People are snobs. Not all of them, obviously–just a lot of them.

So we’re going to have to back this up a step and ask why are people snobs, period.

Paying attention to social status–both one’s own and others’–is probably instinctual. We process social status in our prefrontal cortexes–the part of our brain generally involved in complex thought, imagination, long-term planning, personality, not being a psychopath, etc. Our brains respond positively to images of high-status items–activating reward-feedback loops that make us feel good–and negatively to images of low-status items–activating feedback loops that make us feel bad.

The mental effect is stronger when we perform high-status actions in front of others:

…researchers asked a person if the following statement was an accurate description of themselves: “I wouldn’t hesitate to go out of my way to help someone in trouble.” Some of the participants answered the question without anyone else seeing their response. Others knowingly revealed their answer to two strangers who were watching in a room next to them via video feed. The result? When the test subjects revealed an affirmative answer to an audience, their [medial prefrontal cortexes] lit up more strongly than when they kept their answers to themselves. Furthermore, when the participants revealed their positive answers not to strangers, but to those they personally held in high regard, their MPFCs and reward striatums activated even more strongly. This confirms something you’ve assuredly noticed in your own life: while we generally care about the opinions of others, we particularly care about the opinions of people who really matter to us.

(Note what constitutes a high-status activity.)

But this alone does not prove that paying attention to social status is instinctual. After all, I can also point to the part of your brain that processes written words (the Visual Word Form Area,) and yet I don’t assert that literacy is an instinct. For that matter, anything we think about has to be processed in our brains somewhere, whether instinct or not.

Better evidence comes from anthropology and zoology. According to Wikipedia, “All societies have a form of social status,” even hunter-gatherers. If something shows up in every single human society, that’s a pretty good sign that it is probably instinctual–and if it isn’t, it is so useful a thing that no society exists without it.

Even animals have social status–“Social status hierarchies have been documented in a wide range of animals: apes,[7] baboons,[8] wolves,[9] cows/bulls,[10] hens,[11] even fish,[12] and ants.[13]” We may also add horses, many monkey species, elephants, killer whales, reindeer, and probably just about all animals that live in large groups.

Among animals, social status is generally determined by a combination of physical dominance, age,  relationship, and intelligence. Killer whale pods, for example, are led by the eldest female in the family; leadership in elephant herds is passed down from a deceased matriarch to her eldest daughter, even if the matriarch has surviving sisters. Male lions assert dominance by being larger and stronger than other lions.

In all of these cases, the social structure exists because it benefits the group, even if it harms some of the individuals in it. If having no social structure were beneficial for wolves, then wolf packs without alpha wolves would out-compete packs with alphas. This is the essence of natural selection.

Among humans, social status comes in two main forms, which I will call “earned” and “background.”

“Earned” social status stems from things you do, like rescuing people from burning buildings, inventing quantum physics, or stealing wallets. High status activities are generally things that benefit others, and low-status activities are generally those that harm others. This is why teachers are praised and thieves are put in prison.

Earned social status is a good thing, because it rewards people for being helpful.

“Background” social status is basically stuff you were born into or have no control over, like your race, gender, the part of the country you grew up in, your accent, name, family reputation, health/disability, etc.

Americans generally believe that you should not judge people based on background social status, but they do it, anyway.

Interestingly, high-status people are not generally violent. (Just compare crime rates by neighborhood SES.) Outside of military conquest, violence is the domain of the low-class and those afraid they are slipping in social class, not the high class. Compare Angela Merkel to the average German far-right protester. Obviously the protester would win in a fist-fight, but Merkel is still in charge. High class people go out of their way to donate to charity, do volunteer work, and talk about how much they love refugees. In the traditional societies of the Pacific Northwest, they held potlatches at which they distributed accumulated wealth to their neighbors; in our society, the wealthy donate millions to education. Ideally, in a well-functioning system, status is the thanks rich people get for doing things that benefit the community instead of spending their billions on gold-plated toilets.

You may recall from “Slate Star Codex finds Aristocracy, doesn’t notice,” the quoted descriptions of social status among birds in “Contra Simler on Prestige,” (from Kevin Simler’s Social Status: Down The Rabbit Hole):

The Arabian babbler … spends most of its life in small groups of three to 20 members. These groups lay their eggs in a communal nest and defend a small territory of trees and shrubs that provide much-needed safety from predators.

When it’s living as part of a group, a babbler does fairly well for itself. But babblers who get kicked out of a group have much bleaker prospects. These “non-territorials” are typically badgered away from other territories and forced out into the open, where they often fall prey to hawks, falcons, and other raptors. So it really pays to be part of a group. … Within a group, babblers assort themselves into a linear and fairly rigid dominance hierarchy, i.e., a pecking order. When push comes to shove, adult males always dominate adult females — but mostly males compete with males and females with females. Very occasionally, an intense “all-out” fight will erupt between two babblers of adjacent rank, typically the two highest-ranked males or the two highest-ranked females. …

Most of the time, however, babblers get along pretty well with each other. In fact, they spend a lot of effort actively helping one another and taking risks for the benefit of the group. They’ll often donate food to other group members, for example, or to the communal nestlings. They’ll also attack foreign babblers and predators who have intruded on the group’s territory, assuming personal risk in an effort to keep others safe. One particularly helpful activity is “guard duty,” in which one babbler stands sentinel at the top of a tree, watching for predators while the rest of the group scrounges for food. The babbler on guard duty not only foregoes food, but also assumes a greater risk of being preyed upon, e.g., by a hawk or falcon. …

Unlike chickens, who compete to secure more food and better roosting sites for themselves, babblers compete to give food away and to take the worst roosting sites. Each tries to be more helpful than the next. And because it’s a competition, higher-ranked (more dominant) babblers typically win, i.e., by using their dominance to interfere with the helpful activities of lower-ranked babblers. This competition is fiercest between babblers of adjacent rank. So the alpha male, for example, is especially eager to be more helpful than the beta male, but doesn’t compete nearly as much with the gamma male. Similar dynamics occur within the female ranks.

And from Jim’s Blog, “A Lost Military Technology“:

In the eighteenth and early nineteenth century, wealthy private individuals substantially supported the military, with a particular wealthy man buying stuff for a particular regiment or particular fort.

Noblemen paid high prices for military commands, and these posts were no sinecure.  You got the obligation to substantially supply the logistics for your men, the duty to obey stupid orders that would very likely lead to your death, the duty to lead your men from in front while wearing a costume designed to make you particularly conspicuous, and the duty to engage in honorable personal combat, man to man, with your opposite number who was also leading his troops from in front.

A vestige of this tradition remains in that every English prince has been sent to war and has placed himself very much in harm’s way.

It seems obvious to me that a soldier being led by a member of the ruling class who is soaking up the bullets from in front is a lot more likely to be loyal and brave than a soldier sent into battle by distant rulers safely in Washington who despise him as a sexist homophobic racist murderer, that a soldier who sees his commander, a member of the ruling classes, fighting right in front of him, is reflexively likely to fight.

(Note, however, that magnanimity is not the same as niceness. The only people who are nice to everyone are store clerks and waitresses, and they’re only nice because they have to be or they’ll get fired.)

Most people are generally aware of each others’ social statuses, using contextual clues like clothing and accents to make quick, rough estimates. These contextual clues are generally completely neutral–they just happen to correlate with other behaviors.

For example, there is nothing objectively good or bad for society about wearing your pants belted beneath your buttocks, aside from it being an awkward way to wear your pants. But the style correlates with other behaviors, like crime, drug use, aggression, low paternal investment, and unemployment, all of which are detrimental to society, and so the mere sight of underwear spilling out of a man’s pants automatically assigns him low status. There is nothing causal in this relationship–being a criminal does not make you bad at buckling your pants, nor does wearing your pants around your knees somehow inspire you to do drugs. But these things correlate, and humans are very good at learning patterns.

"The New Age Traveler"
The New Age Traveler” can be yours for a mere $26,000!

Likewise, there is nothing objectively better about operas than Disney movies, no real difference between a cup of coffee brewed in the microwave and one from Starbucks; a Harley Davidson and a Vespa are both motorcycles; and you can carry stuff around in just about any bag or backpack, but only the hoity-toity can afford something as objectively hideous as a $26,000 Louis Vuitton backpack.

All of these things are fairly arbitrary and culturally dependent–the way you belt your pants can’t convey social status in a society where people don’t wear pants; your taste in movies couldn’t matter before movies were invented. Among hunter-gatherers, social status is based on things like one’s skills at hunting, and if I showed up to the next PTA meeting wearing a tophat and monocle, I wouldn’t get any status points at all.

We tend to aggregate the different social status markers into three broad classes (middle, upper, and lower.) As Scott Alexander says in his post about Siderea’s essay on class in America, which divides the US into 10% Underclass, 65% Working Class, 23.5% Gentry Class, and 1.5% Elite:

Siderea notes that Church’s analysis independently reached about the same conclusion as Paul Fussell’s famous guide. I’m not entirely sure how you’d judge this (everybody’s going to include lower, middle, and upper classes), but eyeballing Fussell it does look a lot like Church, so let’s grant this.

It also doesn’t sound too different from Marx. Elites sound like capitalists, Gentry like bourgeoisie, Labor like the proletariat, and the Underclass like the lumpenproletariat. Or maybe I’m making up patterns where they don’t exist; why should the class system of 21st century America be the same as that of 19th century industrial Europe?

There’s one more discussion of class I remember being influenced by, and that’s Unqualified Reservations’ Castes of the United States. Another one that you should read but that I’ll summarize in case you don’t:

1. Dalits are the underclass, … 2. Vaisyas are standard middle-class people … 3. Brahmins are very educated people … 4. Optimates are very rich WASPs … now they’re either extinct or endangered, having been pretty much absorbed into the Brahmins. …

Michael Church’s system (henceforth MC) and the Unqualified Reservation system (henceforth UR) are similar in some ways. MC’s Underclass matches Dalits, MC’s Labor matches Vaisyas, MC’s Gentry matches Brahmins, and MC’s Elite matches Optimates. This is a promising start. It’s a fourth independent pair of eyes that’s found the same thing as all the others. (commenters bring up Joel Kotkin and Archdruid Report as similar convergent perspectives).

I suspect the tendency to try to describe society as consisting of three broad classes (with the admission that other, perhaps tiny classes that don’t exactly fit into the others might exist) is actually just an artifact of being a three-biased society that likes to group things in threes (the Trinity, three-beat joke structure, three bears, Three Musketeers, three notes in a chord, etc.) This three-bias isn’t a human universal (or so I have read) but has probably been handed down to us from the Indo-Europeans, (“Many Indo-European societies know a threefold division of priests, a warrior class, and a class of peasants or husbandmen. Georges Dumézil has suggested such a division for Proto-Indo-European society,”) so we’re so used to it that we don’t even notice ourselves doing it.

(For more information on our culture’s three-bias and different number biases in other cultures, see Alan Dundes’s Interpreting Folklore, though I should note that I read it back in highschool and so my memory of it is fuzzy.)

(Also, everyone is probably at least subconsciously cribbing Marx, who was probably cribbing from some earlier guy who cribbed from another earlier guy, who set out with the intention of demonstrating that society–divided into nobles, serfs, and villagers–reflected the Trinity, just like those Medieval maps that show the world divided into three parts or the conception of Heaven, Hell, and Purgatory.)

At any rate, I am skeptical of any system that lumps 65% of people into one social class and 0.5% of people into a different social class as being potentially too finely grained at one end of the scale and not finely grained enough at the other. Determining the exact number of social classes in American society may ultimately be futile–perhaps there really are three (or four) highly distinct groups, or perhaps social classes transition smoothly from one to the next with no sharp divisions.

I lean toward the latter theory, with broad social classes as merely a convenient shorthand for extremely broad generalizations about society. If you look any closer, you tend to find that people do draw finer-grained distinctions between themselves and others than “65% Working Class” would imply. For example, a friend who works in agriculture in Greater Appalachia once referred dismissively to other people they had to deal with as “red necks.” I might not be able to tell what differentiates them, but clearly my friend could. Similarly, I am informed that there are different sorts of homelessness, from true street living to surviving in shelters, and that lifetime homeless people are a different breed altogether. I might call them all “homeless,” but to the homeless, these distinctions are important.

Is social class evil?

This question was suggested by a different friend.

I suspect that social class is basically, for the most part, neutral-to-useful. I base this on the fact that most people do not work very hard to erase markers of class distinction, but instead actively embrace particular class markers. (Besides, you can’t get rid of it, anyway.)

It is not all that hard to learn the norms and values of a different social class and strategically employ them. Black people frequently switch between speaking African American Vernacular English at home and standard English at work; I can discuss religion with Christian conservatives and malevolent AI risk with nerds; you can purchase a Harley Davidson t-shirt as easily as a French beret and scarf.

(I am reminded here of an experiment in which researchers were looking to document cab drivers refusing to pick up black passengers; they found that when the black passengers were dressed nicely, drivers would pick them up, but when they wore “ghetto” clothes, the cabs wouldn’t. Cabbies: responding more to perceived class than race.)

And yet, people don’t–for the most part–mass adopt the social markers of the upper class just to fool them. They love their motorcycle t-shirts, their pumpkin lattes, even their regional accents. Class markers are an important part of peoples’ cultural / tribal identities.

But what about class conflicts?

Because every class has its own norms and values, every class is, to some degree, disagreeing with the other classes. People for whom frugality and thrift are virtues will naturally think that people who drink overpriced coffee are lacking in moral character. People for whom anti-racism is the highest virtue will naturally think that Trump voters are despicable racists. A Southern Baptist sees atheists as morally depraved fetus murderers; nerds and jocks are famously opposed to each other; and people who believe that you should graduate from college, become established in your career, get married, and then have 0-1.5 children disapprove of people who drop out of highschool, have a bunch of children with a bunch of different people, and go on welfare.

A moderate sense of pride in one’s own culture is probably good and healthy, but spending too much energy hating other groups is probably negative–you may end up needlessly hurting people whose cooperation you would have benefited from, reducing everyone’s well-being.

(A good chunk of our political system’s dysfunctions are probably due to some social classes believing that other social classes despise them and are voting against their interests, and so counter-voting to screw over the first social class. I know at least one person who switched allegiance from Hillary to Trump almost entirely to stick it to liberals they think look down on them for classist reasons.)

Ultimately, though, social class is with us whether we like it or not. Even if a full generation of orphan children were raised with no knowledge of their origins and completely equal treatment by society at large, each would end up marrying/associating with people who have personalities similar to themselves (and remember that genetics plays a large role in personality.) Just as current social classes in America are ethnically different, (Southern whites are drawn from different European populations than Northern whites, for example,) so would the society resulting from our orphanage experiment differentiate into genetically and personalityish-similar groups.

Why do Americans generally proclaim their opposition to judging others based on background status, and then act classist, anyway? There are two main reasons.

  1. As already discussed, different classes have real disagreements with each other. Even if I think I shouldn’t judge others, I can’t put aside my moral disgust at certain behaviors just because they happen to correlate with different classes.
  2. It sounds good to say nice, magnanimous things that make you sound more socially sensitive and aware than others, like, “I wouldn’t hesitate to go out of my way to help someone in trouble.” So people like to say these things whether they really mean them or not.

In reality, people are far less magnanimous than they like to claim they are in front of their friends. People like to say that we should help the homeless and save the whales and feed all of the starving children in Africa, but few people actually go out of their way to do such things.

There is a reason Mother Teresa is considered a saint, not an archetype.

In real life, not only does magnanimity have a cost (which the rich can better afford,) but if you don’t live up to your claims, people will notice. If you talk a good talk about loving others but actually mistreat them, people will decide that you’re a hypocrite. On the internet, you can post memes for free without having to back them up with real action, causing discussions to descend into competitive virtue-signalling in which no one wants to be the first person to admit that they actually are occasionally self-interested. (Cory Doctorow has a relevant discussion about how “reputation economies”–especially internet-based ones–can go horribly wrong.)

Unfortunately, people often confuse background and achieved status.

American society officially has no hereditary social classes–no nobility, no professions limited legally to certain ethnicities, no serfs, no Dalits, no castes, etc. Officially, if you can do the job, you are supposed to get it.

Most of us believe, at least abstractly, that you shouldn’t judge or discriminate against others for background status factors they have no control over, like where they were born, the accent they speak with, or their skin tone. If I have two resumes, one from someone named Lakeesha, and the other from someone named Ian William Esquire III, I am supposed to consider each on their merits, rather than the connotations their names invoke.

But because “status” is complicated, people often go beyond advocating against “background” status and also advocate that we shouldn’t accord social status for any reason at all. That is, full social equality.

This is not possible and would be deeply immoral in practice.

When you need heart surgery, you really hope that the guy cutting you open is a top-notch heart surgeon. When you’re flying in an airplane, you hope that both the pilot and the guys who built the plane are highly skilled. Chefs must be good at cooking and authors good at writing.

These are all forms of earned status, and they are good.

Smart people are valuable to society because they do nice things like save you from heart attacks or invent cell-phones. This is not “winning at capitalism;” this is benefiting everyone around them. In this context, I’m happy to let smart people have high status.

In a hunter-gatherer society, smart people are the ones who know the most about where animals live and how to track them, how to get water during a drought, and where that 1-inch stem they spotted last season–the one that means a tasty underground tuber–is located. Among nomads, smart people are the ones with the biggest mental maps of the territory, the folks who know the safest and quickest routes from good summer pasture to good winter pasture, how to save an animal from dying, and how to heal a sick person. Among pre-literate people, smart people composed epic poems that entertained their neighbors for many winters’ nights, and among literate ones, the smart people became scribes and accountants. Even the communists valued smart people, when they weren’t chopping their heads off for being bourgeois scum.

So even if we say, abstractly, “I value all people, no matter how smart they are,” the smart people do more of the stuff that benefits society than the dumb people, which means they end up with higher social status.

So, yes, high IQ is a high social status marker, and low IQ is a low social status marker, and thus at least some people will be snobs about signaling their IQ and their disdain for dumb people.

BUT.

I am speaking here very abstractly. There are plenty of “high status” people who are not benefiting society at all. Plenty of people who use their status to destroy society while simultaneously enriching themselves. And yes, someone can come into a community, strip out all of its resources and leave behind pollution and unemployment, and happily call it “capitalism” and enjoy high status as a result.

I would be very happy if we could stop engaging in competitive holiness spirals and stop lionizing people who became wealthy by destroying communities. I don’t want capitalism at the expense of having a pleasant place to live in.

Part 3 tomorrow.

 

The Neurology of Cross-Cultural Authority? pt 2

As we were discussing yesterday, I theorize that people have neural feedback loops that reward them for conforming/imitating others/obeying authorities and punish them for disobeying/not conforming.

This leads people to obey authorities or go along with groups even when they know, logically, that they shouldn’t.

There are certainly many situations in which we want people to conform even though they don’t want to, like when my kids have to go to bed or buckle their seatbelts–as I said yesterday, the feedback loop exists because it is useful.

But there are plenty of situations where we don’t want people to conform, like when trying to brainstorm new ideas.

Under what conditions will people disobey authority?

As we previously discussed, using technology to create anonymous, a-reputational conversations may allow us to avoid some of the factors that lead to groupthink.

But in person, people may disobey authorities when they have some other social system to fall back on. If disobeying an authority in Society A means I lose social status in Society A, I will be more likely to disobey if I am a member in good standing in Society B.

If I can use my disobedience against Authority A as social leverage to increase my standing in Society B, then I am all the more likely to disobey. A person who can effectively stand up to an authority figure without getting punished must be, our brains reason, a powerful person, an authority in their own right.

Teenagers do this all the time, using their defiance against adults, school, teachers, and society in general to curry higher social status among other teenagers, the people they actually care about impressing.

SJWs do this, too:



I normally consider the president of Princeton an authority figure, and even though I probably disagree with him on far more political matters than these students do, I’d be highly unlikely to be rude to him in real life–especially if I were a student he could get expelled from college.

But if I had an outside audience–Society B–clapping and cheering for me behind the scenes, the urge to obey would be weaker. And if yelling at the President of Princeton could guarantee me high social status, approval, job offers, etc., then there’s a good chance I’d do it.

But then I got to thinking: Are there any circumstances under which these students would have accepted the president’s authority?

Obviously if the man had a proven track record of competently performing a particular skill the students wished to learn, they might follow his example.

Or not.

If authority works via neural feedback loops, employing some form of “mirror neurons,” do these systems activate more strongly when the people we are perceiving look more like ourselves (or like our internalized notion of what people in our “tribe” look like, since mirrors are a recent invention)?

In other words, what would a cross-racial version of the Milgram experiment look like?

Unfortunately, it doesn’t look like anyone has tried it (and to do it properly, it’d need to be a big experiment, involving several “scientists” of different races [so that the study isn’t biased by one “scientist” just being bad at projecting authority] interacting with dozens of students of different races, which would be a rather large undertaking.) I’m also not finding any studies on cross-racial authority (I did find plenty of websites offering practical advice about different groups’ leadership styles,) though I’m sure someone has studied it.

However, I did find cross-racial experiments on empathy, which may involve the same brain systems, and so are suggestive:

From Racial Bias Reduces Empathic Sensorimotor Resonance with Other-Race Pain, by Avenanti et al:

Using transcranial magnetic stimulation, we explored sensorimotor empathic brain responses in black and white individuals who exhibited implicit but not explicit ingroup preference and race-specific autonomic reactivity. We found that observing the pain of ingroup models inhibited the onlookers’ corticospinal system as if they were feeling the pain. Both black and white individuals exhibited empathic reactivity also when viewing the pain of stranger, very unfamiliar, violet-hand models. By contrast, no vicarious mapping of the pain of individuals culturally marked as outgroup members on the basis of their skin color was found. Importantly, group-specific lack of empathic reactivity was higher in the onlookers who exhibited stronger implicit racial bias.

From Taking one’s time in feeling other-race pain: an event-related potential investigation on the time-course of cross-racial empathy, by Sessa et al.:

Using the event-related potential (ERP) approach, we tracked the time-course of white participants’ empathic reactions to white (own-race) and black (other-race) faces displayed in a painful condition (i.e. with a needle penetrating the skin) and in a nonpainful condition (i.e. with Q-tip touching the skin). In a 280–340 ms time-window, neural responses to the pain of own-race individuals under needle penetration conditions were amplified relative to neural responses to the pain of other-race individuals displayed under analogous conditions.

In Seeing is believing: neural mechanisms of action-perception are biased by team membership, Molenberghs et al. write:

In this study, we used functional magnetic resonance imaging (fMRI) to investigate how people perceive the actions of in-group and out-group members, and how their biased view in favor of own team members manifests itself in the brain. We divided participants into two teams and had them judge the relative speeds of hand actions performed by an in-group and an out-group member in a competitive situation. Participants judged hand actions performed by in-group members as being faster than those of out-group members, even when the two actions were performed at physically identical speeds. In an additional fMRI experiment, we showed that, contrary to common belief, such skewed impressions arise from a subtle bias in perception and associated brain activity rather than decision-making processes, and that this bias develops rapidly and involuntarily as a consequence of group affiliation. Our findings suggest that the neural mechanisms that underlie human perception are shaped by social context.

None of these studies shows definitively whether in-group vs. out-group biases are an inherent feature of neurological systems, but Avenanti’s finding that people were more empathetic toward a purple-skinned person than toward a member of a racial out-group suggests that some amount of learning is involved in the process–and that rather than comparing people against one’s in-group, we may be comparing them against our out-group.

At any rate, you may get similar outcomes either way.

In cases where you want to promote group cohesion and obedience, it may be beneficial to sort people by self-identity.

In cases where you want to guard against groupthink, obedience, or conformity, it may be beneficial to mix up the groups. Intellectual diversity is great, but even ethnic diversity may help people resist defaulting to obedience, especially when they know they shouldn’t.

A study by McKinsey and Company suggests that mixed-race companies outperform more homogenous companies:

[Exhibit from McKinsey’s “Diversity Matters” report]

but I can find other studies that suggest the opposite, eg, Women Don’t Mean Business? Gender Penalty in Board Appointments, by Isabelle Solal:

Using data from two panel studies on U.S. firms and an online experiment, we examine investor reactions to increases in board diversity. Contrary to conventional wisdom, we find that appointing female directors has no impact on objective measures of performance, such as ROA, but does result in a systematic decrease in market value.

(Solal argues that investors may perceive the hiring of women–even competent ones–as a sign that the company is pursuing social justice goals instead of money-making goals and dump the stock.)

Additionally, diverse companies may find it difficult to work together toward a common goal–there is a good quantity of evidence that increasing diversity decreases trust and inhibits group cohesion. EG, from The downside of diversity:

IT HAS BECOME increasingly popular to speak of racial and ethnic diversity as a civic strength. From multicultural festivals to pronouncements from political leaders, the message is the same: our differences make us stronger.

But a massive new study, based on detailed interviews of nearly 30,000 people across America, has concluded just the opposite. Harvard political scientist Robert Putnam — famous for “Bowling Alone,” his 2000 book on declining civic engagement — has found that the greater the diversity in a community, the fewer people vote and the less they volunteer, the less they give to charity and work on community projects. In the most diverse communities, neighbors trust one another about half as much as they do in the most homogenous settings. The study, the largest ever on civic engagement in America, found that virtually all measures of civic health are lower in more diverse settings.

As usual, I suspect there is an optimum level of diversity–depending on a group’s purpose and its members’ preferences–that helps minimize groupthink while still preserving most of the benefits of cohesion.
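(As a purely hypothetical illustration of that trade-off–my own toy sketch in Python, not a model taken from any of the studies above–you can picture cohesion falling as diversity rises while resistance to groupthink rises with diminishing returns, which yields an interior optimum rather than “more is always better” or “less is always better.”)

```python
import math

def group_performance(diversity):
    """Toy trade-off (illustration only): cohesion falls as diversity rises,
    while resistance to groupthink rises with diminishing returns.
    'diversity' runs from 0 (fully homogeneous) to 1 (maximally mixed)."""
    cohesion = 1.0 - diversity
    groupthink_resistance = math.sqrt(diversity)
    return cohesion + groupthink_resistance

# In this toy model the best-performing mix is an interior point, not an extreme.
best = max((d / 100 for d in range(101)), key=group_performance)
print(best)  # -> 0.25
```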

The neurology of cross-cultural authority?

So I was thinking the other day about why people go along with others and do things even when they believe (or know) they shouldn’t. As Tolstoy asks, why did the French army go along with this mad idea to invade Russia in 1812? Why did Milgram’s subjects obey his orders to “electrocute” people? Why do I feel emotionally distressed when refusing to do something, even when I have very good reasons to refuse?

As I mentioned ages ago, I suspect that normal people have neural circuits that reward them for imitating others and punish them for failing to imitate. Mirror neurons probably play a critical role in this process, but probably aren’t the complete story.

A mirror neuron is a neuron that fires both when an animal acts and when the animal observes the same action performed by another.[1][2][3] Thus, the neuron “mirrors” the behavior of the other, as though the observer were itself acting. …  In humans, brain activity consistent with that of mirror neurons has been found in the premotor cortex, the supplementary motor area, the primary somatosensory cortex and the inferior parietal cortex.[6] (Wikipedia)

These feedback loops are critical for learning–infants only a few months old begin the process of learning to talk by moving their mouths and making “ba ba” noises in imitation of their parents. (Hence why it is called “babbling.”) They do not consciously say to themselves, “let me try to communicate with the big people by making their noises;” they just automatically move their faces to match the faces you make at them. It’s an instinct.

You probably do this, too. Just watch what happens when one person in a room yawns and then everyone else feels compelled to do it, too. Or if you suddenly turn and look at something behind the group of people you’re with–others will likely turn and look, too.

Autistic infants have trouble with imitation, (and according to Wikipedia, several studies have found abnormalities in their mirror neuron systems, though I suspect the matter is far from settled–among other things, I am not convinced that everyone with an ASD diagnosis actually has the same thing going on.) Nevertheless, there is probably a direct link between autistic infants’ difficulties with imitation and their difficulties learning to talk.

For adults, imitation is less critical (you can, after all, consciously decide to learn a new language,) but still important for survival. If everyone in your village drinks out of one well and avoids the other well, even if no one can explain why, it’s probably a good idea to go along and only drink out of the “good” well. Something pretty bad probably happened to the last guy who drank out of the “bad” well, otherwise the entire village wouldn’t have stopped drinking out of it. If you’re out picking berries with your friends when suddenly one of them runs by yelling “Tiger!” you don’t want to stand there and yell, “Are you sure?” You want to imitate them, and fast.

Highly non-conformist people probably have “defective” or low-functioning feedback loops. They simply feel less compulsion to imitate others–it doesn’t even occur to them to imitate others! These folks might die in interesting ways, but in the meanwhile, they’re good sources for ideas other people just wouldn’t have thought of. I suspect they are concentrated in the arts, though clearly some of them are in programming.

Normal people’s feedback loops kick in when they are not imitating others around them, making them feel embarrassed, awkward, or guilty. When they imitate others, their brains reward them, making them feel happy. This leads people to enjoy a variety of group-based activities, from football games to prayer circles to line dancing to political rallies.

Normal people having fun by synchronizing their bodily movements.

At its extreme, these groups become “mobs,” committing violent acts that many of the folks involved wouldn’t under normal circumstances.

Highly conformist people’s feedback loops are probably over-active, making them feel awkward or uncomfortable while simply observing other people not imitating the group. This discomfort can only be relieved by getting those other people to conform. These folks tend to favor more restrictive social policies and can’t understand why other people would possibly want to do those horrible, non-conforming things.

To reiterate: this feedback system exists because it helped your ancestors survive. It is not people being “sheep;” it is a perfectly sensible approach to learning about the world and avoiding dangers. And different people have stronger or weaker feedback loops, resulting in more or less instinctual desire to go along with and imitate others.
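(To make the “stronger or weaker feedback loops” idea concrete, here is a minimal, purely hypothetical sketch in Python–my own illustration, not an established model–in which the discomfort a person feels scales with how far their behavior deviates from the group’s, weighted by the strength of their personal feedback loop.)

```python
def conformity_discomfort(own_behavior, group_behavior, loop_strength):
    """Toy model (illustration only): discomfort grows with deviation from
    the group's behavior, scaled by the strength of the person's feedback loop."""
    return loop_strength * abs(own_behavior - group_behavior)

group = 1.0  # everyone else is doing the thing (say, standing and clapping)
print(conformity_discomfort(0.0, group, loop_strength=0.1))  # non-conformist: 0.1
print(conformity_discomfort(0.0, group, loop_strength=1.0))  # average person: 1.0
print(conformity_discomfort(0.0, group, loop_strength=3.0))  # highly conformist: 3.0
```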

However, there are times when you shouldn’t imitate others. Times when, in fact, everyone else is wrong.

The Milgram Experiment places the subject in a situation where their instinct to obey the experimenter (an “authority figure”) is in conflict with their rational desire not to harm others (and their instinctual empathizing with the person being “electrocuted.”)

In case you have forgotten the Milgram Experiment, it went like this: an unaware subject is brought into the lab, where he meets the “scientist” and a “student,” who are really in cahoots. The subject is told that he is going to assist with an experiment to see whether administering electric shocks to the “student” will make him learn faster. The “student” also tells the subject, in confidence, that he has a heart condition.

The real experiment is to see if the subject will shock the “student” to death at the “scientist’s” urging.

No actual shocks are administered, but the “student” is a good actor, making out that he is in terrible pain and then suddenly going silent, etc.

Before the experiment, Milgram polled various people, both students and “experts” in psychology, and pretty much everyone agreed that virtually no one would administer all of the shocks, even when pressured by the “scientist.”

In Milgram’s first set of experiments, 65 percent (26 of 40) of experiment participants administered the experiment’s final massive 450-volt shock,[1] though many were very uncomfortable doing so; at some point, every participant paused and questioned the experiment; some said they would refund the money they were paid for participating in the experiment. Throughout the experiment, subjects displayed varying degrees of tension and stress. Subjects were sweating, trembling, stuttering, biting their lips, groaning, digging their fingernails into their skin, and some were even having nervous laughing fits or seizures. (bold mine)

I’m skeptical about the seizures, but the rest sounds about right. Resisting one’s own instinctual desire to obey–or putting the desire to obey in conflict with one’s other desires–creates great emotional discomfort.

To Be Continued.

Why are people so keen on pets?

The Lady with an Ermine

Don’t get me wrong. I like animals; I just don’t like them in my house. Every time I petsit for friends with cats, I am reminded of why I don’t own cats: scooping feces is repulsive (and don’t get me started on Toxoplasma gondii!) Dogs are marginally better, in that the homes of dog owners don’t always smell of feces, but unfortunately they often smell of dog.

For this post, I am defining “pet” as animals that people keep solely for companionship. Animals kept because they do useful things or materially benefit their owners, like seeing eye dogs, egg-laying chickens, mouse-hunting cats, race horses, or dancing bears are not “pets.” Medical “therapy animals” are basically pets. It makes plenty of sense for people to keep around work animals, but pets seem to be kept around simply for the enjoyment of their company.

According to Wikipedia, Americans own approximately 94 million cats, 78 million dogs, 172 million fish, and 45 million small mammals, reptiles, etc. (Though of course some of these are “useful” animals that I wouldn’t count.) This comes out to about 4x as many pets as children, concentrated in 60% of the households (most pet owners have more than one.)

Pets cost quite a bit of money–the average small dog costs about $7,000 to $13,000 over its 14-year lifespan; the average large dog costs $6,000 to $8,000 over its much shorter 8-year lifespan. [source] (Note that small dogs are actually cheaper per year to own; large dogs’ lower lifetime cost is due entirely to their shorter lifespans.) Cats cost about the same as dogs–people don’t spend much on “outdoor” cats, but “indoor” cats cost about $9,000 to $11,000 over their lifetimes.

Just making some rough estimates, I’d say it looks like people spend about $700 per year per dog or cat, which comes out to about 120 billion dollars per year. That’s a lot of money! (And this doesn’t count the expenses incurred by shelters and animal control agencies to take care of the excess pets people don’t want.)
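(For readers who want the arithmetic spelled out, here is a back-of-the-envelope sketch in Python. The $700-per-animal-per-year figure is my own rough average derived from the lifetime costs above, not a number from the source.)

```python
dogs = 78_000_000   # approximate US pet dogs, per the Wikipedia figures above
cats = 94_000_000   # approximate US pet cats, per the Wikipedia figures above
cost_per_animal_per_year = 700  # rough average implied by the lifetime costs above

total_spending = (dogs + cats) * cost_per_animal_per_year
print(f"${total_spending / 1e9:.0f} billion per year")  # -> $120 billion per year
```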

Americans are probably exceptional in the number of pets they have. According to Wikipedia, 46% of the world’s pet dog population lives in the US. (By contrast, only 4.4% of the world’s human population lives in the US.) The ratio gets even more skewed if we break it down by race–63% of America’s whites own pets, versus only 49% of the non-whites. [source]

However, other countries similar to the US don’t seem as keen on pets: the %pets/%people ratio for the US is 10.5, for Canada 7.5, and for Britain, 5.8. This might have to do with factors like Britain being a more crowded country where people have less space for pets, or with the Wikipedia data being inaccurate. Either way, I think it’s safe to say that pets are very characteristically American, and especially a white American thing.
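(The %pets/%people ratio is just one share divided by the other; here is a minimal sketch of the US number, using the figures above. The Canadian and British ratios would be computed the same way from their own shares, which I haven’t reproduced here.)

```python
us_share_of_world_pet_dogs = 46.0   # percent, per the Wikipedia figure above
us_share_of_world_population = 4.4  # percent, per the figure above

print(round(us_share_of_world_pet_dogs / us_share_of_world_population, 1))  # -> 10.5
```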

One theory about why people own so many pets is that they’re substitute children/companions/friends for lonely people who don’t have kids/spouses/friends, perhaps as a side effect of our highly atomized culture. I came into this post expecting to confirm this, but it looks like Crazy Cat Ladies are actually a relatively small percent of the overall pet-owning population.

According to Gallup, 50% of married people own a dog, and 33% own a cat (some people own both.) By contrast, only 37% of unmarried people own dogs and only 25% own cats. People with children under 18 are more likely to own pets than people without. And people from the “East” are less likely to own pets than people from the “West.” (Interestingly, “westerners” are disproportionately more likely to own cats.)

So it looks to me like most pet ownership is actually motivated by the idea that kids should have pets, with pets more common in suburban or rural areas where they have more room to run around. This is probably particularly so for cats, who are probably more likely to be “outdoor” pets or mouse-catching farm cats in rural areas (ie, the “West.”)

There is an extensive belief–perhaps folk belief–that pet ownership is good for people. Gallup found that 60% of people believe that pet owners lead more satisfying lives than non-pet owners; numerous studies claim that pet ownership–or even just occasional interaction–makes people healthier. There even exists an “animal therapy” industry. Unfortunately, the studies on the subject look rather unreliable–the ones about pet ownership are confounded by healthier people being more likely to have pets in the first place, for example.

And yet, there’s something about the notion that I find appealing; something about playing with happy puppies or petting a bunny that I find downright pleasant. Maybe it’s something as simple as animals being nice and therefore making people happy.

It’s getting late, so I’ll continue this tomorrow.

Homeostasis, personality, and life (part 2)

Warning: This post may get a little fuzzy, due to discussion of things like personality, psychology, and philosophy.

Yesterday we discussed homeostatic systems for normal organism/organization maintenance and defense, as well as pathological malfunctions of over or under-response from the homeostatic systems.

But humans are not mere action-reaction systems; they have qualia, an inner experience of being.

One of my themes here is the idea that various psychological traits, like anxiety, guilt, depression, or disgust, might not be just random things we feel, but exist for evolutionary reasons. Each of these emotions, when experienced moderately, may have beneficial effects. Guilt (and its cousin, shame,) helps us maintain our social relationships with other people, aiding in the maintenance of large societies. Disgust protects us from disease and helps direct sexual interest at one’s spouse, rather than random people. Anxiety helps people pay attention to crucial, important details, and mild depression may help people concentrate, stay out of trouble, or–very speculatively–have helped our ancestors hibernate during the winter.

In excess, each of these traits is damaging, but a shortage of each trait may also be harmful.

I have commented before on the remarkable statistic that 25% of women are on anti-depressants, and if we exclude women over 60 (and below 20,) the number of women with an “anxiety disorder” jumps to over 30%.

The idea that a full quarter of us are actually mentally ill is simply staggering. I see three potential causes for the statistic:

  1. Doctors prescribe anti-depressants willy-nilly to everyone who asks, whether they’re actually depressed or not;
  2. Something about modern life is making people especially depressed and anxious;
  3. Mental illnesses are side effects of common, beneficial conditions (similar to how sickle cell anemia is a side effect of protection from malaria.)

As you probably already know, sickle cell anemia is a genetic mutation that protects carriers from malaria. Imagine a population where 100% of people are sickle cell carriers–that is, they have one mutated gene, and one regular gene. The next generation in this population will be roughly 25% people who have two regular genes (and so die of malaria,) 50% of people who have one sickle cell and one regular gene (and so are protected,) and 25% of people will have two sickle cell genes and so die of sickle cell anemia. (I’m sure this is a very simplified scenario.)
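(Here is the same carrier-cross arithmetic as a minimal Punnett-square sketch in Python; the allele labels “A” for the normal gene and “S” for the sickle-cell gene are just names I’ve picked for illustration.)

```python
from collections import Counter
from itertools import product

parent1 = ["A", "S"]  # carrier: one normal allele, one sickle-cell allele
parent2 = ["A", "S"]  # carrier

# Each child inherits one allele from each parent; all four combinations are equally likely.
genotypes = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))
for genotype, count in sorted(genotypes.items()):
    print(genotype, f"{count / 4:.0%}")
# AA 25%  (two regular genes: no sickle cell, but vulnerable to malaria)
# AS 50%  (carrier: protected from malaria)
# SS 25%  (two sickle-cell genes: sickle cell anemia)
```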

So I consider it technically possible for 25% of people to suffer a pathological genetic condition, but unlikely–malaria is a particularly ruthless killer compared to being too cheerful.

Skipping to the point, I think there’s a little of all three going on. Each of us probably has some kind of personality “set point” that is basically determined by some combination of genetics, environmental assaults, and childhood experiences. People deviate from their set points due to random stuff that happens in their lives, (job promotions, visits from friends, car accidents, etc.,) but the way they respond to adversity and the mood they tend to return to afterwards is largely determined by their “set point.” This is all a fancy way of saying that people have personalities.

The influence of random chance on these genetic/environmental factors suggests that there should be variation in people’s emotional set points–we should see that some people are more prone to anxiety, some less prone, and some of average anxiousness.

Please note that this is a statistical should, in the same sense that, “If people are exposed to asbestos, some of them should get cancer,” not a moral should, as in, “If someone gives you a gift, you should send a thank-you note.”
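(Purely as an illustration of the “set point” idea–this is my own toy model, not anything drawn from the psychological literature–you can picture mood as a value that gets knocked around by random events but keeps drifting back toward a personal baseline.)

```python
import random

def simulate_mood(set_point, resilience=0.3, noise=1.0, days=365, seed=0):
    """Toy model (illustration only): mood is hit by random daily events
    but drifts back toward a personal set point."""
    rng = random.Random(seed)
    mood = set_point
    history = []
    for _ in range(days):
        shock = rng.gauss(0, noise)                       # random life events
        mood += resilience * (set_point - mood) + shock   # drift back, plus the shock
        history.append(mood)
    return history

# Two people experience the same kind of randomness but hover around different baselines.
cheerful = simulate_mood(set_point=2.0)
gloomy = simulate_mood(set_point=-2.0)
print(round(sum(cheerful) / len(cheerful), 2), round(sum(gloomy) / len(gloomy), 2))
```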

Natural variation in a trait does not automatically imply pathology, but being more anxious or depressive or guilt-ridden than others can be highly unpleasant. I see nothing wrong, a priori, with people doing things that make their lives more pleasant and manageable (and don’t hurt others); this is, after all, why I enjoy a cup of coffee every morning. If you are a better, happier, more productive person with medication (or without it,) then carry on; this post is not intended as a critique of anyone’s personal mental health management, nor a suggestion for how to take care of your mental health.

Our medical/psychological health system, however, operates on the assumption that medications are for pathologies only. There is no form to fill out that says, “Patient would like anti-anxiety drugs in order to live a fuller, more productive life.”

That said, all of these emotions are obviously responses to actual stuff that happens in real life, and if 25% of women are coming down with depression or anxiety disorders, I think we should critically examine whether anxiety and depression are really the disease we need to be treating, or the body’s responses to some external threat.

I am reminded here of Peter Frost’s On the Adaptive Value of “Aw Shucks”:

In a mixed group, women become quieter, less assertive, and more compliant. This deference is shown only to men and not to other women in the group. A related phenomenon is the sex gap in self-esteem: women tend to feel less self-esteem in all social settings. The gap begins at puberty and is greatest in the 15-18 age range (Hopcroft, 2009).

If more women enter the workforce–either because they think they ought to or because circumstances force them to–and the workforce triggers depression, then as the percent of women formally employed goes up, we should see a parallel rise in mental illness rates among women. Just as Adderal and Ritalin help little boys conform to the requirements of modern classrooms, Prozac and Lithium help women cope with the stress of employment.

As we discussed yesterday, fever is not a disease, but part of your body’s system for re-asserting homeostasis by killing disease microbes and making it more difficult for them to reproduce. Extreme fevers are an over-reaction and can kill you, but a normal fever below 104 degrees or so is merely unpleasant and should be allowed to do its work of making you better. Treating a normal fever (trying to lower it) interferes with the body’s ability to fight the disease and results in longer sicknesses.

Likewise, these sorts of emotions, while definitely unpleasant, may serve some real purpose.

We humans are social beings (and political animals.) We do not exist on our own; historically, loneliness was not merely unpleasant, but a death sentence. Humans everywhere live in communities and depend on each other for survival. Without refrigeration or modern storage methods, saving food was difficult. (Unless you were an Eskimo.) If you managed to kill a deer while on your own, chances are you couldn’t eat it all before it began to rot, and then your chances of killing another deer before you started getting seriously hungry were low. But if you share your deer with your tribesmates, none of the deer goes to waste, and if they share their deer with yours, you are far less likely to go hungry.

If you end up alienated from the rest of your tribe, there’s a good chance you’ll die. It doesn’t matter if they were wrong and you were right; it doesn’t matter if they were jerks and you were the nicest person ever. If you can’t depend on them for food (and mates!) you’re dead. This is when your emotions kick in.

People complain a lot that emotions are irrational. Yes, they are. They’re probably supposed to be. There is nothing “logical” or “rational” about feeling bad because someone is mad at you over something they did wrong! And yet it happens. Not because it is logical, but because being part of the tribe is more important than who did what to whom. Your emotions exist to keep you alive, not to prove rightness or wrongness.

This is, of course, an oversimplification. Men and women have been subject to different evolutionary pressures, for example. But this is close enough for the purposes of the current conversation.

If modern people are coming down with mental illnesses at astonishing rates, then maybe there is something about modern life that is making people ill. If so, treating the symptoms may make life more bearable for people while they are subject to the disease, but still does not fundamentally address whatever it is that is making them sick in the first place.

It is my own opinion that modern life is pathological, not (in most cases,) people’s reactions to it. Modern life is pathological because it is new and therefore you aren’t adapted to it. Your ancestors have probably only lived in cities of millions of people for a few generations at most (chances are good that at least one of your great-grandparents was a farmer, if not all of them.) Naturescapes are calming and peaceful; cities noisy, crowded, and full of pollution. There is some reason why schizophrenics are found in cities and not on farms. This doesn’t mean that we should just throw out cities, but it does mean we should be thoughtful about them and their effects.

People seem to do best, emotionally, when they have the support of their kin, some degree of ethnic or national pride, economic and physical security, attend religious services, and avoid crowded cities. (Here I am, an atheist, recommending church for people.) The knowledge you are at peace with your tribe and your tribe has your back seems almost entirely absent from most people’s modern lives; instead, people are increasingly pushed into environments where they have no tribe and most people they encounter in daily life have no connection to them. Indeed, tribalism and city living don’t seem to get along very well.

To return to healthy lives, we may need to re-think the details of modernity.

Politics

Philosophically and politically, I am a great believer in moderation and virtue as the ethical, conscious application of homeostatic systems to the self and to organizations that exist for the sake of humans. Please understand that this is not moderation in the conventional sense of “sometimes I like the Republicans and sometimes I like the Democrats,” but the self-moderation necessary for bodily homeostasis reflected at the social/organizational/national level.

For example, I have posted a bit on the dangers of mass immigration, but this is not a call to close the borders and allow no one in. Rather, I suspect that there is an optimal amount–and kind–of immigration that benefits a community (and this optimal quantity will depend on various features of the community itself, like size and resources.) Thus, each community should aim for its optimal level. But since virtually no one–certainly no one in a position of influence–advocates for zero immigration, I don’t devote much time to writing against it; it is only mass immigration that is getting pushed on us, and thus mass immigration that I respond to.

Similarly, there is probably an optimal level of communal genetic diversity. Too low, and inbreeding results. Too high, and fetuses miscarry due to incompatible genes. (Rh- mothers have difficulty carrying Rh+ fetuses, for example, because their immune systems identify the fetus’s blood as foreign and therefore attack it, killing the fetus.) As in agriculture, monocultures are at great risk of getting wiped out by disease; genetic heterogeneity helps ensure that some members of a population can survive a plague. Homogeneity helps people get along with their neighbors, but too much may lead to everyone thinking through problems in similar ways. New ideas and novel ways of attacking problems often come from people who are outliers in some way, including genetics.

There is a lot of talk ’round these parts that basically blames all the crimes of modern civilization on females. Obviously I have a certain bias against such arguments–I of course prefer to believe that women are superbly competent at all things, though I do not wish to stake the functioning of civilization on that assumption. If women are good at math, they will do math; if they are good at leading, they will lead. A society that tries to force women into professions they are not inclined to is out of kilter; likewise, so is a society where women are forced out of fields they are good at. Ultimately, I care about my doctor’s competence, not their gender.

In a properly balanced society, male and female personalities complement each other, contributing to the group’s long-term survival.

Women are not accidents of nature; they are as they are because their personalities succeeded where women with different personalities did not. Women have a strong urge to be compassionate and nurturing toward others, maintain social relations, and care for those in need of help. These instincts have, for thousands of years, helped keep their families alive.

When the masculine element becomes too strong, society becomes too aggressive. Crime goes up; unwinnable wars are waged; people are left to die. When the feminine element becomes too strong, society becomes too passive; invasions go unresisted; welfare spending becomes unsustainable. Society can’t solve this problem by continuing to give both sides everything they want, (this is likely to be economically disastrous,) but must actually find a way to direct them and curb their excesses.

I remember an article on the now-defunct neuropolitics (now that I think of it, the Wayback Machine probably has it somewhere,) on an experiment where groups with varying numbers of “liberals” and “conservatives” had to work together to accomplish tasks. The “conservatives” tended to solve their problems by creating hierarchies that organized their labor, with the leader/s giving everyone specific tasks. The “liberals” solved their problems by incorporating new members until they had enough people to solve specific tasks. The groups that performed best, overall, were those that had a mix of ideologies, allowing them to both make hierarchical structures to organize their labor and incorporate new members when needed. I don’t remember much else of the article, nor did I read the original study, so I don’t know what exactly the tasks were, or how reliable this study really was, but the basic idea of it is appealing: organize when necessary; form alliances when necessary. A good leader recognizes the skills of different people in their group and uses their authority to direct the best use of these skills.

Our current society greatly lacks in this kind of coherent, organizing direction. Most communities have very little in the way of leadership–moral, spiritual, philosophical, or material–and our society seems constantly intent on attacking and tearing down any kind of hierarchies, even those based on pure skill and competence. Likewise, much of what passes for “leadership” is people demanding that you do what they say, not demonstrating any kind of competence. But when we do find competent leaders, we would do well to let them lead.

Back to part one.

Thoughts on Frost’s The Adaptive Value of “Aw Shucks”

Peter Frost recently posted on female shyness among men–more specifically, on the observation that adolescent white females appear to become very shy among groups of males and suffer depression, but adolescent black females don’t.

Frost theorizes that women are instinctually deferential to men, especially when they are economically dependent on them, and that whites show more of this deference than blacks because traditional white marriage patterns–monogamy–have brought women into more contact with men while making them more economically dependent on them than traditional African marriage patterns–polygyny–and therefore white women have evolved to have more shyness.

This explanation is decent, but feels incomplete.

Did anyone bother to ask the girls why they felt shy around the boys? Probably someone has, but that information wasn’t included in the post. But I can share my own experiences.

For starters, I’ve never felt–and this may just be me–particularly shyer around males than around females, nor do I recall ever talking less in highschool due to class composition. Rather, the amount I talked had entirely to do with how much I liked the subject matter vs. how tired I was. However, in non-school settings, I am less likely to talk when conversations are dominated by men, simply because men tend to talk about things I find boring, like cars, sports, or finance. (I suspect I have an unusually high tolerance for finance/economic discussions for a female, but there are limits to what even I can stand, and the other two topics drive me to tears of boredom. Sports, as far as I am concerned, are the Kardashians of men.) I am sure the same is true in reverse–when groups of women get together, they talk about stuff that men find horribly dull.

Even in classroom conversations that are ostensibly led by the teacher, male students may make responses that just aren’t interesting to the female students, leading to the females getting bored or having little to say in response.

So, do black adolescent girls and boys have more conversation topics in common than whites?

Second, related to Frost’s observations, men tend to be more aggressive while talking than women. They are louder, they interrupt more, they put less effort into assuaging people’s feelings, etc. I am sure women do things men find annoying, like ramble on forever without getting to the point or talking about their feelings in these weirdly associative ways. Regardless, I suspect that women/adolescents (at least white ones) often find the male style overwhelming, and their response is to retreat.

When feminists say they need “safe spaces” away from men to discuss their feminism things, they aren’t entirely inaccurate. It’s just that society used to have these “safe spaces” for women back before the feminists themselves destroyed them! Even now, it is easy to join a Mommy Meetup group or find an all-female Bible study club. But, oh wait, these are regressive! What we need are all-female lawyers, or doctors, or mathematicians…

*Ahem* back on subject, if testosterone => aggression, it would be interesting to see if the difference in black vs white females is simply a result of different testosterone levels (though of course that is just kicking the ball back a bit, because we then must ask what causes different testosterone levels.)

I suspect that Frost is on the right track looking at polygyny vs. monogamy, but I think his mechanism (increased time around/dependence on men => increase shyness) is incomplete. He’s missed something from his own work: polygynous males have higher testosterone than monogamous ones (even within their own society.) (See: The Contradictions of Polygyny and Polygyny Makes Men Bigger, Tougher, and Meaner.) Even if women in polygynous societies were expected to behave exactly like women from monogamous societies, I’d expect some “spillover” effect from the higher testosterone in their men–that is, everyone in society ought to have higher testosterone levels than they would otherwise.

Additionally, let us consider that polygyny is not practiced the same everywhere. In the Middle East, sexual access to women is tightly controlled–to the point where women may be killed for extra-marital sexual activity. In this case, the women are effectively monogamous, while the men are not. By contrast, in the societies Frost describes from Sub-Saharan Africa, it sounds like both men and women have a great many sexual partners during adolescence and early adulthood (which explains the high STD rates.)

If polygamy increases male aggression and testosterone levels because the men have to invest more energy into finding mates, then it stands to reason that women who have lots of mates are also investing lots of energy into finding them, and so would also have increased levels of aggression and testosterone.

Speaking again from personal experience, I observed that my own desire to talk to men basically cratered after I got married (and then had kids.) Suddenly something about it seemed vaguely tawdry. Of course, this leaves me in a bit of a pickle, because there aren’t that many moms who want to discuss HBD or related topics. (Thankfully I have the internet, because talking to words on a screen is a very different dynamic.) Of course, if I were back on the dating market again (god forbid!) I’d have to talk to lots of men again.

So I think the equation here shouldn’t be +time with men => +shyness, -time with men => -shyness, but +pursuit of partners => +aggression, -pursuit of partners => -aggression.

None of this gets into the “depression” issue. What’s up with that?

Personally, while I felt plenty of annoying things during highschool, the only ones triggered by boys were of the wanting to fall in love variety and the feeling sad if someone didn’t like me variety. I did feel some distress over wanting the adults to treat me like an adult, but that has nothing to do with boys. But this may just be me being odd.

We know that whites, women, and the subset of white women suffer from depression, anxiety, and other forms of mental illness at higher rates than blacks, men, and pretty much everyone else. I speculate that anxiety, shyness, disgust, and possibly even depression are part of a suite of traits that help women avoid male aggression, perform otherwise dull tasks like writing English papers or washing dishes, keep out of trouble, and stay interested in their husbands and only their husbands.

In a society where monogamy is enforced, people (or their parents) may even preferentially choose partners who seem unlikely to stray–that is, women (or men) who display little interest in actively pursuing the opposite sex. So just as women in polygamous societies may be under selective pressure to become more aggressive, women in monogamous societies may be under selective pressure to have less interest in talking to men.

Eventually, you get Japan.

Amusingly, the studies Frost quotes view white female shyness as a bad thing to be corrected, and black female non-shyness as a good thing that mysteriously exists despite adverse conditions. But what are the effects of white female shyness? Do white women go to prison, become pregnant out of wedlock, or get killed by their partners at higher rates than black women? Do they get worse grades, graduate from school at lower rates, or end up in worse professions?

Or maybe shy girls are perfectly fine the way they are and don’t need fixing.