I read a book and it’s Friday: Homicide, by Daly and Wilson

Today’s selection, Homicide, is ev psych with a side of anthropology; I am excerpting the chapter on people-who-murder-children. (You are officially forewarned.)

Way back in middle school, I happened across (I forget how) my first university-level textbook, on historical European families and family law. I got through the chapter on infanticide before giving up, horrified that enough Germans were smushing their infants under mattresses or tossing them into the family hearth that the Holy Roman Empire needed laws specifically on the subject.

It was a disillusioning moment.

Daly and Wilson’s Homicide, 1988, contributes some (slightly) more recent data to the subject (though of course it would be nice to have even more recent data).

[Several charts from the book omitted.]

(I think some of the oddities in # of incidents per year may be due to ages being estimated when the child’s true age isn’t known, eg, “headless torso of a boy about 6 years old found floating in the Thames.”)

We begin with a conversation on the subject of which child parents would favor in an emergency:

If parental motives are such as to promote the parent’s own fitness, then we should expect that parents will often be inclined to act so that neither sibling’s interests prevail completely. Typically, parental imposition of equity will involve supporting the younger, weaker competitor, even when the parent would favor the older if forced to choose between the two. It is this latter sort of situation–“Which do you save when one must be sacrificed?”–in which parents’ differential valuation of their children really comes to the fore. Recall that there were 11 societies in the ethnographic review of Chapter 3 for which it was reported that a newborn might be killed if the birth interval were too short or the brood too numerous. It should come as no surprise that there were no societies in which the prescribed solution to such a dilemma was said to be the death of an older child. … this reaction merely illustrates that one takes for granted the phenomenon under discussion, namely the gradual deepening of parental commitment and love.

*Thinks about question for a while* *flails* “BUT MY CHILDREN ARE ALL WONDERFUL HOW COULD I CHOOSE?” *flails some more*

That said, I think there’s an alternative possibility besides just affection growing over time: the eldest child has already proven their ability to survive; an infant has not. The harsher the conditions of life (and thus, the greater the likelihood of actually facing a real situation in which you genuinely don’t have enough food for all of your children,) the higher the infant mortality rate. The eldest children have already run the infant mortality gauntlet and so are reasonably likely to make it to adulthood; the infants still stand a high chance of dying. Sacrificing the child you know is healthy and strong for the one with a high chance of dying is just stupid.

Infant mortality, by contrast, is not one of my personal concerns.

Figure 4.4 shows that the risk of parental homicide is indeed a declining function of the child’s age. As we would anticipate, the most dramatic decrease occurs between infants and 1-year-old children. One reason for expecting this is that the lion’s share of the prepubertal increase in reproductive value in natural environments occurs within the first year.

(I think “prepubertal increase in reproductive value” means “decreased likelihood of dying.”)

Moreover, if parental disinclination reflects any sort of assessment of the child’s quality or the mother’s situation, then an evolved assessment mechanism should be such as to terminate any hopeless reproductive episode as early as possible, rather than to squander parental effort in an enterprise that will eventually be abandoned. … Mothers killed 61 in the first 6 months compared to just 27 in the second 6 months. For fathers, the corresponding numbers are 24 vs. 14. [See figure 4.4] … This pattern of victimization contrasts dramatically with the risk of homicide at the hands of nonrelatives (Figure 4.5)…

I would like to propose an alternative possibility: just as a child who attempts to drive a car is much more likely to crash immediately than to successfully navigate onto the highway and then crash, so a murderous person who gets their hands onto a child is more likely to kill it immediately than to wait a few years.

A similar mechanism may be at play in the apparent increase and then decrease in homicides of children by nonrelatives during toddlerhood. Without knowing anything about these cases, I can only speculate, but 1-4 are the ages when children are most commonly put into daycares or left with sitters while their moms return to work. The homicidally-minded among these caretakers, then, are likely to kill their charges sooner rather than later. (School-aged children, by contrast, are both better at running away from attackers and highly unlikely to be killed by their teachers.)

Teenagers are highly conflictual creatures, and the rate at which nonrelatives kill them explodes after puberty. When we consider the conspicuous, tempestuous conflicts that occur between teenagers and their parents–conflicts that apparently dwarf those of the preadolescent period–it is all the more remarkable that the risk of parental homicide continues its relentless decline to near zero.

… When mothers killed infants, the victims had been born to them at a mean age of 22.7 years, whereas older victims had been born at a mean maternal age of 24.5. This is a significant difference, but both means are significantly below the 25.8 years that was the average age of all new Canadian mothers during the same period, according to Canadian Vital Statistics.

In other words, impulsive fuckups who get accidentally pregnant are likely to be violent impulsive fuckups.

We find a similar result with respect to marital status: Mothers who killed older children are again intermediate between infanticidal women and the population-at-large. Whereas 51% of mothers committing infanticide were unmarried, the same was true of just 34% of those killing older children. This is still substantially above the 12% of Canadian births in which the new mother was unmarried …

Killing of an older child is often associated with maternal depression. Of the 95 mothers who killed a child beyond its infancy, 15.8% also committed suicide. … By contrast, only 2 of 88 infanticidal mothers committed suicide (and even this meager 2.3% probably overestimates the association of infanticide with suicide, since infanticides are the only category of homicides in which a significant incidence of undetected cases is likely.) … one of these 2 killed three older children as well.

Anyone else thinking of Andrea Yates and her idiot husband?

In the Canadian data, it is also noteworthy that 35% of maternal infanticides were attributed by the investigating police force … [as] “mentally ill or mentally retarded (insane),” versus 58% of maternal homicides of older children. Here and elsewhere, it seems that the sorts of cases that are simultaneously rare and seemingly contrary to the actor’s interests–in both the Darwinian and the commonsense meaning of interest–also happen to be the sorts of cases most likely to be attributed to some sort of mental incompetence. … We identify as mad those people who lack a species-typical nepotistic perception of their interests or who no longer care to pursue them. …

Violent people go ahead and kill their kids; people who go crazy later kill theirs later.

We do at least know the ages of the 38 men who killed their infant children: the mean was 26.3 years. Moreover, we know that fathers averaged 4 years older than mothers for that substantial majority of Canadian births that occurred within marriages… . Since the mean age for all new Canadian mothers during the relevant period… was 25.8, it seems clear that infanticidal fathers are indeed relatively young. And as was the case with mothers, infanticidal fathers were significantly younger than those fathers who killed older offspring (mean age at the victim’s birth = 29.2 years). …

As with mothers, fathers who killed older children killed themselves as well significantly more often (43.6% of 101) than did those who killed their infant children (10.5% of 38). Also like mothers is the fact that those infanticidal fathers who did commit suicide were significantly older (mean age = 30.5 years) than those who did not (mean = 25.8). Likewise, the paternal age at which older victims had been born was also significantly greater for suicidal (mean = 31.1 years; N = 71) than for nonsuicidal (mean =27.5; N = 67) homicidal fathers. And men who killed their older children were a little more likely to be deemed mentally incompetent (20.8%) than those who killed their infants (15.8%). …

Fathers, however, were significantly less likely to commit suicide after killing an adult offspring (19% of 21 men) than a child (50% of 80 men.) … 20 of the 22 adult victims of their father were sons… three of the four adult victims of mothers were daughters. … There is no hint of such a same-sex bias in the killings of either infants… or older children. …

An infrequent but regular variety of homicide is that in which a man destroys his wife and children. A corresponding act of familicide by the wife is almost unheard of. …

No big surprises in this section.

Perhaps the most obvious prediction from a Darwinian view of parental motives is this: Substitute parents will generally tend to care less profoundly for their children than natural parents, with the result that children reared by people other than their natural parents will be more often exploited and otherwise at risk. Parental investment is a precious resource, and selection must favor those parental psyches that do not squander it on nonrelatives.

Disclaimer: obviously there are good stepparents who care deeply for their stepchildren. I’ve known quite a few. But I’ve also met some horrible stepparents. Given the inherent vulnerability of children, I find distasteful our society’s pushing of stepparenting as normal without cautions against its dangers. In most cases, remarriage seems to be undertaken to satisfy the parent, not the child.

In an interview study of stepparents in Cleveland, Ohio, for example–a study of a predominantly middle-class group suffering no particular distress or dysfunction–Lucile Duberman (1975) found that only 53% of stepfathers and 25% of stepmothers could claim to have “parental feeling” toward their stepchildren, and still fewer to “love” them.

Some of this may be influenced by the kinds of people who are likely to become stepparents–people with strong family instincts probably have better luck getting married to people like themselves and staying that way than people who are bad at relationships.

In an observational study of Trinidadian villagers, Mark Flinn (1988) found that stepfathers interacted less with “their” children than did natural fathers; that interactions were more likely to be aggressive within steprelationships than within the corresponding natural relationships; and that stepchildren left home at an earlier age.

Pop psychology and how-to manuals for stepfamilies have become a growth industry. Serious study of “reconstituted” families is also burgeoning. Virtually all of this literature is dominated by a single theme: coping with the antagonisms…

Here the authors stop to differentiate between stepparenting and adoption, which they suspect is more functional due to adoptive parents actually wanting to be parents in the first place. However,

such children have sometimes been found to suffer when natural children are subsequently born to the adopting couple, a result that has led some professionals to counsel against adoption by childless couples until infertility is definitely established. …

Continuing on with stepparents:

The negative characterization of stepparents is by no means peculiar to our culture. … From Eskimos to Indonesians, through dozens of tales, the stepparent is the villain of every piece. … We have already encountered the Tikopia or Yanomamo husband who demands the death of his new wife’s prior children. Other solutions have included leaving the children with postmenopausal matrilineal relatives, and the levirate, a wide-spread custom by which a widow and her children are inherited by the dead man’s brother or other near relative. …

Social scientists have turned this scenario on its head. The difficulties attending steprelationships–insofar as they are acknowledged at all–are presumed to be caused by the “myth of the cruel stepparent” and the child’s fears.

See: Freud.

Why this bizarre counterintuitive view is the conventional wisdom would be a topic for a longer book than this; suffice to say that the answer surely has more to do with ideology than with evidence. In any event, social scientists have staunchly ignored the question of the factual basis for the negative “stereotyping” of stepparents.

Under Freud’s logic, all sorts of people who’d been genuinely hurt by others were summarily dismissed, told that they were the ones who actually harbored ill-will against others and were just “projecting” their emotions onto their desired victims.

Freudianism is a crock of shit, but in this case, it helped social “reformers” (who of course don’t believe in silly ideas like evolution) discredit people’s perfectly reasonable fears in order to push the notion that “family” doesn’t need to follow traditional (ie, biological) forms, but can be reinvented in all sorts of novel ways.

So are children at risk in stepparent homes in contemporary North America? [see Figures 4.7 and 4.8.] … There is … no appreciable statistical confounding between steprelationships and poverty in North America. … Stepparenthood per se remains the single most powerful risk factor for child abuse that has yet been identified. (here and throughout this discussion “stepparents” include both legal and common-law spouses of the natural parent.) …

Speaking of Figures 4.7 and 4.8, I must say that the kinds of people who get divorced (or were never married) and remarried within a year of their kid’s birth are likely to be unstable people who tend to pick particularly bad partners, and the kinds of people willing to enter into a relationship with someone who has a newborn is also likely to be, well, unusual. Apparently homicidal.

By contrast, the people who are willing to marry someone who already has, say, a ten-year-old, may be relatively normal folks.

Just how great an elevation of risk are we talking about? Our efforts to answer that question have been bedeviled by a lack of good information on the living arrangements of children in the general population. … there are no official statistics [as of when this was written] on the numbers of children of each age who live in each household type. There is no question that the 43% of murdered American child abuse victims who dwelt with substitute parents is far more than would be expected by chance, but estimates of that expected percentage can only be derived from surveys that were designed to answer other questions. For a random sample of American children in 1976, … the best available national survey… indicates that only about 1% or fewer would be expected to have dwelt with a substitute parent. An American child living with one or more substitute parents in 1976 was therefore approximately 100 times as likely to be fatally abused as a child living with natural parents only…

Results for Canada are similar. In Hamilton, Ontario in 1983, for example, 16% of child abuse victims under 5 years of age lived with a natural parent and a stepparent… Since small children very rarely have stepparents–less than 1% of preschoolers in Hamilton in 1983, for example–that 16% represents forty times the abuse rate for children of the same age living with natural parents. … 147 Canadian children between the ages of 1 and 4 were killed by someone in loco parentis between 1974 and 1983; 37 of those children (25.2%) were the victims of their stepparents, and another 5 (3.4%) were killed by unrelated foster parents.

…The survey shows, for example, that 0.4% of 2,852 Canadian children, aged 1-4 in 1984, lived with a stepparent. … For the youngest age group in Figure 4.9, those 2 years of age and younger, the risk from a stepparent is approximately 70 times that from a natural parent (even though the latter category includes all infanticides by natural mothers.)
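For those who want to see where multipliers like “approximately 100 times” and “forty times” come from, here is a minimal sketch of the standard relative-risk arithmetic, using only the percentages quoted above. The baseline shares (“about 1% or fewer,” “less than 1%”) are only approximate in the source, so I have plugged in a couple of plausible values to show how sensitive the multiplier is to that assumption.

```python
def risk_ratio(case_share_exposed: float, pop_share_exposed: float) -> float:
    """Relative risk for the 'exposed' group (children living with a
    substitute parent), given the share of victims who were exposed and
    the share of the general child population who were exposed."""
    risk_exposed = case_share_exposed / pop_share_exposed
    risk_unexposed = (1 - case_share_exposed) / (1 - pop_share_exposed)
    return risk_exposed / risk_unexposed

# U.S. data quoted above: 43% of fatally abused children lived with a
# substitute parent, vs. "about 1% or fewer" of children generally.
print(round(risk_ratio(0.43, 0.01)))    # ~75x with a 1% baseline
print(round(risk_ratio(0.43, 0.007)))   # ~107x with a 0.7% baseline

# Hamilton, Ontario: 16% of abuse victims under 5 had a stepparent,
# vs. "less than 1%" of preschoolers.
print(round(risk_ratio(0.16, 0.005)))   # ~38x, i.e. the "forty times"
```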

Now we need updated data. I wonder if abortion has had any effect on the rates of infanticide and if increased public acceptance of stepfamilies has led to more abused children or higher quality people being willing to become stepparents.

Some quick thoughts about Angry Birds and a caution about LSD stories

Watching the liberals lose their shit over the Angry Birds Movie has been rather entertaining and proof of just how absurdly out of touch with reality they’ve become.

The movie is limited by the game’s single conceit: the pigs stole the birds’ eggs, and the birds are flinging themselves at the pigs to get back the eggs. You can’t have reconciliation between the pigs and birds because, as is obvious if you’ve played the game, the pigs steal those eggs over and over.

Critically, the pigs are not refugees or economic migrants seeking a better life. They are invaders stealing the birds’ eggs. Liberals can no longer distinguish between the two. They are not freaking out over the birds attacking a group of peaceful refugees, but over the birds defending themselves against actual invaders.

The right of self-defense against people who attacked you unprovoked is not even right-wing; it is accepted by almost all moralists and is about as mainstream a view as you can find. I can understand the left’s humanitarian logic for accepting refugees/economic migrants, but to toss out the right to self-defense is just plain delusional.

(Comment originally posted in reaction to Gregory Hood’s Review of the Angry Birds Movie.)

I also feel compelled to note that, while people have been claiming that the chief pig, Leonard, has a “Middle Eastern” style beard, Middle Easterners typically have curly haired beards, whereas Leonard clearly has straight fur. Also, Leonard has only managed a chin-beard, whereas Middle Easterners tend to have much fuller beards.

Because this is an HBD-centric blog, I have maps:

[Maps: global hair texture; body hair distribution, according to the American Journal of Physical Anthropology and other sources.]


Personally, I think he looks more French, eg Childeric II or Henry I–for a pig.

__________________________________


The comments on Slate Star Codex’s recent post, “Why Were Early Psychedelicists so Weird?” contain a fair number of stories along the lines of “I took LSD/shrooms/other illegal drugs and had interesting, positive effects,” and a few stories along the lines of “I knew a guy who tried LSD and it fried his brain and turned him into a drooling idiot.”

Normally, I think it best to rate “I did X”-style testimony more highly than “I knew a guy who did X.” In this case, however, I want to urge caution, because there is an obvious selection bias in the kinds of stories you are going to hear: drooling idiots are bad at writing.

The people whose brains got fried on illegal drugs do not have the ability to get on the internet and write coherent, entertaining posts on the subject, and they certainly do not have the IQ points left to be part of the regular readership/commentariat on Scott’s blog. In fact, they aren’t writing a whole lot of anything. Which means that if you are reading about LSD-experiences in the comments section of Scott’s blog, you are only going to read stories from people who are still mentally with it, or people warning that a bad thing happened to a guy they knew.

I have no idea what % of people who try LSD end up okay, better, or worse afterwards, but for the sake of argument, let’s assume that 50% are fine-to-better and 50% end up in droolsville. The 50% who are fine go post on Scott’s blog, and the 50% who are not fine never show up because they can’t type anymore, except as cautionary tales from the few guys who know the details about a former friend’s illegal activities.
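To make the selection effect concrete, here is a toy simulation of that 50/50 hypothetical. Every rate in it is made up for illustration; the only structural assumption is the one above, that the fine half can post first-person stories while the other half shows up only via a friend’s secondhand cautionary tale.

```python
import random

# Toy simulation of the 50/50 hypothetical above; all rates invented.
random.seed(42)

N = 100_000
P_FINE = 0.5         # assumed true share of users who end up fine
P_POST_FINE = 0.10   # chance a fine user posts a first-person story
P_POST_PROXY = 0.01  # chance an impaired user's story is told by a friend

positive, cautionary = 0, 0
for _ in range(N):
    if random.random() < P_FINE:
        positive += random.random() < P_POST_FINE
    else:
        cautionary += random.random() < P_POST_PROXY

print(f"True share who ended up fine: {P_FINE:.0%}")
print(f"Share of visible stories that are positive: "
      f"{positive / (positive + cautionary):.0%}")
# ~91% of the stories you can actually read are positive, even though
# only half of the simulated users were fine.
```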

Maybe LSD researchers can tell you what percentage of people fry their brains on it, shrooms, or other psychedelics. But you certainly can’t make any good estimation based on a biased sample like this–so don’t.

And yes, I know, everyone with positive stories would probably say that the key is to be very careful about how much you use, purity, and allowing enough time between uses. But the people who fried their brains probably thought that, too.

I am not saying that these drugs cannot possibly have any positive medical uses. I am saying that you should avoid using biased datasets when formulating any theories on the matter.

“I don’t hate minorities, I just hate liberals”

A lot of people are talking about the Trump candidacy “realigning” or “reshaping” the American political landscape and things like that. Like why would traditionally blue-state voters in places like NY vote for a guy who’s also carrying traditionally red states like Kentucky? Is the whole Albion’s Seed-style ethno/political makeup of the nation breaking down after nearly 400 years?

Nah.

Look, when it comes to politics, conservatives are basically just reactive. There are some smart conservatives, of course–I’d wager they do well in fields like economics, finance, sports broadcasting, and military strategy–but conservatives overall do not dominate the production of new social ideas. It’s the liberals, somewhat by nature, who keep coming up with ideas like, “What if we let women have abortions?” “What if we all took LSD?” “What if we didn’t eat animals?” or “What if we let gay people get married?”

So the conservatives devote themselves to opposing whatever the hell cockamamie scheme the liberals have come up with this time.

During the Cold War, I’m pretty sure the conservatives opposed the liberals on the grounds that the liberals were commie peaceniks who weren’t doing enough to ensure that we would win the nuclear war against the USSR.

By the ’80s, conservatives were visibly concerned about shifting national attitudes toward religion, especially as it impacted things like abortion, divorce, the teaching of evolution in schools, whether local governments could make religious displays, etc. “Talk radio” became an important bastion of the “Religious Right,” which by the mid-90s had won a sweeping victory in Congress.

When people talk about how no president has ever been so hated as Obama, I wonder if they remember just how much the right hated Clinton.

And what did they hate him for?

Because he represented degenerate, godless atheism. (Never mind that Bill Clinton is probably actually Christian; that doesn’t really matter.)

Reagan and Bush I may have been religious conservatives, but religious conservatism was not a big part of their campaigns. By contrast, Bob Dole, Bush II, and mildly, Mitt Romney, all ran on the religious right platform, with strong planks based on ideas like “ban abortion” and “make sure gay marriage stays illegal.” Bush II even managed to establish an “Office of Faith-Based and Neighborhood Partnerships.”

Meanwhile, though, liberals were changing. The big liberal push of the past 8 years has not been atheism; atheism has largely won already and atheists have wandered off to fight other battles, taking to the streets to protest racism. Thus the campus protests, the Black Lives Matter campaigns, the increasing push for open borders. Today, Germany; tomorrow, the US.  Today’s liberals are, first and foremost, anti-racists.

The Republican establishment–folks like Ted Cruz and Ben Carson–fell so flat with voters precisely because most of them were still harping on the issues–abortion, war with the Russians–that concerned Reagan’s and Bob Dole’s voters, not today’s.

Today’s conservatives do not exactly want to come out and declare themselves racist bigots–in fact, the vast majority of them don’t see themselves as racists, and many are quite vehemently opposed to racism. Saying anything negative about blacks gets instantly called out as racist, so people are reluctant to do it. But you can still say things about immigrants, especially illegal immigrants. There’s just enough plausible deniability (both for others and yourself) to claim that you are not opposed to Mexicans per se; you are just opposed to people breaking the law, and think that if the law exists, it ought to be enforced, or else it is unfair to the people who did obey it. And for that matter, many of them really aren’t opposed to Mexicans; they are just broke and unable to find work and have enough brains in their heads to figure out what a massive flood of low-wage workers does to their chances of finding a well-paying job.

Of course, in the backs of people’s minds, it is not just about immigrants; it is also about BLM protestors, the November terrorist attacks in Paris, and the conviction that if elected, Hillary Clinton will follow in Angela Merkel’s footsteps and invite a million Muslims to the US.

This is why they say, “I don’t hate blacks; I don’t hate Mexicans. I just hate liberals.”


Potato Madness

This is a potato.

Bake it, and you have a healthy, nutritious dinner that you can serve your family and feel good about.

That seems simple enough.

However, the potato would like to clarify some potential confusion about its culinary uses:


This is a potato that has been chopped up and deep fried.

Make it from scratch in the morning, and you are not just a good mom, but an excellent mom.

You can’t really eat them for dinner, unless you’re at IHOP or celebrating Chanukah.

 

This is a potato that has been chopped up, deep fried, and served with a side of pickled tomatoes.

It is never for breakfast, except maybe if you are on a roadtrip and there’s nothing else available. If so, we will pretend it never happened.

Serve it for breakfast any other time, and you are a bad mom.

It is fine for lunch, though.

This is a potato that has been chopped up and baked.

You can never, ever serve it for breakfast. In fact, you are a bad mom if you even think about serving it for breakfast.

It is not a meal at all!

Why, this potato is so unhealthy, you should probably never eat it at all.


There. I hope that clears everything up.


Anthropology Friday: Sacrifice Among the Semites pt. 2

Hello! Today we’re continuing with more excerpts from Smith’s Sacrifice Among the Semites, with all attendant warnings that I don’t necessarily trust Smith’s accuracy.

“Now, if kinship means participation in a common mass of flesh, blood, and bones, it is natural that it should be regarded as dependent, not merely on the fact that a man was born of his mother’s body, and so was from his birth a part of her flesh, but also on the not less significant fact that he was nourished by her milk. And so we find that among the Arabs there is a tie of milk, as well as of blood, which unites the foster-child to his foster-mother and her kin. Again, after the child is weaned, his flesh and blood continue to be nourished and renewed by the food which he shares with his commensals, so that commensality can be thought of (1) as confirming or even (2) as constituting kinship in a very real sense.

“… Primarily the circle of common religion and of common social duties was identical with that of natural kinship, and the god himself was conceived as being of the same stock with his worshipers. It was natural, therefore, that the kinsmen and their kindred god should seal and strengthen their fellowship by meeting together from time to time to nourish their common life by a common meal, to which those outside the kin were not admitted.”

White House Passover Seder, 2011

“… after several clans had begun to frequent the same sanctuary and worship the same god, the worshipers still grouped themselves for sacrificial purposes on the principle of kinship. In the days of Saul and David all the tribes of Israel had long been united in the worship of Jehovah, yet the clans still maintained their annual gentile sacrifice, at which every member of the group was bound to be present. But evidence more decisive comes to us from Arabia, where, as we have seen, men would not eat together at all unless they were united by kinship or by a covenant that had the same effect as natural kinship. Under such a rule the sacrificial feast must have been confined to kinsmen, and the clan was the largest circle that could unite in a sacrificial act. And so, though the great sanctuaries of heathen Arabia were frequented at the pilgrimage feasts by men of different tribes, who met peaceably for a season under the protection of the truce of God, we find that their participation in the worship of the same holy place did not bind alien clans together in any religious unity; they worshiped side by side, but not together.”

EvX: I wish this guy would cite his sources or otherwise back up his claims.

“It is only under Islam that the pilgrimage becomes a bond of religious fellowship, whereas in the times of heathenism it was the correct usage that the different tribes, before they broke up from the feast, should engage in a rivalry of self-exaltation and mutual abuse, which sent them home with all their old jealousies freshly inflamed.”

“…But the notion that the clan is only a larger household is not consistent with the results of modern research. Kinship is an older thing than family life, and in the most primitive societies known to us the family or household group was not a subdivision of a clan, but contained members of more than one kindred. As a rule the savage man may not marry a clanswoman, and the children are of the mother’s kin, and therefore have no communion of blood religion with their father. In such a society there is hardly any family life, and there can be no sacred household meal.

“… The rudest nations have religious rules about food, based on the principle of kinship, viz., that a man may not eat the totem animal of his clan; and they generally have some rites of the nature of the sacrificial feast of kinsmen; but it is not the custom of savages to take their ordinary daily food in a social way, in regular domestic meals. Their habit is to eat irregularly and apart, and this habit is strengthened by the religious rules, which often forbid to one member of a household the food which is permitted to another.”

Frankly, I think he is wrong. Set “meals” may be a modern innovation, but I highly doubt the Bushmen would be so picky as to allow one person in a family to eat a specific animal but forbid it to their spouse; same for the Inuit. There is far too much chance of starvation and hunger in these groups to go turning down good food.

“In Egypt, down to the present day, many persons hardly ever eat with their wives and children, and among the Arabs, boys who are not of full age do not presume to eat in the presence of their parents, but take their meals separately or with the women of the house. No doubt the seclusion of women has retarded the development of family life in Mohammedan countries; but for most purposes this seclusion has never taken much hold on the desert, and yet in northern Arabia no woman will eat before men. … in Arabia the daily family meal has never been an established institution with such a religious significance as attaches to the Roman supper.”

EvX: I don’t know much about Roman suppers, to be honest. I hear the Jews are into their Friday evening meals, though.

“… even among the agricultural Semites there is no trace of a sacrificial character being attached to ordinary household meals. The domestic hearth among the Semites was not an altar as it was at Rome. Almost all varieties of human food were offered to the gods, and any kind of food suffices, according to the laws of Arabian hospitality, to establish that bond between two men which in the last resort rests on the principle that only kinsmen eat together. It may seem, therefore, that in the abstract any sort of meal publicly partaken of by a company of kinsmen may constitute a sacrifice feast. The distinction between the feast and an ordinary meal lies, it may seem, not in the material or the copiousness of the repast, but in its public character. When men eat alone they do not invite the god to share their food, but when the clan eats together as a kindred unity the kindred god must also be of the party.


EvX: I am reminded here of Elijah’s cup, filled with wine and placed on the Passover table just in case the Prophet Elijah decides to show up for dinner. According to Wikipedia:

In the Talmudic literature, Elijah would visit rabbis to help solve particularly difficult legal problems. Malachi had cited Elijah as the harbinger of the eschaton. Thus, when confronted with reconciling impossibly conflicting laws or rituals, the rabbis would set aside any decision “until Elijah comes.”[24]

One such decision was whether the Passover seder required four or five cups of wine. Each serving of wine corresponds to one of the “four expressions of redemption” in the Book of Exodus: … The next verse, “And I will bring you into the land which I swore to give to Abraham, to Isaac, and to Jacob; I will give it to you for a possession. I am the Lord.” (Exodus 6:8) was not fulfilled until the generation following the Passover story, and the rabbis could not decide whether this verse counted as part of the Passover celebration (thus deserving of another serving of wine). Thus, a cup was left for the arrival of Elijah.

In practice the fifth cup has come to be seen as a celebration of future redemption. Today, a place is reserved at the seder table and a cup of wine is placed there for Elijah. During the seder, the door of the house is opened and Elijah is invited in. Traditionally, the cup is viewed as Elijah’s and is used for no other purpose.[25][26]

Returning to Smith:

“Practically, however, there is no sacrificial feast according to Semitic usage except where a victim is slaughtered. The rule of the Levitical law, that a cereal oblation, when offered alone, belongs wholly to the god and gives no occasion for a feast of worshipers, agrees with the older history, in which we never find a sacrificial meal of which flesh does not form a part. Among the Arabs the usage is the same; a religious banquet implies a victim.”

???

When anyone brings a grain offering to the Lord, their offering is to be of the finest flour. They are to pour olive oil on it, put incense on it and take it to Aaron’s sons the priests. The priest shall take a handful of the flour and oil, together with all the incense, and burn this as a memorial portion on the altar, a food offering, an aroma pleasing to the Lord. The rest of the grain offering belongs to Aaron and his sons; it is a most holy part of the food offerings presented to the Lord.–Leviticus 2:1-3

“‘If the offering is a burnt offering from the herd, you are to offer a male without defect. You must present it at the entrance to the tent of meeting so that it will be acceptable to the Lord. You are to skin the burnt offering and cut it into pieces. The sons of Aaron the priest are to put fire on the altar and arrange wood on the fire. Then Aaron’s sons the priests shall arrange the pieces, including the head and the fat, on the wood that is burning on the altar. You are to wash the internal organs and the legs with water, and the priest is to burn all of it on the altar.–Leviticus 1:3-9

Am I misunderstanding Leviticus, or did Smith mix up the two forms of sacrifice?

Saint Nilus of Sinai

Now Smith draws upon Nilus, “As to the habits of the Arabs of the Sinaitic desert towards the close of the fourth Christian century”

“The ordinary sustenance of these Saracens was derived from pillage or from hunting, to which, no doubt, must be added, as a main element, the milk of their herds. When these supplies failed they fell back on the flesh of their camels, one of which was slain for each clan … or for each group which habitually pitched their tents together… which according to known Arab usage would always be a fraction of a clan–and the flesh was hastily devoured by the kinsmen…”

According to Wikipedia:

About the year 390[2] or perhaps 404,[3] Nilus left his wife and one son and took the other, Theodulos, with him to Mount Sinai to be a monk. They lived here till about the year 410[4] when the Saracens, invading the monastery, took Theodulos prisoner. The Saracens intended to sacrifice him to their gods, but eventually sold him as a slave, so that he came into the possession of the Bishop of Elusa in Palestine. The Bishop received Theodulos among his clergy and made him door-keeper of the church. Meanwhile, Nilus, having left his monastery to find his son, at last met him at Elusa. The bishop then ordained them both priests and allowed them to return to Sinai.

Continuing with Smith: “To grasp the force of this evidence we must remember that, beyond question, there was at this time among the Saracens private property in camels, and that therefore, so far as the law of property went, there could be no reason why a man should not kill a beast for the use of his own family. And though a whole camel might be too much for a single household to eat fresh, the Arabs knew and practiced the art of preserving flesh by cutting it into strips and drying them in the sun. Under these circumstances private slaughter could not have failed to be customary, unless it was absolutely forbidden by tribal usage. In short, it appears that while milk, game, and the fruits of pillage were private food which might be eaten in any way, the camel was not allowed to be killed and eaten except in a public rite, at which all the kinsmen assisted.”

From his monastery at Sinai Nilus was a well known person throughout the Eastern Church; by his writings and correspondence he played an important part in the history of his time. He was known as a theologian, Biblical scholar and ascetic writer, so people of all kinds, from the emperor down, wrote to consult him. His numerous works, including a multitude of letters, consist of denunciations of heresy, paganism, abuses of discipline and crimes, of rules and principles of asceticism, especially maxims about the religious life. He warns and threatens people in high places, abbots and bishops, governors and princes, even the emperor himself, without fear. He kept up a correspondence with Gainas, a leader of the Goths, endeavouring to convert him from Arianism;[6] he denounced vigorously the persecution of St. John Chrysostom both to the Emperor Arcadius[7] and to his courtiers.[8]

Nilus must be counted as one of the leading ascetic writers of the 5th century.–Wikipedia

“This evidence is all the more remarkable because, among the Saracens of whom Nilus speaks, the slaughter of a camel in times of hunger does not seem to have been considered as a sacrifice to the gods. For a couple of pages later he speaks expressly of the sacrifices which these Arabs offered to the morning star, the sole deity they acknowledged. These could be performed only when the star was visible, and the whole victim–flesh, skin, and bones–had to be devoured before the sun rose upon it and the day-star disappeared. As this form of sacrifice was necessarily confined to seasons when the planet Venus was a morning star, while the necessity for slaughtering a camel as food might arise at any season, it is to be inferred that in the latter case the victim was not recognized as having a sacrificial character. … the Saracens of Nilus, like the Arabs generally in the last ages of heathenism, had ceased to do sacrifice to the tribal or clan god with whose worship the feast of kinsmen was originally connected. The planet Venus, or Lucifer, was not a tribal deity, but, as we know from a variety of sources, was worshiped by all the northern Arabs, to whatever kin they belonged. … ”

According to Wikipedia:

Ptolemy‘s Geography (2nd century CE) describes “Sarakene” as a region in the northern Sinai peninsula.[2] Ptolemy also mentions a people called the “Sarakenoi” living in north-western Arabia (near neighbor to the Sinai).[2] Eusebius of Caesarea refers to Saracens in his Ecclesiastical history, in which he narrates an account wherein Dionysius, Bishop of Alexandria, mentions Saracens in a letter while describing the persecution of Christians by the Roman emperor Decius: “Many were, in the Arabian mountain, enslaved by the barbarous ‘sarkenoi’.”[2]

But a few centuries after that, Europeans started using Saracen as a catch-all for Arabs and Muslims.

I have just started reading the Wikipedia page on Religion in pre-Islamic Arabia, but a quick search does not turn up “Venus” or “star.” I’ll be on the lookout for evidence one way or another regarding Smith’s claims.

Is there a correlation between intelligence and taste?

(I am annoyed by the lack of bands between 1200 and 1350)

De gustibus non disputandum est. — Confucius

We’re talking about foods, not whether you prefer Beethoven or Lil’ Wayne.

Certainly there are broad correlations between the foods people enjoy and their ethnicity/social class. If you know whether I chose fried okra, chicken feet, gefilte fish, escargot, or grasshoppers for dinner, you can make a pretty good guess about my background. (Actually, I have eaten all of these things. The grasshoppers were over-salted, but otherwise fine.) The world’s plethora of tasty (and not-so-tasty) cuisines is due primarily to regional variations in what grows well where (not a lot of chili peppers growing up in Nunavut, Canada,) and cost (the rich can always afford fancier fare than the poor,) with a side dish of seemingly random cultural taboos like “don’t eat pork” or “don’t eat cows” or “don’t eat grasshoppers.”

But do people vary in their experience of taste? Does intelligence influence how you perceive your meal, driving smarter (or less-smart) people to seek out particular flavor profiles or combinations? Or could there be other psychological or neurological factors at play in people’s eating decisions?

This post was inspired by a meal my husband, an older relative and I shared recently at McDonald’s. It had been a while since we’d last patronized McDonald’s, but older relative likes their burgers, so we went and ordered some new-to-us variety of meat-on-a-bun. As my husband and I sat there, deconstructing the novel taste experience and comparing it to other burgers, the older relative gave us this look of “Jeez, the idiots are discussing the flavor of a burger! Just eat it already!”

As we dined later that evening at my nemesis, Olive Garden, I began wondering whether we actually experienced the food the same way. Perhaps there is something in people that makes them prefer bland, predictable food. Perhaps some people are better at discerning different flavors, and the people who cannot discern them end up with worse food because they can’t tell?

Unfortunately, it appears that not a lot of people have studied whether there is any sort of correlation between IQ and taste (or smell.) There’s a fair amount of research on taste (and smell,) like “do relatives of schizophrenics have impaired senses of smell?” (More on Schizophrenics and their decreased ability to smell) or “can we get fat kids to eat more vegetables?” Oh, and apparently the nature of auditory hallucinations in epileptics varies with IQ (IIRC.) But not much that directly addresses the question.

I did find two references that, somewhat in passing, noted that they found no relationship between taste and IQ, but these weren’t studies designed to test for that. For example, in A Food Study of Monotony, published in 1958 (you know I am really looking for sources when I have to go back to 1958,) researchers restricted the diets of military personnel employed at an army hospital to only 4 menus to see how quickly and badly they’d get bored of the food. They found no correlation between boredom and IQ, but people employed at an army hospital are probably pre-selected for being pretty bright (and having certain personality traits in common, including ability to stand army food.)

Interestingly, three traits did correlate with (or against) boredom:

Fatter people got bored fastest (the authors speculate that they care the most about their food,) while depressed and feminine men (all subjects in the study were men) got bored the least. Depressed people are already disinterested in food, so it is hard to get less-interested, but no explanation was given of what they meant by “femininity” or how this might affect food preferences. (Also, the hypochondriacs got bored quickly.)

Some foods inspire boredom (or even disgust) quickly, while others are virtually immune. Milk and bread, for example, can be eaten every day without complaint (though you might get bored if bread were your only food.) Potted meat, by contrast, gets old fast.

Likewise, Personality Traits and Eating Habits (warning PDF) notes that:

Although self-reported eating practices were not associated with educational level, intelligence, nor various indices of psychopathology, they were related to the demographic variables of gender and age: older participants reported eating more fiber in their diets than did younger ones, and women reported more avoidance of fats from meats than did men.

Self-reported eating habits may not be all that reliable, though.

Autistic children do seem to be worse at distinguishing flavors (and smells) than non-autistic children, eg Olfaction and Taste Processing in Autism:

Participants with autism were significantly less accurate than control participants in identifying sour tastes and marginally less accurate for bitter tastes, but they were not different in identifying sweet and salty stimuli. … Olfactory identification was significantly worse among participants with autism. … True differences exist in taste and olfactory identification in autism. Impairment in taste identification with normal detection thresholds suggests cortical, rather than brainstem dysfunction.

(Another study of the eating habits of autistic kids found that the pickier ones were rated by their parents as more severely impaired than the less picky ones, but then severe food aversions are a form of life impairment. By the way, do not tell the parents of an autistic kid, “oh, he’ll eat when he’s hungry.” They will probably respond politely, but mentally they are stabbing you.)

On brainstem vs. cortical function–it appears that we do some of our basic flavor identification way down in the most instinctual part of the brain, as Facial Expressions in Response to Taste and Smell Stimulation explores. The authors found that pretty much everyone makes the same faces in response to sweet, sour, and bitter flavors–whites and blacks, old people and newborns, retarded people and blind people, even premature infants, blind infants, and infants born missing most of their brains. All of which is another point in favor of my theory that disgust is real. (And if that is not enough science of taste for you, I recommend Place and Taste Aversion Learning, in which animals with brain lesions lost their fear of new foods.)

Genetics obviously plays a role in taste. If you are one of the 14% or so of people who think cilantro tastes like soap (and I sympathize, because cilantro definitely tastes like soap,) then you’ve already discovered this in a very practical way. Genetics also obviously determine whether you continue producing the enzyme for milk digestion after infancy (lactase persistence). According to Why are you a picky eater? Blame genes, brains, and breastmilk:

In many cases, mom and dad have only themselves to blame for unwittingly passing on the genes that can govern finicky tastes. Studies show that genes play a major role in determining who becomes a picky eater, including recent research on a group of 4- to 7-year-old twins. Part of the pickiness can be attributed to specific genes that govern taste. Variants of the TAS2R38 gene, for example, have been found to encode for taste receptors that determine how strongly someone tastes bitter flavors.

Researchers at Philadelphia’s Monell Chemical Senses Center, a scientific institute dedicated to the study of smell and taste, have found that this same gene also predicts the strength of sweet-tooth cravings among children. Kids who were more sensitive to bitterness preferred sugary foods and drinks. However, adults with the bitter receptor genes remained picky about bitter foods but did not prefer more sweets, the Monell study found. This suggests that sometimes age and experience can override genetics.

I suspect that there is actually a sound biological, evolutionary reason why kids crave sweets more than grownups, and this desire for sweets is somewhat “turned off” as we age.


From a review of Why some like it hot: Food, Genetics, and Cultural Diversity:

Ethnobotanist Gary Paul Nabhan suggests that diet had a key role in human evolution, specifically, that human genetic diversity is predominately a product of regional differences in ancestral diets. Chemical compounds found within animals and plants varied depending on climate. These compounds induced changes in gene expression, which can vary depending on the amount within the particular food and its availability. The Agricultural Age led to further diet-based genetic diversity. Cultivation of foods led to the development of novel plants and animals that were not available in the ancestral environment. …

There are other fascinating examples of gene-diet interaction. Culturally specific recipes, semi-quantitative blending of locally available foods and herbs, and cooking directions needed in order to reduce toxins present in plants, emerged over time through a process of trial-and error and were transmitted through the ages. The effects on genes by foods can be extremely complex given the range of plant-derived compounds available within a given region. The advent of agriculture is suggested to have overridden natural selection by random changes in the environment. The results of human-driven selection can be highly unexpected. …

In sedentary herding societies, drinking water was frequently contaminated by livestock waste. The author suggests in order to avoid contaminated water, beverages made with fermented grains or fruit were drunk instead. Thus, alcohol resistance was selected for in populations that herded animals, such as Europeans. By contrast, those groups which did not practice herding, such as East Asians and Native Americans, did not need to utilize alcohol as a water substitute and are highly sensitive to the effects of alcohol.

Speaking of genetics:

From Eating Green could be in your Genes

Indians and Africans are much more likely than Europeans and native South Americans to have an allele that lets them eat a vegetarian diet:

The vegetarian allele evolved in populations that have eaten a plant-based diet over hundreds of generations. The adaptation allows these people to efficiently process omega-3 and omega-6 fatty acids and convert them into compounds essential for early brain development and controlling inflammation. In populations that live on plant-based diets, this genetic variation provided an advantage and was positively selected in those groups.

In Inuit populations of Greenland, the researchers uncovered that a previously identified adaptation is opposite to the one found in long-standing vegetarian populations: While the vegetarian allele has an insertion of 22 bases (a base is a building block of DNA) within the gene, this insertion was found to be deleted in the seafood allele.

Of course, this sort of thing inspires a wealth of pop-psych investigations like Dr. Hirsch’s What Flavor is your Personality? (from a review:

Dr. Hirsh, neurological director of the Smell and Taste Research and Treatment Foundation in Chicago, stands by his book that is based on over 24 years of scientific study and tests on more than 18,000 people’s food choices and personalities.)

that nonetheless may have some basis in fact, eg: Personality may predict if you like spicy foods:

Byrnes assessed the group using the Arnett Inventory of Sensation Seeking (AISS), a test for the personality trait of sensation-seeking, defined as desiring novel and intense stimulation and presumed to contribute to risk preferences. Those in the group who score above the mean AISS score are considered more open to risks and new experiences, while those scoring below the mean are considered less open to those things.

The subjects were given 25 micrometers of capsaicin, the active component of chili peppers, and asked to rate how much they liked a spicy meal as the burn from the capsaicin increased in intensity. Those in the group who fell below the mean AISS rapidly disliked the meal as the burn increased. People who were above the mean AISS had a consistently high liking of the meal even as the burn increased. Those in the mean group liked the meal less as the burn increased, but not nearly as rapidly as those below the mean.

And then there are the roughly 25% of us who are “supertasters“:

A supertaster is a person who experiences the sense of taste with far greater intensity than average. Women are more likely to be supertasters, as are those from Asia, South America and Africa.[1] The cause of this heightened response is unknown, although it is thought to be related to the presence of the TAS2R38 gene, the ability to taste PROP and PTC, and at least in part, due to an increased number of fungiform papillae.[2]

Perhaps the global distribution of supertasters is related to the distribution of vegetarian-friendly alleles. It’s not surprising that women are more likely to be supertasters, as they have a better sense of smell than men. What may be surprising is that supertasters tend not to be foodies who delight in flavoring their foods with all sorts of new spices, but instead tend toward more restricted, bland diets. Because their sense of taste is essentially on overdrive, flavors that taste “mild” to most people taste “overwhelming” on their tongues. As a result, they tend to prefer a much more subdued palate–which is, of course, perfectly tasty to them.

A French study, Changes in Food Preferences and Food Neophobia during a Weight Reduction Session, measured kids’ ability to taste flavors, then the rate at which they became accustomed to new foods. The more sensitive the kids were to flavors, the less likely they were to adopt a new food; the less adept they were at tasting flavors, the more likely they were to start eating vegetables.

Speaking of pickiness again:

“During research back in the 1980s, we discovered that people are more reluctant to try new foods of animal origin than those of plant origin,” Pelchat says. “That’s ironic in two ways. As far as taste is concerned, the range of flavors in animal meat isn’t that large compared to plants, so there isn’t as much of a difference. And, of course, people are much more likely to be poisoned by eating plants than by animals, as long as the meat is properly cooked.” …

It’s also possible that reward mechanisms in our brain can drive changes in taste. Pelchat’s team once had test subjects sample tiny bits of unfamiliar food with no substantial nutritional value, and accompanied them with pills that contained either nothing or a potent cocktail of caloric sugar and fat. Subjects had no idea what was in the pills they swallowed. They learned to like the unfamiliar flavors more quickly when they were paired with a big caloric impact—suggesting that body and brain combined can alter tastes more easily when unappetizing foods deliver big benefits.

So trying to get people to adopt new foods while losing weight may not be the best idea.

(For all that people complain about kids’ pickiness, parents are much pickier. Kids will happily eat playdoh and crayons, but one stray chicken heart in your parents’ soup and suddenly it’s “no more eating at your house.”)

Of course, you can’t talk about food without encountering meddlers who are convinced that people should eat whatever they’re convinced is the perfect diet, like these probably well-meaning folks trying to get Latinos to eat fewer snacks:

Latinos are the largest racial and ethnic minority group in the United States and bear a disproportionate burden of obesity related chronic disease. Despite national efforts to improve dietary habits and prevent obesity among Latinos, obesity rates remain high. …

there is a need for more targeted health promotion and nutrition education efforts on the risks associated with soda and energy-dense food consumption to help improve dietary habits and obesity levels in low-income Latino communities.

Never mind that Latinos are one of the healthiest groups in the country, with longer life expectancies than whites! We’d better make sure they know that their food ways are not approved of!

I have been saving this graph for just such an occasion.
Only now I feel bad because I forgot to write down who made this graph so I can properly credit them. If you know, please tell me!

(Just in case it is not clear already: different people are adapted to and will be healthy on different diets. There is no magical, one-size-fits-all diet.)

And finally, to bring this full circle, it’s hard to miss the folks claiming that Kids Who Eat Fast Food Have Lower IQs:

4,000 Scottish children aged 3-5 years old were examined to compare the intelligence-dampening effects of fast food consumption versus “from scratch” fare prepared with only fresh ingredients.

Higher fast food consumption by the children was linked with lower intelligence, even after adjustments for wealth and social status were taken into account.

It’d be better if they controlled for parental IQ.

The conclusions of this study confirm previous research which shows long lasting effects on IQ from a child’s diet. An Australian study from the University of Adelaide published in August 2012 showed that toddlers who consume junk food grow less smart as they get older. In that study, 7000 children were examined at 6 months, 15 months, and 2 years of age to record their diets.

When the children were examined again at age 8, children who were consuming the most unhealthy food had IQs up to 2 points lower than children eating a wholesome diet.

Why do people watch so much TV?

Honestly, left to my own devices, I wouldn’t own a TV. (With Mythbusters canceled, what’s the point anymore?)

Don’t get me wrong. I have watched (and even enjoyed) the occasional sitcom. I’ve even tried watching football. I like comedies. They’re funny. But after they end, I get that creeping feeling of emptiness inside, like when you’ve eaten a bowl of leftover Halloween candy instead of lunch. There is no “meat” to these programs–or vegan-friendly vegetable protein, if you prefer.

I do enjoy documentaries, though I often end up fast-forwarding through large chunks of them because they are full of filler shots of rotating galaxies or astronomers parking their telescopes or people… taalkiiing… sooo… sloooowwwwlllly… And sadly, if you’ve seen one documentary about ancient Egypt, you’ve seen them all.

Ultimately, time is a big factor: I am always running short. Once I’m done with the non-negotiables (like “take care of the kids” and “pay the bills,”) there’s only so much time left, and time spent watching TV is time not spent writing. Since becoming a competent writer is one of my personal goals, TV gets punted to the bottom of the list, slightly below doing the dishes.

Obviously not everyone writes, but I have a dozen other backup projects for when I’m not writing, everything from “read more books” to “volunteer” to “exercise.”

I think it is a common fallacy to default to assuming that other people are like oneself. I default to assuming that other people are time-crunched, running on 8 shots of espresso and trying to cram in a little time to read Tolstoy and get the tomatoes planted before they fall asleep. (And I’m not even one of those Type-A people.)

Obviously everyone isn’t like me. They come home from work, take care of their kids, make dinner, and flip on the TV.

Why?

An acquaintance recently made a sad but illuminating comment regarding their favorite TV shows: “I know they’re not real, but it feels like they are. It’s like they’re my friends.”

I think the simple answer is that we process the pictures on the TV as though they were real. TV people look like people and sound like people, so who cares if they don’t smell like people? Under normal (pre-TV) circumstances, if you hung out with some friendly, laughing people every day in your living room, they were your family. You liked them, they liked you, and you were happy together.

Today, in our atomized world of single parents, only children, spinsters and eternal bachelors, what families do we have? Sure, we see endless quantities of people on our way to work, but we barely speak, nod, or glance at each other, encapsulated within our own cars or occupied with checking Facebook on our cellphones while the train rumbles on.

As our connections to other people have withered away, we’ve replaced them with fake ones.

Google “America’s Favorite Family“:

OZZIE & HARRIET: The Adventures of America’s Favorite Family

The Adventures of Ozzie and Harriet was the first and longest-running family situational comedy in television history. The Nelsons came to represent the idealized American family of the 1950s – where mom was a content homemaker, dad’s biggest decision was whether to give his sons the keys to the car, and the boys’ biggest problem was getting a date to the high school prom. …When it premiered, Ozzie & Harriet: The Adventures of America’s Favorite Family was the highest-rated documentary in A&E’s history.

(According to Wikipedia, Ozzie and Harriet started on the radio back in the 30s, got a comedy show (still on radio) in 1944, and were on TV from 1952-1966.) It was, to some extent, about a real family–the actors in the show were an actual husband and wife + their kids, but the show itself was fictionalized.

It even makes sense to ask people, “Who is your favorite TV personality?“–to which the most common answer isn’t Adam Savage or Jamie Hyneman, but Mark Harmon, who plays some made-up guy named Leroy Jethro Gibbs.

The rise of “reality TV” only makes the “people want to think of the TV people as real people they’re actually hanging out with” phenomenon all the more palpable–and then there’s the incessant newsstand harping of celebrity gossip. The only thing I want out of a movie star (besides talent) is that I not recognize them; it appears that the only thing everyone else wants is that they do recognize them.

According to The Way of the Blockbuster: In entertainment, big bets on likely winners rule:

in Blockbusters: Hit-Making, Risk-Taking, and the Big Business of Entertainment, the new book by Anita Elberse, Filene professor of business administration. Elberse (el-BER-see) spent 10 years interviewing and observing film, television, publishing, and sports executives to distill the most profitable strategy for these high-profile, unpredictable marketplaces. … The most profitable business strategy, she says, is not the “long tail,” but its converse: blockbusters like Star Wars, Avatar, Friends, the Harry Potter series, and sports superstars like Tom Brady.

Strategically, the blockbuster approach involves “making disproportionately big investments in a few products designed to appeal to mass audiences,” … “Production value” means star actors and special effects. … a studio can afford only a few “event movies” per year. But Horn’s big bets for Warner Brothers—the Harry Potter series, The Dark Knight, The Hangover and its sequel, Ocean’s Eleven and its two sequels, Sherlock Holmes—drew huge audiences. By 2011, Warner became the first movie studio to surpass $1 billion in domestic box-office receipts for 11 consecutive years. …

Jeff Zucker ’86 put a contrasting plan into place as CEO at NBC Universal. In 2007 he led a push to cut the television network’s programming costs: … Silverman began cutting back on expensive dramatic content, instead acquiring rights to more reasonably priced properties; eschewing star actors and prominent TV producers, who commanded hefty fees; and authorizing fewer costly pilots for new series. The result was that by 2010, NBC was no longer the top-rated TV network, but had fallen to fourth place behind ABC, CBS, and Fox, and “was farther behind on all the metrics that mattered,” writes Elberse, “including, by all accounts, the profit margins Zucker and Silverman had sought most.” Zucker was asked to leave his job in 2010. …

From a business perspective, “bankable” movies stars like Julia Roberts, Johnny Depp, or George Clooney function in much the way Harry Potter and Superman do: providing a known, well-liked persona.

So people like seeing familiar faces in their movies (except Oprah Winfrey, who is apparently not a draw:

the 1998 film Beloved, starring Oprah Winfrey, based on Nobel Prize-winner Toni Morrison’s eponymous 1987 novel and directed by Oscar-winner Jonathan Demme … flopped resoundingly: produced for $80 million, it sold only $23 million in tickets.

Or maybe Beloved just isn’t the kind of feel-good action flick that drives movie audiences the way Batman is.)

But what about sports?

Here I am on even shakier ground than sitcoms. I can understand playing sports–they’re live action versions of video games, after all. You get to move around, exercise, have fun with your friends, and triumphantly beat them at something. (Or if you’re me, lose.) I can understand cheering for your kids and being proud of them as they get better and better at some athletic skill (or at least try hard at it.)

I don’t understand caring about strangers playing a game.

I have no friends on the Yankees or the Mets, the Phillies or the Marlins. I’ve never met a member of the Alabama Crimson Tide or the Clemson Tigers, and I harbor no illusions that my children will ever play on such teams. I feel no loyalty to the athletes-drawn-from-all-over-the-country who play on my “hometown” team, and I consider athlete salaries vaguely obscene.

I find televised sports about as interesting as watching someone do math. If the point of the game is to win, then why not just watch a 5-minute summary at the end of the day of all the teams’ wins and losses?

But according to The Way of the Blockbuster:

Perhaps no entertainment realm takes greater care in building a brand name than professional sports: fan loyalty reliably builds repeat business. “The NFL is blockbuster content,” Elberse says. “It’s the most sought-after content we have in this country. Four of the five highest-rated television shows [in the United States] ever are Super Bowls. NFL fans spend an average of 9.5 hours per week on games and related content. That gives the league enormous power when it comes to negotiating contracts with television networks.”

Holy shit. No wonder Borders went under.

Elberse has studied American football and basketball and European soccer, and found that selling pro sports has much in common with selling movies, TV shows, or books. Look at the Real Madrid soccer club—the world’s richest, with annual revenues of $693 million and a valuation of $3.3 billion. Like Hollywood studios, Real Madrid attracts fan interest by engaging superstars—such as Cristiano Ronaldo, the Portuguese forward the club acquired from Manchester United for a record $131.6 million in 2009. “We think of ourselves as content producers,” a Real Madrid executive told Elberse, “and we think of our product—the match—as a movie.” As she puts it: “It might not have Tom Cruise in it, but they do have Cristiano Ronaldo starring.”

In America, sports stars are famous enough that even I know some of their names, like Peyton Manning, Serena Williams, and Michael Jordan.

I think the basic drive behind people’s love of TV sports is the same as their love of sitcoms (and dramas): they process it as real. And not just real, but as people they know: their family, their tribe. Those are their boys out there, battling for glory and victory against that other tribe’s boys. It’s vicarious warfare with pseudo armies, a domesticated expression of the tribal urge to slaughter your enemies, drive off their cattle and abduct their women. So what if the army isn’t “real,” if the heroes aren’t your brother or cousin but paid gladiators shipped in from thousands of miles away to perform for the masses? Your brain still interprets it as though it were; you still enjoy it.

Football is man-fiction.

Micro solar panels for Detroit?

We Americans like to think we live in a first world country, but there are plenty of areas–like inner cities or far rural regions–where the complex supply chains people take for granted in the suburbs (“Of course I can buy raspberries in January. Why wouldn’t I?”) don’t work or don’t exist.

For example, relatives of mine who live in a rural part of the country and therefore are not hooked up to a city water pipe are dependent on well water. But a recent drought dried up their wells, and they ended up with no running water for several years. Thankfully the drought ended and they now have water, but droughts recur; I would not be surprised if they ended up without water again sometime within the next couple of decades.

Likewise, there are people in Detroit who lack running water, though for very different reasons (my relatives were amply willing to pay for water if anyone would pipe it over to them.)

I was reading the other day about the difficulties surrounding gentrification. Basically, you start with an urban neighborhood that’s run down or perhaps has always been kind of shitty, and eventually someone clever realizes that there’s no sensible reason why this piece of urban real estate should sell for so much less than the urban real estate a few blocks over, and starts trying to fix things. So they buy up decrepit old buildings, clean them up, get new businesses to move into the area, and generally try to turn a profit–house flipping on the neighborhood scale. Of course, as soon as the neighborhood starts looking nicer and stops scaring people away, the rents go up and the original residents are forced out.

Which is a big win if you’re a developer, because those original residents were a large part of the reason why the neighborhood you’re trying to flip was so shitty in the first place, but kind of sucks if you are one of the people who can no longer afford rent. Which means, among other things, that you’ll often get local push-back against your gentrification schemes: (h/t Steve Sailer)

Hardline tactics succeed in keeping outsiders away from Boyle Heights, the Latino community that is the last holdout to Los Angeles gentrification.

A realtor who invited clients to tour the neighbourhood for bargain properties and enjoy “artisanal treats” felt the backlash within hours.

“I can’t help but hope that your 60-minute bike ride is a total disaster and that everyone who eats your artisanal treats pukes immediately,” said one message. “Stay outta my f****** hood,” said another.

Fearing violence, the realtor cancelled the event.

So you end up with a lot of articles about people who want to gentrify neighborhoods but swear up and down that they don’t want to drive out the local residents or destroy their lives, and some of these folks might actually be honest. But these goals are often incompatible: gentrification raises rents, which drives out the lowest classes of society.

As I see it, economically depressed areas, be they urban or rural, have one thing in common: low complexity. Rural areas have low complexity because that’s just a side effect of being far away from other people; urban areas end up with low complexity either because of shifts in economic production (eg, the death of American manufacturing leading to abandoned factories and unemployed people across the “rust belt,”) or because the folks in them can’t handle complexity.

Human society is complicated (and American society, doubly so.) Businesses don’t just get opened and people employed because someone wants to; there’s a whole lot of paperwork involved before anything gets done.

I am reminded here of a passage in Bourgois’s In Search of Respect: Selling Crack in El Barrio, where a Harlem drug dealer who wanted to go straight and get a legal job attempted to open a small food store, but got shut down because his bathroom was not wheelchair accessible. So the guy went back to selling crack.

On a similar note, when my relatives ran out of water, there existed an obvious technical fix: deepen the well. But drilling wells is neither cheap nor easy, if you lack the right tools, and beyond the average individual’s abilities. How lucky, I thought, that there exist many charities devoted to drilling wells for people! How unlucky, I discovered, that these charities only drill wells in the third world. I made some inquiries and received a disheartening response: the charities did not have the necessary paperwork filled out and permits granted to drill wells here in the US.

Much regulation exists not because it benefits anyone (trust me, a wheelchair-bound person is better off with a non-ADA-compliant food store in their neighborhood than a crack house,) but to shut down smaller businesses that cannot handle the cost of compliance.

In simple terms: More regulation => more suffering poor people.

Everyone has a maximum level of complexity they can personally handle; collectively, so do groups of people. Hunter-gatherer groups have very low levels of complexity; Tokyo has a very high level of complexity. When complexity falls in a neighborhood (say, because the local industries move out and rents fall and businesses close,) the residents with the most resources (internal and external) tend to move out, leaving the area to the least competent–greatly increasing the percentage of criminals, druggies, prostitutes, homeless, and other transients among folks just trying to survive.

Attempting to raise the level of complexity in such an area beyond what the local people can manage (or beyond what the environment itself can handle) just doesn’t work. Sure, from the developers’ POV, it’s no big deal if people leave, but from the national perspective, we’re just shifting problems around.

Obviously, if you care about poor people and want to do something to help them, step number one is to decrease regulations/paperwork. Unfortunately, I don’t have much hope of this short of a total societal breakdown and reset, so in the meanwhile, I got to thinking about these small-scale development projects people are trying in the third world, like micro-solar panels, composting toilets, or extremely cheap water pumps. Now, I agree that most of these articles are pie-in-the-sky, “This time we’re totally going to solve poverty for realsies, not like all of those other times!” claptrap. The problem with most of these projects is, of course, complexity. You install a water pump in some remote village, a part breaks, and now the villagers have no idea how to get a new part to fix it.

If you’ve read Josephine and Frederick’s account of their attempt to drive from Lubumbashi to Kinshasa–a distance of about a thousand miles, or 1,500 km–in the DRC, then you’ve probably noticed how much of the infrastructure in parts of the third world was built by the colonizers, and has degenerated since then due to lack of maintenance. These systems are too complex for the people using them, so they de-complexify until they aren’t.

So for third-world development schemes to work, they can’t be too complex. You can’t expect people to spend three weeks trekking through the bush to order parts in the nearest cities or to read thick manuals, and they certainly don’t have a lot of money to invest.

So when these projects are successful, we know they have managed to deal adequately with the complexity problem.

Micro solar panels, for example, might provide enough power to charge a cell phone or run an electric light for a few hours, and can be easily “installed” by clipping them onto the outside of a high-rise tenement window, where they are relatively safe from random thieves. For people who can’t afford electricity, or who have to choose between things like paying rent and having hot showers, such panels could make a difference.
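
That claim is easy to sanity-check with a little back-of-the-envelope arithmetic. Here is a minimal sketch in Python; every number in it (panel wattage, sun hours, battery size) is my own guess, not the spec of any actual product:

```python
# All numbers are guesses, not specs for any real product.

PANEL_WATTS = 10   # hypothetical small clip-on window panel
SUN_HOURS = 4      # assumed effective full-sun hours per day
EFFICIENCY = 0.7   # losses from angle, charge electronics, dirty glass

PHONE_WH = 12      # typical smartphone battery: ~3,000 mAh at ~3.8 V
LED_WATTS = 5      # one small LED bulb

daily_wh = PANEL_WATTS * SUN_HOURS * EFFICIENCY
led_hours = (daily_wh - PHONE_WH) / LED_WATTS

print(f"Daily harvest: {daily_wh:.0f} Wh")                     # 28 Wh
print(f"LED light after one phone charge: {led_hours:.1f} h")  # 3.2 h
```

So under these (invented) assumptions, one phone charge plus a few hours of light per day is plausible, which is roughly what the micro-panel articles promise.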

In rural areas with unreliable water supplies, cheap pumps could run water from local streams to toilets or filtration systems; composting toilets and the like provide low-water options.

Such projects need not be run as charities–in fact, they probably shouldn’t be; if a project increases peoples’ economic well-being, then they should be able to pay for it. If they can’t, then the project probably isn’t working. But they might require some kind of financing, as cost now, savings later is not a model most poor people can afford.
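
“Cost now, savings later” is also just arithmetic, and working it through shows why financing matters. Another hedged sketch, with prices invented purely to illustrate the cash-flow problem:

```python
# Hypothetical numbers only: a small panel kit vs. daily spending on
# phone-charging kiosks, candles, or kerosene.

PANEL_COST = 40.00     # assumed one-time price of the kit, USD
DAILY_SAVINGS = 0.25   # assumed avoided daily spending, USD

payback_days = PANEL_COST / DAILY_SAVINGS
print(f"Pays for itself in {payback_days:.0f} days")  # 160 days

# The catch: $40 up front may be impossible on a very tight budget, even
# though the purchase clearly pays off. Financing at, say, $0.15/day
# keeps the buyer cash-flow positive from day one:
FINANCE_PER_DAY = 0.15
print(f"Net daily gain while financing: ${DAILY_SAVINGS - FINANCE_PER_DAY:.2f}")
```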

Cathedral Round-Up #10

… as Vattimo suggests, the “accomplished nihilism of the real (Western) world gives us nothing substantial for our rhetorics except an insubstantial rhetoric. … I criticize intellectual practices that are too close to the narcissism of insiders, whose propositions and theories, despite their critical appearance, recode forms of stabilization; I seek instead to affirm the possibility of something like a nonrationalizing (counternarcissistic) intellectual endeavor.” –Sande Cohen, Academia and the Luster of Capital

Chances are you recall the uprisings on college campuses around the country last fall, sparked by the Yale Halloween Costume Email controversy and the Missouri protest. The protestors presented their respective colleges with Demands, largely centering on public apologies for past injustice, mandatory SJW-indoctrination for all students and faculty, and more money for minority teachers, staff, students, and programs.

So I wanted to check up on how colleges have responded. (List is not exhaustive; I have tried to focus on the most well-known institutions.)

Response to Amherst College Demands:

President Martin’s Statement on Campus Protests

On Thursday night I attended a student-organized protest against racism and other entrenched forms of prejudice and inequality. … Over the course of several days, a significant number of students have spoken eloquently and movingly about their experiences of racism and prejudice on and off campus.  The depth and intensity of their pain and exhaustion are evident. … It is good that our students have seized this opportunity to speak, rather than further internalizing the isolation and lack of caring they have described.  What we have heard requires a concerted, rigorous, and sustained response.

The organizers of the protests also presented me with a list of demands on Thursday evening.  While expressing support for their goals, I explained that the formulation of those demands assumed more authority and control than a president has or should have. … I explained that I did not intend to respond to the demands item by item, or to meet each demand as specified, but instead to write a statement that would be responsive to the spirit of what they are trying to achieve—systemic changes that we know we need to make. … I was asked to read this statement to students today in Frost Library and did so at noon.

Also:

• Trustees abandon Lord Jeffery Amherst, commander who endorsed plan to “extirpate” Indians with smallpox-laden blankets, as symbol and unofficial mascot of Amherst College. School name will remain.

Response to Boston College Demands:

… the university announced it would convene a university committee on race. The Undergraduate Government at Boston College set a January 19 deadline for the administration to release a plan to “create a more racially inclusive campus,” but the administration missed the deadline and didn’t release any statement as to when an action plan would be released. …

Statements from Boston College spokesman Jack Dunn suggest there isn’t any problem that needs to be addressed. In November, Dunn stated, “The supposition that BC is an institutionally racist place is a difficult argument to make … I think that’s a false assumption, an unfair assumption, and impugns the integrity of so many good people on this campus who’ve joined this community precisely because they’re people of good will who oppose all elements of bigotry,” according to an article in the college’s independent newspaper, The Heights.

Response to Brandeis University Demands:

Acting Brandeis University President Lisa M. Lynch is pushing for changes she hopes will increase diversity in the student body and staff — but she won’t do it on a timetable set by student protesters.

Lynch, with the backing of the Waltham school’s board of trustees, sent a multipage letter to the campus community this weekend after meeting with students who have occupied the Bernstein-Marcus Administrative Center — which includes Lynch’s office. …

“The atmosphere described by our students is painful to hear and calls on all of us to address these issues,’’ Lynch wrote. In her letter, Lynch aligned herself broadly with the goal of increasing diversity at all levels of the university …

Also:

• After a 12-day sit-in, Brandeis commits to increasing applicants of color (now 17 percent) by 5 to 10 percentage points annually and to double underrepresented faculty members (5 percent in 2014) by 2021.

See also: Reaffirming and Accelerating Brandeis’ Commitment to Diversity, Inclusion, and Racial Justice and Statements of Support and Commitments to Action to Advance Diversity and Inclusion at Brandeis University by Department, School, and Program

Response to Brown University Demands:

On Monday, Nov. 16, … Concerned Graduate Students of Color at Brown University came together to publish a list of demands and request a written response from the administration within one week. The working draft of the Diversity and Inclusion Action Plan (DIAP) was released by President Christina Paxson’s office on Nov. 19, 2015. … We, Graduate Students of Color, reject this plan as a response to our demands.

The anticipated 10-year, $100 million investment in diversity and inclusion sounds impressive, but note that this is a mere 3 percent of Paxson’s new $3 billion Brown Together capital campaign.  …

See also Brown U releases $100 million plan to increase inclusivity; plan later increased to $165 million.

Also:

• Brown faculty vote on Feb. 2 that Columbus Day will be known as Indigenous People’s Day, prompted by students objecting: “We don’t celebrate genocide.”

Response to Claremont McKenna College Demands:

• Mary Spellman, dean of students at Claremont McKenna College in California, steps down after making a statement about students not fitting “our C.M.C. mold.”

Response to Dartmouth College Demands: (warning PDF)

… we write as members of the senior leadership of the College and people who care deeply about Dartmouth. We want to share a message with the community: we hear your concerns about ensuring that Dartmouth is not only diverse in numbers, but also a place where all community members thrive. …

We couldn’t agree with you more. Diversity is one of the cornerstones of our academic community and, like you, we want Dartmouth to be a campus where our students gain the confidence and skills to work and lead in a global society. … Recently, a presentation of the “Freedom Budget” document highlighted for us that we, as the administration, must engage the campus more effectively in current and future action to achieve our shared vision for Dartmouth …

  • More than $30 million will be invested in the Society of Fellows program to bring recent post-doctorates to campus. Post-doctoral programs have been an effective tool for recruiting diverse faculty from other campuses. …
  • The E.E. Just Program, which supports the academic success of under-represented students in the science, technology, engineering, and mathematics (STEM) fields, will undergo a major expansion.
  • The Office of the President is sponsoring a three-year program to help make Dartmouth Outing Club activities accessible to students receiving financial aid.
  • Dartmouth will provide $1 million in recurring funds to support the cost of hiring faculty who bring diverse perspectives to campus.

We can and will do more.

Response to Duke U Demands: (also PDF)

In response to student demands presented at the Duke Tomorrow forum Nov. 20, President Richard Brodhead sent an email last Tuesday to the students who organized the forum assuring them of his commitment to deal with the concerns they raised. … Brodhead’s email noted that the Task Force on Bias and Hate Issues will be responsible for considering many of the demands presented. He added that orientation programs and faculty diversity efforts—which were also included in the demands—are already in place.

“We look forward to working with all members of the Duke community to make the University a better place,” Brodhead wrote in the email.

Response to Emory Demands:

• Emory promises task force to “examine the feasibility of a geofence” to block the social media app Yik Yak in university ZIP codes to protect African-American students from what a black student group calls “intolerable and psychologically detrimental material.”

Response to Georgetown Demands:

• Mulledy Hall and McSherry Hall — named for Georgetown presidents who organized the sale of 272 slaves to settle university debts — are renamed. Students further demand the creation of an endowment, at the current value of the sale’s profit, to recruit “black identifying” professors.

Response to Harvard U Demands:

HLS seal re-designed by “Reclaiming HLS”

[Minow] has already taken several steps to respond to some of the student demands and formulated her own plans to improve race relations at the schools. She has appointed a committee to consider changing the school’s seal, which she said last Monday would require the Harvard Corporation’s approval; administrators have also said they will work to create a more diverse faculty and hire a staff member to focus on diversity issues. …

On Friday, however, Minow primarily watched and listened as students spoke. “Thinking, listening, thank you,” she said, after Leland S. Shelton, the president of the Harvard Black Law Students Association, reiterated each demand and asked if she was prepared to immediately agree to any of them. …

In an email sent to Law School affiliates on Friday, Minow wrote that she will carefully consider the student demands.

“I listened carefully,” Minow wrote. “I will do my best to ensure that we find ways to work together, joining students, staff, and faculty to address proposals and above all to strengthen this School and its possibilities to be better and to make the world better.”

Also:

College officials released the working group’s report Thursday. It included recommendations to diversify the College, and to support affinity-based student groups on campus and in multicultural centers, among others.

Harvard Law School has decided to officially change the seal, though I don’t know yet to what.

Response to Ithaca College Demands:

Thomas R. Rochon, president of Ithaca College, pens an opinion piece asserting college presidents should step up, not down; in January, he announces he will step down, effective next year.

Response to Johns Hopkins Demands: (also)

Called on to address the students’ demands, Daniels pointed to the new Faculty Diversity Initiative, a multimillion-dollar effort designed to help each of the university’s divisions find, attract, and retain the most talented faculty representing a broad diversity of backgrounds and experiences. The effort, unveiled earlier Monday, has been in the works for more than a year.

In response to a question suggesting that the initiative could lead to more qualified candidates being passed over, Provost Robert C. Lieberman said: “I would very, very strongly resist the premise of your question, which is that sometimes diversity and excellence or standards are opposed to each other. They in fact reinforce each other, and we will only be excellent to the extent that we are diverse.”

Daniels pledged transparency on the topic in the form of a report on the composition of the faculty, to be issued every two years. He also announced plans to strengthen the university’s Center for Africana Studies with the addition of five new faculty members—two in the center, two in the Department of History, and one interdisciplinary scholar.

One of the students’ requests was for a mandatory cultural competency course for all undergraduates. Daniels said that a single course required for all students “goes against the grain of choice that is embedded in our curriculum,” but that “other approaches to that issue are on the table.” He said possibilities open to discussion include establishment of a distribution requirement, mandating that students choose from among a set of courses in which cultural differences are considered.

Daniels also backed establishing a comprehensive diversity training program for the faculty, staff, and all students. A pilot training program was implemented at student orientation this past fall, and a working group to develop training recommendations will be launched by the start of the spring semester.

Response to Missouri State U Demands (not to be confused with U Missouri):

The joint statement from MSU president Clif Smart and Board of Governors Chair Stephen Hoven explained ongoing efforts to increase diversity and inclusion, announced plans to expand multicultural programming and outlined numerous ways students can help shape decisions.

“Recently, a group of students took the time and initiative to remind us of our responsibility and commitment to provide you with an inclusive environment that fosters learning, growth and opportunity. Pointing to the ongoing challenges that our nation continues to face in terms of diversity and inclusion, these students have presented important questions, made requests, and asked that we stop what we are doing to listen and respond,” …  “We have stopped, we are listening and we offer this letter in another effort to address those concerns.” …

MSU officials, in the Tuesday statement, noted that improving diversity and inclusion has been a top priority in recent years and said that commitment will continue with three overarching goals:

• Expand diversity programs

• Increase enrollment and retention of diverse students from “underrepresented” backgrounds

• Expand the pool of diverse faculty and staff

The president of MSU is un-ironically named Clifton Smart III.

Response to NYU Demands:

???

Response to Oberlin Demands: (PDF)

• Oberlin dining services promises “culturally sensitive menus” after demands for more traditional foods, including fried chicken, at Afrikan Heritage House, and for more indigenous versions of General Tso’s chicken and banh mi. Oberlin president finds 14 pages of other “demands and not suggestions” (e.g., eliminate Western-centered course requirements) even less palatable. In January, he announces he won’t respond to them.

Damned if you do, damned if you don’t: one of the complaints protesters lodged against UC Irvine:

a. In 2011, to begin the Cross Cultural Center’s 28th annual Martin Luther King Jr. symposium, UCI’s Hospitality and Dining services served fried chicken and waffles in “honor” of the event.

I don’t think there’s any agreement on whether serving fried chicken is “culturally sensitive” or “horribly racist”–which I find especially weird because everyone in the South, white and black, eats fried chicken. Also, BBQ is totally better than fried chicken.

Response to Princeton Demands:

Last week, the president of Princeton University agreed to implement or consider the demands of student protesters who had taken over his office, including providing black students a cultural space on the Ivy League campus and initiating discussions about “cultural competency” training. Christopher Eisgruber also agreed to open a debate about Woodrow Wilson’s legacy at Princeton. …

Cecilia Rouse, the dean of the Woodrow Wilson School of Public and International Affairs, welcomes the discussion. Rouse agrees that changing a name would be an easy thing to do, and that much more difficult challenges remain, such as how to develop a curriculum that is less focused on Europe, how to have course readings that are more reflective of the world, and how to ensure that faculty are comfortable talking about race.

Response to Tufts U Demands:

???

Response to UCLA Demands:

After heads rolled over a Kanye-Western themed frat party at the University of California, Los Angeles (UCLA), several adjustments to the UCLA campus climate have been made, including suspension of the social groups that hosted the party for alleged “racist undertones” of their event.  …

On October 22, UCLA’s vice chancellor Janina Montero responded … that she is open to many of the ASU’s demands, including exclusive funding for the ASU, revision of the school’s anti-discrimination policies, an “Afro-house” for black students, a student advisory board for campus diversity, increased enrollment of black students, and creation of a Black Student Leadership Task Force. She also said that the chancellor has collaborated with the LAUSD to build the Horace Mann UCLA Community School in South Los Angeles.

Response to U of Kentucky Demands:

Each time our student passes the images on his way to class or a movie or a speaker, this student — one of us — must confront humiliating images that bear witness to how we still fall short of being citizens together in what Dr. King called the “beloved community.” And countless other current students, faculty, staff, prospective students and their families, and other visitors to our campus, endure the same pain when they walk into one of our University’s signature and busiest venues. Moreover, this is often the first exposure people have to our campus, our culture, and our values.

This cannot continue. In spite of the artist’s admirable, finely honed skill that gave life to the mural, we cannot allow it to stand alone, unanswered by and unaccountable to the evolutionary trajectory of our human understanding and our human spirit.

Before:

After: 

Both photos Credit Mark Cornelison/Lexington Herald-Leader

Response to U of Missouri Demands:

• Charged with a sluggish response to racist incidents, Timothy M. Wolfe and R. Bowen Loftin, top University of Missouri officials, cave when football players threaten to strike, raising the specter of a forfeit penalty of more than $1 million.

Response to Yale U Demands:

Yale President Plans ‘Significant Changes’ In Response To Student Demands

Declaring that there is still much “unfinished work,” Yale University President Peter Salovey Tuesday offered a detailed response to student demands in the wake of rising racial tensions on campus.

Salovey, under intense pressure from the Yale community, proposed “a structure to build a more inclusive Yale” that would add faculty, multicultural training for staff, expanded resources for cultural centers, enhanced financial aid for low-income students and creation of a “prominent university center” to address issues of race, ethnicity and social identity. He said these are “the central issues of our era.”

“I have heard the expressions of those who do not feel fully included at Yale, many of whom have described experiences of isolation, and even of hostility, during their time here,” Salovey said.

It is just so HAAAARD to be a student at Yale. WAH.

Also:

• Erika Christakis quits teaching at Yale, citing lack of “civil dialogue and open inquiry” after a brouhaha over her criticism of university guidelines on culturally sensitive Halloween costumes. …

• Yale promises to devote $50 million in resources over five years for faculty members “who would enrich diversity” (currently 6 percent are underrepresented minorities). …

(In the interim, three portraits of Calhoun are removed from the college.)

At Yale, a stained-glass window depicting John C. Calhoun has been altered to remove the image of a chained slave. Credit Andrew Sullivan for The New York Times

Finally:

• Harvard and Princeton drop the title of “master” — term dating to medieval universities — for heads of residential colleges; Yale is mulling the same.

The award for shortest list of demands goes to Ithaca College:

The resignation of College President Tom Rochon or for him to be removed from his position.

The award for longest list goes to UVA, which, at 6259 words, was twice as long as the second-longest list, and included demands such as:

Posters in First-Year dorms and on Stall Seat Journals, and other educational, promotional tools should focus on prejudice and oppression, and should offer examples of implicit biases in student-to-student, faculty-to-student, and student-to-Charlottesville resident interactions. Student-run University agencies such as The Honor Committee and The Student Council should prioritize the creation of initiatives aimed towards engaging the student body in conversations surrounding race and inclusivity as elements of our University ideals. …

Students of the University of Virginia must be knowledgeable and conscious about the history of racial oppression and discrimination in the current and historic U.Va. and Charlottesville communities. …

[A mandatory course on the history of UVA] …

Every course should strive to recognize minority perspectives and every department should make it a goal to offer multiple courses that include or focus on minority perspectives within their field each semester. For example, Biology could study genetics across minority communities, …

O RLY.

Well, at least I got a good laugh out of this one.

The hominin braid

Much has been said ’round the HBD-osphere, lately, on the age of the Pygmy (and Bushmen?)/everyone else split. Greg Cochran of West Hunter, for example, supports a split around 300,000 years ago–100,000 years before the supposed emergence of “anatomically modern humans” aka AMH aka Homo sapiens sapiens:

A number of varieties of Homo are grouped into the broad category of archaic humans in the period beginning 500,000 years ago (or 500ka). It typically includes Homo neanderthalensis (40ka-300ka), Homo rhodesiensis (125ka-300ka), Homo heidelbergensis (200ka-600ka), and may also include Homo antecessor (800ka-1200ka).[1] This category is contrasted with anatomically modern humans, which include Homo sapiens sapiens and Homo sapiens idaltu. (source)

According to genetic and fossil evidence, archaic Homo sapiens evolved to anatomically modern humans solely in Africa, between 200,000 and 100,000 years ago, with members of one branch leaving Africa by 60,000 years ago and over time replacing earlier human populations such as Neanderthals and Homo erectus. (source)

The last steps taken by the anatomically modern humans before becoming the current Homo sapiens, known as “behaviourally modern humans“, were taken either abruptly circa 40-50,000 years ago,[11] or gradually, and led to the achievement of a suite of behavioral and cognitive traits that distinguishes us from merely anatomically modern humans, hominins, and other primates. (source)

Cochran argues:

They’ve managed to sequence a bit of autosomal DNA from the Atapuerca skeletons, about 430,000 years old, confirming that they are on the Neanderthal branch.

Among other things, this supports the slow mutation rate, one compatible with what we see in modern family trios, but also with the fossil record.

This means that the Pygmies, and probably the Bushmen also, split off from the rest of the human race about 300,000 years ago. Call them Paleoafricans.
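
Why does a slower mutation rate push the split date back? Because a split date is essentially observed genetic divergence divided by the rate at which differences accumulate: d ≈ 2μt, so t ≈ d/(2μ), and halving μ doubles t. A toy calculation in Python (the divergence figure is a placeholder I made up; the two rates are roughly the older “fast” phylogenetic estimate and the newer “slow” pedigree-based estimate):

```python
# Expected pairwise sequence divergence after a clean split:
#   d = 2 * mu * t   =>   t = d / (2 * mu)
# Halving the mutation rate therefore doubles the inferred split date.

d = 3e-4          # assumed autosomal divergence per site -- a placeholder!
MU_FAST = 1.0e-9  # older "fast" phylogenetic rate, per site per year
MU_SLOW = 0.5e-9  # newer "slow" pedigree-trio rate, per site per year

for label, mu in [("fast", MU_FAST), ("slow", MU_SLOW)]:
    t_years = d / (2 * mu)
    print(f"{label} rate: split ~{t_years:,.0f} years ago")

# fast rate: split ~150,000 years ago
# slow rate: split ~300,000 years ago
```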

Personally, I don’t think the Pygmies are that old. Why? Call it intuition; it just seems more likely that they aren’t. Of course, there are a lot of guys out there whose intuition told them those rocks couldn’t possibly be more than 6,000 years old; I recognize that intuition isn’t always a great guide. It’s just the one I’ve got.

(Actually, my intuition is based partially on my potentially flawed understanding of Haak’s graph, which I read as indicating that the Pygmies split off quite recently.)

The thing about speciation (especially of extinct species we know only from their bones) is that it is not really as exact as we’d like it to be. A lot of people think the standard is “can these animals interbreed?” but dogs, coyotes, and wolves can all interbreed. Humans and Neanderthals interbred; the African forest elephant and African bush elephant were long thought to be the same species because they interbreed in zoos, but have been re-categorized into separate species because in the wild, their ranges don’t overlap and so they wouldn’t interbreed without humans moving them around. And now they’re telling us that the Brontosaurus was a valid genus after all, but Pluto still isn’t a planet.

This is a tree

The distinction between archaic Homo sapiens and Homo sapiens sapiens is based partly on morphology (look at those brow ridges!) and partly on the urge to draw a line somewhere. If HSS could interbreed with Neanderthals, from whom they were separated by a good 500,000 years, there’s no doubt we moderns could interbreed with AHS from 200,000 years ago. (There’d be a fertility hit, just as pairings between disparate groups of modern HSS take fertility hits, but probably nothing too major–probably not as bad as an Rh- woman x Rh+ man, which we consider normal.)

bones sorted by time

So I don’t think Cochran is being unreasonable. It’s just not what my gut instinct tells me. I’ll be happy to admit I was wrong if I am.

The dominant model of human (and other) evolution has long been the tree (just as we model our own families.) Trees are easy to draw and easy to understand. The only drawback is that it’s not always clear exactly where a particular skull should be placed on our trees, or whether the skull we have is even representative of its species–the first Neanderthal bones we uncovered actually hailed from an individual who had suffered from arthritis, resulting in decades of misunderstanding of Neanderthal morphology. (Consider, for sympathy, the difficulties of an alien anthropologist handed a modern Pygmy skeleton, 4’11”, and a Dinka skeleton, 5’11”, and asked to sort them by species.)

blob chart

What we really have are a bunch of bones, and we try to sort them out by time and place, and see if we can figure out which ones belong to separate species. We do our best given what we have, but it’d be easier if we had a few thousand more ancient hominin bones.

The fact that different “species” can interbreed complicates the tree model, because branches do not normally split off and then fuse with other branches, at least not on real trees. These days, it’s looking more like a lattice model–but this probably overstates the amount of crossing. Aboriginal Australians, for example, were almost completely isolated for about 40,000 years, with (IIRC) only one known instance of genetic introgression that happened about 11,000 years ago when some folks from India washed up on the northern shore. The Native Americans haven’t been as isolated, because there appear to have been multiple waves of people that crossed the Bering Strait or otherwise made it into the Americas, but we are still probably talking about only a handful of groups over the course of 40,000 years.

Trellis model

Still, the mixing is there; as our ability to suss out genetic differences becomes better, we’re likely to keep turning up new instances.
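
One way to see the difference between the models is as data structures: a tree gives each population exactly one parent, while a lattice (an admixture graph) lets a population have more than one. A toy sketch in Python; the populations are real, but the mixture weights are invented for illustration:

```python
# A tree: every population has exactly one parent lineage.
tree = {
    "Out-of-Africa": ["African root"],
    "Aboriginal Australians": ["Out-of-Africa"],
    "Europeans": ["Out-of-Africa"],
}

# A lattice/admixture graph: a population may have several parents,
# each with a mixture weight. (Weights here are invented, not estimates.)
admixture_graph = {
    "Out-of-Africa": [("African root", 1.00)],
    "Aboriginal Australians": [("Out-of-Africa", 0.95),
                               ("India, c. 11kya", 0.05)],
    "Europeans": [("Out-of-Africa", 1.00)],
}

def is_tree(genealogy):
    """True iff no node has more than one parent, i.e. no admixture."""
    return all(len(parents) <= 1 for parents in genealogy.values())

print(is_tree(tree))   # True
print(is_tree({k: [p for p, _ in v]
               for k, v in admixture_graph.items()}))  # False
```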

So what happens when we get deep into the 200,000-year origins of humanity? I suspect–though I could be completely wrong!–that things near the origins get murkier, not clearer. The tree model suggests that the original group of hominins at the base of the “human” tree would be less genetically diverse than the scattered spectrum of humanity we have today, but these folks may have had a great deal of genetic diversity among themselves due to having recently mated with other human species (many of which we haven’t even found, yet.) And those species themselves had crossed with other species. For example, we know that Melanesians have a decent chunk of Denisovan DNA (and almost no one outside of Melanesia has this, with a few exceptions,) and the Denisovans show evidence that they had even older DNA introgressed from a previous hominin species they had mated with. So you can imagine the many layers of introgression you could get with a part-Melanesian person carrying some Denisovan DNA, which itself carries some of that older DNA… As we look back in time toward our own origins, we may similarly see a great variety of very disparate DNA that has, in essence, hitch-hiked down the years from older species, but has nothing to do with the timing of the split of modern groups.

As always, I am speculating.