Albion’s Seed and discrete vs. overlapping groups

Scott Alexander (of Slate Star Codex) recently posted an entertaining review of David Fischer’s Albion’s Seed, basically the longer version of Woodard’s American Nations, which ended, somewhat amusingly, with Scott realizing that maybe creating a democracy with a bunch of people whose political ideas you find morally repugnant isn’t a good idea.

A few notes:

1. I wouldn’t be surprised if Puritan names like “Maybe” or “Notwithstanding” weren’t so much random words from the Bible as the first words of favorite verses or parts of verses, assigned so that the children’s names together formed the complete line (see the Quakers for this sort of name).

2. The lack of farmers among early Puritan stock might explain why they nearly all starved to death the first couple of years.

3. When people talk about the Cavaliers who settled the Deep South, they all seem to note that of course the underclass of society was not Cavaliers, but then kind of gloss over where the British underclass came from. Most of them, I suspect, were Borderers or their near-equivalents from other parts of the isle, such as thieves and the urban underclass.

I think people tend to imagine these groups (Puritans, Quakers, Borderers, and Cavaliers) as regionally distinct, but most of the time I think we are looking at layers that overlap multiple regions in varying thicknesses. The Borderers, for example, spread across the Deep South, Florida, Texas, the Mountain West, California, Quakerdom, and probably even New England (though the harsh New England climate was probably not as kind to them). But the trajectory of the Deep South was shaped more by its Cavalier overclass with its African slaves (thus inspiring the Civil War) than by its Borderer underclass. Appalachia, by contrast, was not suited to plantations, so the Cavaliers never settled there in great numbers, and the Borderers are thus a much larger % of the overall society.

So when people ask why Appalachia tends to vote in line with the Deep South, despite these supposedly being two separate groups, I think they are just missing that the majority of whites in the Deep South and Appalachia come from the same or very similar groups of people. The Cavalier overclass was never more than a small % of the Deep South’s population, and obviously blacks vote Democrat.

Also, the Civil War seems to have left a long-term impact on people’s loyalties, where people who strike me as “pretty conservative” but hail from Massachusetts still vote Democrat because they perceive Republicans as the party of those Confederate-flag-waving bigots down in the South.

Yay tribalism leads to rational, optimal political outcomes!

4. Scott does not note that the reason the white Cavalier underclass became “sluggish and indolent” was massive rates of hookworm infection. IIRC, around 1910, de-worming campaigns found that about 25% of Southern children were already infected; who knows what the % was among adults.

Hookworms are intestinal parasites that came over from Africa (with the slaves) and are spread by stepping barefoot into human feces crawling with parasite larvae.

Life before flush toilets was thoroughly disgusting.

Anyway, bad enough that the poor slaves had parasites, but the whites hadn’t even had thousands of years to adapt to them, leaving them especially susceptible. The parasites cause anemia, which causes people to act “sluggish and indolent.”

Things got better when they introduced “shoes” to the South.

5. I suspect the disappearance of the Quakers happened not because they “tolerated themselves out of existence” (or not just because) but because they had fewer children than everyone else around them. Plenty of immigrants have arrived, after all, in virtually all parts of the US, but Quakers today are rarer than hen’s teeth. Compare the 16% Quaker female non-marriage rate to the near 100% Puritan marriage rate. The Quakers also spawned the Shakers, who abstained from marriage (and having children) altogether.

Of course, this may represent a failure to reproduce their religion rather than their genetics–Quakers resemble “normal people” closely enough that their children may have simply felt that it was unnecessary to attach a religious label to it.

6. Quakers may represent the “normal” position in American politics today in part because they were in the middle of the country, both physically and ideologically. People might not want a country dominated by some group from the extreme end of the geography, but perhaps we can be comfortable with the folks from right in the middle.

7. “It occurs to me that William Penn might be literally the single most successful person in history.”

I raise you a Jesus, Mohammad, Genghis Khan, Karl Marx, and Gautama Buddha.

8. While it is true that the Southern Baptist denomination absolutely dominates the entire country south of the Mason-Dixon line, it is slightly less popular in Appalachia than in the Deep South. I think the interesting thing about Borderer religion is the popularity of Pentecostal and Charismatic denominations, which are rarer in the rest of the country.

9. Children physically attacking the school teacher or otherwise preventing the school from operating did not just happen in Borderer regions; it is a major theme in the early chapters of Laura Ingalls Wilder’s Farmer Boy, set in upstate New York. And as reader Psmith noted back on my review of Lenski’s Strawberry Girl:

“Was beating up the teacher some kind of regular thing?”
If we take the song lyrics at face value, seems likely: http://slatestarcodex.com/2014/10/02/simpler-times/

Probably the best-recorded incident of this sort, and possibly the original source for all the songs (see the stuff about making a bonfire of the desks), took place at Rugby School in 1797 when the students mutinied and blew down the headmaster’s door with gunpowder, stopped in the end only by a band of special constables armed with swords. (https://www.archive.org/stream/historyofrugbysc00rousuoft/historyofrugbysc00rousuoft_djvu.txt, ctrl+f great rebellion)

From Scott’s post Psmith linked:

To the tune of “Oh My Darling Clementine”:

Build a bonfire out of schoolbooks,
Put the teacher on the top,
Put the prefects in the middle
And we’ll burn the bloody lot.

To the tune of “Deck The Halls”:

Deck the halls with gasoline
fa la la la la la la la la
Light a match and watch it gleam
fa la la la la la la la la
Watch the school burn down to ashes
Fa la la la la la la la la …

To the tune of “On Top Of Old Smokey”:

On top of old smokey
All covered in blood
I shot my poor teacher
with a .44 slug

Unlike Scott, I do remember hearing these sung by my classmates.

I did not enjoy being forced to attend school with those sorts of boys.

10. I have a lot of abstract appreciation for Borderer ideals of liberty, which are pretty much my symbolic idea of “what it means to be an American.” I also have a lot of sympathy for people who just want to go off in the woods and be left alone and not deal with interfering busy-bodies. I don’t know how well I’d actually get along in their society, though.

11. Scott remarks on the close parallels between the traits he’d already observed and attributed to the “Red Tribe” and “Blue Tribe” and the traits Fischer ascribes to the original settlers of these regions, taking this as a point potentially in Fischer’s favor. I propose, however, a caution: Fischer himself is undoubtedly familiar with modern America and the relevant Republican/Democrat cultural divide, and may have–consciously or unconsciously–sought out evidence and presented it so as to make the colonists resemble their descendants.

12. One of the… interesting aspects of the generalized orthosphere, including much of NRx, is that among American examples, Moldbuggian neocameralism most closely resembles (IMO) the “dystopian” Puritan bargain. The Puritan colonies were corporations owned by shareholders in which temporal and spiritual power were unified, only people who fit in culturally and were sufficiently intelligent were allowed in, and folks who wanted to leave were allowed to do so–the breaking off of Rhode Island as its own colony is a strong precursor for the concepts of patchwork and exit.

Of course, the Puritans still voted, as shareholders must–as long as your king is beholden to shareholders, they will vote. (And in any community where the population density is low enough that each man can be sovereign of his own individual domain, collective decisions are liable to entail, by necessity, a certain amount of consensus.)

All of this is grafted onto a group of people who seem to favor the ideals of the Cavalier planter class, while claiming that the Puritans–wielding Quaker ideas–destroyed the moral basis of the formerly functional Borderer society. (Similar arguments are made that liberals have destroyed the moral basis of black society.)

This is not the first time I’ve noticed something like this–the dominant religion of the Deep South (the Cavalier zone), Southern Baptism, does not resemble the beliefs put forth by deists like Thomas Jefferson, but good ol’ fashioned Puritanism. How exactly the Puritans converted to Unitarian Universalism while the Cavaliers and Borderers converted to Puritanism is a puzzle–unless this is just an artifact of Southern religion changing more slowly than Northern religion and so retaining more of its original character, which was closer to Puritanism in the 1600s than Puritanism is to its own modern descendants, much as Icelandic has changed more slowly than the other Scandinavian languages, allowing speakers of modern Icelandic to read archaic Norse texts that are unintelligible to speakers of the other modern Scandinavian languages.

I read a book and it’s Friday: Homicide, by Daly and Wilson

Today’s selection, Homicide, is ev psych with a side of anthropology; I am excerpting the chapter on people-who-murder-children. (You are officially forewarned.)

Way back in middle school, I happened across (I forget how) my first university-level textbook, on historical European families and family law. I got through the chapter on infanticide before giving up, horrified that enough Germans were smushing their infants under mattresses or tossing them into the family hearth that the Holy Roman Empire needed laws specifically on the subject.

It was a disillusioning moment.

Daly and Wilson’s Homicide, 1988, contributes some (slightly) more recent data to the subject (though of course it would be nice to have even more recent data).

[Charts from Daly and Wilson: child homicide victims by age]

(I think some of the oddities in # of incidents per year may be due to ages being estimated when the child’s true age isn’t known, eg, “headless torso of a boy about 6 years old found floating in the Thames.”)

We begin with a conversation on the subject of which child parents would favor in an emergency:

If parental motives are such as to promote the parent’s own fitness, then we should expect that parents will often be inclined to act so that neither sibling’s interests prevail completely. Typically, parental imposition of equity will involve supporting the younger, weaker competitor, even when the parent would favor the older if forced to choose between the two. It is this latter sort of situation–“Which do you save when one must be sacrificed?”–in which parents’ differential valuation of their children really comes to the fore. Recall that there were 11 societies in the ethnographic review of Chapter 3 for which it was reported that a newborn might be killed if the birth interval were too short or the brood too numerous. It should come as no surprise that there were no societies in which the prescribed solution to such a dilemma was said to be the death of an older child. … this reaction merely illustrates that one takes for granted the phenomenon under discussion, namely the gradual deepening of parental commitment and love.

*Thinks about question for a while* *flails* “BUT MY CHILDREN ARE ALL WONDERFUL HOW COULD I CHOOSE?” *flails some more*

That said, I think there’s an alternative possibility besides just affection growing over time: the eldest child has already proven their ability to survive; an infant has not. The harsher the conditions of life (and thus, the more likelihood of actually facing a real situation in which you genuinely don’t have enough food for all of your children,) the higher the infant mortality rate. The eldest children have already run the infant mortality gauntlet and so are reasonably likely to make it to adulthood; the infants still stand a high chance of dying. Sacrificing the child you know is healthy and strong for the one with a high chance of dying is just stupid.

Whereas infant mortality is not one of my personal concerns.

Figure 4.4 shows that the risk of parental homicide is indeed a declining function of the child’s age. As we would anticipate, the most dramatic decrease occurs between infants and 1-year-old children. One reason for expecting this is that the lion’s share of the prepubertal increase in reproductive value in natural environments occurs within the first year.

(I think “prepubertal increase in reproductive value” means “decreased likelihood of dying.”)

Moreover, if parental disinclination reflects any sort of assessment of the child’s quality or the mother’s situation, then an evolved assessment mechanism should be such as to terminate any hopeless reproductive episode as early as possible, rather than to squander parental effort in an enterprise that will eventually be abandoned. … Mothers killed 61 in the first 6 months compared to just 27 in the second 6 months. For fathers, the corresponding numbers are 24 vs. 14. [See figure 4.4] … This pattern of victimization contrasts dramatically with the risk of homicide at the hands of nonrelatives (Figure 4.5)…

I would like to propose an alternative possibility: just as a child who attempts to drive a car is much more likely to crash immediately than to successfully navigate onto the highway and then crash, so a murderous person who gets their hands onto a child is more likely to kill it immediately than to wait a few years.

A similar mechanism may be at play in the apparent increase and then decrease in homicides of children by nonrelatives during toddlerhood. Without knowing anything about these cases, I can only speculate, but 1-4 are the ages when children are most commonly put into daycares or left with sitters while their moms return to work. The homicidally-minded among these caretakers, then, are likely to kill their charges sooner rather than later. (School-aged children, by contrast, are both better at running away from attackers and highly unlikely to be killed by their teachers.)

Teenagers are highly conflictual creatures, and the rate at which nonrelatives kill them explodes after puberty. When we consider the conspicuous, tempestuous conflicts that occur between teenagers and their parents–conflicts that apparently dwarf those of the preadolescent period–it is all the more remarkable that the risk of parental homicide continues its relentless decline to near zero.

… When mothers killed infants, the victims had been born to them at a mean age of 22.7 years, whereas older victims had been born at a mean maternal age of 24.5. This is a significant difference, but both means are significantly below the 25.8 years that was the average age of all new Canadian mothers during the same period, according to Canadian Vital Statistics.

In other words, impulsive fuckups who get accidentally pregnant are likely to be violent impulsive fuckups.

We find a similar result with respect to marital status: Mothers who killed older children are again intermediate between infanticidal women and the population-at-large. Whereas 51% of mothers committing infanticide were unmarried, the same was true of just 34% of those killing older children. This is still substantially above the 12% of Canadian births in which the new mother was unmarried …

Killing of an older child is often associated with maternal depression. Of the 95 mothers who killed a child beyond its infancy, 15.8% also committed suicide. … By contrast, only 2 of 88 infanticidal mothers committed suicide (and even this meager 2.3% probably overestimates the association of infanticide with suicide, since infanticides are the only category of homicides in which a significant incidence of undetected cases is likely.) … one of these 2 killed three older children as well.

Anyone else thinking of Andrea Yates and her idiot husband?

In the Canadian data, it is also noteworthy that 35% of maternal infanticides were attributed by the investigating police force … [as] “mentally ill or mentally retarded (insane),” versus 58% of maternal homicides of older children. Here and elsewhere, it seems that the sorts of cases that are simultaneously rare and seemingly contrary to the actor’s interests–in both the Darwinian and the commonsense meaning of interest–also happen to be the sorts of cases most likely to be attributed to some sort of mental incompetence. … We identify as mad those people who lack a species-typical nepotistic perception of their interests or who no longer care to pursue them. …

Violent people go ahead and kill their kids; people who go crazy later kill theirs later.

We do at least know the ages of the 38 men who killed their infant children: the mean was 26.3 years. Moreover, we know that fathers averaged 4 years older than mothers for that substantial majority of Canadian births that occurred within marriages… . Since the mean age for all new Canadian mothers during the relevant period… was 25.8, it seems clear that infanticidal fathers are indeed relatively young. And as was the case with mothers, infanticidal fathers were significantly younger than those fathers who killed older offspring (mean age at the victim’s birth = 29.2 years). …

As with mothers, fathers who killed older children killed themselves as well significantly more often (43.6% of 101) than did those who killed their infant children (10.5% of 38). Also like mothers is the fact that those infanticidal fathers who did commit suicide were significantly older (mean age = 30.5 years) than those who did not (mean = 25.8). Likewise, the paternal age at which older victims had been born was also significantly greater for suicidal (mean = 31.1 years; N = 71) than for nonsuicidal (mean =27.5; N = 67) homicidal fathers. And men who killed their older children were a little more likely to be deemed mentally incompetent (20.8%) than those who killed their infants (15.8%). …

Fathers, however, were significantly less likely to commit suicide after killing an adult offspring (19% of 21 men) than a child (50% of 80 men). … 20 of the 22 adult victims of their father were sons… three of the four adult victims of mothers were daughters. … There is no hint of such a same-sex bias in the killings of either infants… or older children. …

An infrequent but regular variety of homicide is that in which a man destroys his wife and children. A corresponding act of familicide by the wife is almost unheard of. …

No big surprises in this section.

Perhaps the most obvious prediction from a Darwinian view of parental motives is this: Substitute parents will generally tend to care less profoundly for their children than natural parents, with the result that children reared by people other than their natural parents will be more often exploited and otherwise at risk. Parental investment is a precious resource, and selection must favor those parental psyches that do not squander it on nonrelatives.

Disclaimer: obviously there are good stepparents who care deeply for their stepchildren. I’ve known quite a few. But I’ve also met some horrible stepparents. Given the inherent vulnerability of children, I find distasteful our society’s pushing of stepparenting as normal without cautions against its dangers. In most cases, remarriage seems to be undertaken to satisfy the parent, not the child.

In an interview study of stepparents in Cleveland, Ohio, for example–a study of a predominantly middle-class group suffering no particular distress or dysfunction–Lucile Duberman (1975) found that only 53% of stepfathers and 25% of stepmothers could claim to have “parental feeling” toward their stepchildren, and still fewer to “love” them.

Some of this may be influenced by the kinds of people who are likely to become stepparents–people with strong family instincts probably have better luck getting married to people like themselves and staying that way than people who are bad at relationships.

In an observational study of Trinidadian villagers, Mark Flinn (1988) found that stepfathers interacted less with “their” children than did natural fathers; that interactions were more likely to be aggressive within steprelationships than within the corresponding natural relationships; and that stepchildren left home at an earlier age.

Pop psychology and how-to manuals for stepfamilies have become a growth industry. Serious study of “reconstituted” families is also burgeoning. Virtually all of this literature is dominated by a single theme: coping with the antagonisms…

Here the authors stop to differentiate between stepparenting and adoption, which they suspect is more functional because adoptive parents actually want to be parents in the first place. However,

such children have sometimes been found to suffer when natural children are subsequently born to the adopting couple, a result that has led some professionals to counsel against adoption by childless couples until infertility is definitely established. …

Continuing on with stepparents:

The negative characterization of stepparents is by no means peculiar to our culture. … From Eskimos to Indonesians, through dozens of tales, the stepparent is the villain of every piece. … We have already encountered the Tikopia or Yanomamo husband who demands the death of his new wife’s prior children. Other solutions have included leaving the children with postmenopausal matrilineal relatives, and the levirate, a wide-spread custom by which a widow and her children are inherited by the dead man’s brother or other near relative. …

Social scientists have turned this scenario on its head. The difficulties attending steprelationships–insofar as they are acknowledged at all–are presumed to be caused by the “myth of the cruel stepparent” and the child’s fears.

See: Freud.

Why this bizarre counterintuitive view is the conventional wisdom would be  a topic for a longer book than this; suffice to say that the answer surely has more to do with ideology than with evidence. In any event, social scientists have staunchly ignored the question of the factual basis for the negative “stereotyping” of stepparents.

Under Freud’s logic, all sorts of people who’d been genuinely hurt by others were summarily dismissed, told that they were the ones who actually harbored ill-will against others and were just “projecting” their emotions onto their desired victims.

Freudianism is a crock of shit, but in this case, it helped social “reformers” (who of course don’t believe in silly ideas like evolution) discredit people’s perfectly reasonable fears in order to push the notion that “family” doesn’t need to follow traditional (ie, biological) forms, but can be reinvented in all sorts of novel ways.

So are children at risk in stepparent homes in contemporary North America? [see Figures 4.7 and 4.8.] … There is … no appreciable statistical confounding between steprelationships and poverty in North America. … Stepparenthood per se remains the single most powerful risk factor for child abuse that has yet been identified. (here and throughout this discussion “stepparents” include both legal and common-law spouses of the natural parent.) …

Speaking of Figures 4.7 and 4.8, I must say that the kinds of people who get divorced (or were never married) and remarried within a year of their kid’s birth are likely to be unstable people who tend to pick particularly bad partners, and the kinds of people willing to enter into a relationship with someone who has a newborn are also likely to be, well, unusual. Apparently homicidal.

By contrast, the people who are willing to marry someone who already has, say, a ten year old, may be relatively normal folks.

Just how great an elevation of risk are we talking about? Our efforts to answer that question have been bedeviled by a lack of good information on the living arrangements of children in the general population. … there are no official statistics [as of when this was written] on the numbers of children of each age who live in each household type. There is no question that the 43% of murdered American child abuse victims who dwelt with substitute parents is far more than would be expected by chance, but estimates of that expected percentage can only be derived from surveys that were designed to answer other questions. For a random sample of American children in 1976, … the best available national survey… indicates that only about 1% or fewer would be expected to have dwelt with a substitute parent. An American child living with one or more substitute parents in 1976 was therefore approximately 100 times as likely to be fatally abused as a child living with natural parents only…

Results for Canada are similar. In Hamilton, Ontario in 1983, for example, 16% of child abuse victims under 5 years of age lived with a natural parent and a stepparent… Since small children very rarely have stepparents–less than 1% of preschoolers in Hamilton in 1983, for example–that 16% represents forty times the abuse rate for children of the same age living with natural parents. … 147 Canadian children between the ages of 1 and 4 were killed by someone in loco parentis between 1974 and 1983; 37 of those children (25.2%) were the victims of their stepparents, and another 5 (3.4%) were killed by unrelated foster parents.

…The survey shows, for example, that 0.4% of 2,852 Canadian children, aged 1-4 in 1984, lived with a stepparent. … For the youngest age group in Figure 4.9, those 2 years of age and younger, the risk from a stepparent is approximately 70 times that from a natural parent (even though the latter category includes all infanticides by natural mothers.)
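To make the arithmetic behind those headline ratios explicit, here is a minimal back-of-envelope sketch in Python. The population shares used below are assumptions of mine chosen to match the round numbers quoted above (the book’s exact denominators are not reproduced in these excerpts), so treat the output as an illustration of the method, not as the authors’ published figures.

```python
# Relative risk of fatal abuse for children living with a stepparent,
# reconstructed from the shares quoted in the excerpts above.
# Population shares below are illustrative assumptions, not the book's data.

def relative_risk(victim_share_step, population_share_step):
    """Risk with a stepparent relative to risk with natural parents only.

    victim_share_step:     fraction of child victims who lived with a stepparent
    population_share_step: fraction of all children who lived with a stepparent
    """
    risk_step = victim_share_step / population_share_step
    risk_natural = (1 - victim_share_step) / (1 - population_share_step)
    return risk_step / risk_natural

# Hamilton, Ontario, 1983: 16% of abuse victims under 5 had a stepparent,
# vs. "less than 1%" of preschoolers; assuming ~0.5% gives roughly the
# "forty times" figure quoted above.
print(round(relative_risk(0.16, 0.005)))   # ~38

# U.S. 1976: 43% of fatally abused children vs. roughly 1% expected.
# With exactly 1% this gives ~75; the quoted "approximately 100 times"
# implies the authors' estimated population share was somewhat below 1%.
print(round(relative_risk(0.43, 0.01)))    # ~75
```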

Now we need updated data. I wonder if abortion has had any effect on the rates of infanticide and if increased public acceptance of stepfamilies has led to more abused children or higher quality people being willing to become stepparents.

Some quick thoughts about Angry Birds and a caution about LSD stories

Watching the liberals lose their shit over the Angry Birds Movie has been rather entertaining and proof of just how absurdly out of touch with reality they’ve become.

The movie is limited by the game’s single conceit: the pigs stole the birds’ eggs, and the birds are flinging themselves at the pigs to get back the eggs. You can’t have reconciliation between the pigs and birds because, as is obvious if you’ve played the game, the pigs steal those eggs over and over.

Critically, the pigs are not refugees or economic migrants seeking a better life. They are invaders stealing the birds’ eggs. Liberals can no longer distinguish between the two. They are not freaking out over the birds attacking a group of peaceful refugees, but over the birds defending themselves against actual invaders.

The right of self-defense against people who attacked you unprovoked is not even right-wing; it is accepted by almost all moralists and is about as mainstream a view as you can find. I can understand the left’s humanitarian logic for accepting refugees/economic migrants, but to toss out the right to self-defense is just plain delusional.

(Comment originally posted in reaction to Gregory Hood’s Review of the Angry Birds Movie.)

I also feel compelled to note that, while people have been claiming that the chief pig, Leonard, has a “Middle Eastern” style beard, Middle Easterners typically have curly-haired beards, whereas Leonard clearly has straight fur. Also, Leonard has only managed a chin-beard, whereas Middle Easterners tend to have much fuller beards.

Because this is an HBD-centric blog, I have maps:

[Maps: global hair texture; body hair distribution according to the American Journal of Physical Anthropology and other sources]


Personally, I think he looks more French, eg Childeric II or Henry I–for a pig.

__________________________________


The comments on Slate Star Codex’s recent post, “Why Were Early Psychedelicists so Weird?” contain a fair number of stories along the lines of “I took LSD/shrooms/other illegal drugs and had interesting, positive effects,” and a few stories along the lines of “I knew a guy who tried LSD and it fried his brain and turned him into a drooling idiot.”

Normally, I think it best to rate “I did X”-style testimony more highly than “I knew a guy who did X.” In this case, however, I want to urge caution, because there is an obvious selection bias in the kinds of stories you are going to hear: drooling idiots are bad at writing.

The people whose brains got fried on illegal drugs do not have the ability to get on the internet and write coherent, entertaining posts on the subject, and they certainly do not have the IQ points left to be part of the regular readership/commentariat on Scott’s blog. In fact, they aren’t writing a whole lot of anything. Which means that if you are reading about LSD experiences in the comments section of Scott’s blog, you are only going to read stories from people who are still mentally with it, or people warning that a bad thing happened to a guy they knew.

I have no idea what % of people who try LSD end up okay, better, or worse afterwards, but for the sake of argument, let’s assume that 50% are fine-to-better and 50% end up in droolsville. The 50% who are fine go post on Scott’s blog, and the 50% who are not fine never show up because they can’t type anymore, except as cautionary tales from the few guys who know the details about a former friend’s illegal activities.

Maybe LSD researchers can tell you what percentage of people fry their brains on it, shrooms, or other psychedelics. But you certainly can’t make any good estimation based on a biased sample like this–so don’t.
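For what it’s worth, here is a toy simulation of that selection effect. All the rates are invented purely to illustrate the mechanism–they are not estimates of any real outcome of psychedelic use–but they show how a comments-section sample can make a large fraction of bad outcomes invisible.

```python
import random

# Toy model: some users end up impaired, and impaired users (almost) never
# write first-person accounts. The naive rate estimated from first-person
# posts then badly understates the true rate. All numbers are hypothetical.

random.seed(0)

TRUE_BAD_OUTCOME_RATE = 0.5   # assumption: half of users end up worse off
P_POST_IF_FINE = 0.30         # fine users sometimes post about it
P_POST_IF_IMPAIRED = 0.0      # impaired users effectively never post

users = 100_000
posts_fine = 0
posts_impaired = 0

for _ in range(users):
    impaired = random.random() < TRUE_BAD_OUTCOME_RATE
    p_post = P_POST_IF_IMPAIRED if impaired else P_POST_IF_FINE
    if random.random() < p_post:
        if impaired:
            posts_impaired += 1
        else:
            posts_fine += 1

naive_bad_rate = posts_impaired / (posts_fine + posts_impaired)
print(f"True bad-outcome rate:          {TRUE_BAD_OUTCOME_RATE:.0%}")   # 50%
print(f"Bad-outcome rate among posters: {naive_bad_rate:.0%}")          # ~0%
```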

And yes, I know, everyone with positive stories would probably say that the key is to be very careful about how much you use, purity, and allowing enough time between uses. But the people who fried their brains probably thought that, too.

I am not saying that these drugs cannot possibly have any positive medical uses. I am saying that you should avoid using biased datasets when formulating any theories on the matter.

“I don’t hate minorities, I just hate liberals”

A lot of people are talking about the Trump candidacy “realigning” or “reshaping” the American political landscape and things like that. Like why would traditionally blue-state voters in places like NY vote for a guy who’s also carrying traditionally red-states like Kentucky? Is the whole Albion’s seed-style ethno/political makeup of the nation breaking down after nearly 400 years?

Nah.

Look, when it comes to politics, conservatives are basically just reactive. There are some smart conservatives, of course–I’d wager they do well in fields like economics, finance, sports broadcasting, and military strategy–but conservatives overall do not dominate the production of new social ideas. It’s the liberals, somewhat by nature, who keep coming up with ideas like, “What if we let women have abortions?” “What if we all took LSD?” “What if we didn’t eat animals?” or “What if we let gay people get married?”

So the conservatives devote themselves to opposing whatever the hell cockamamie scheme the liberals have come up with this time.

During the Cold War, I’m pretty sure the conservatives opposed the liberals on the grounds that the liberals were commie peaceniks who weren’t doing enough to ensure that we would win the nuclear war against the USSR.

By the ’80s, conservatives were visibly concerned about shifting national attitudes toward religion, especially as it impacted things like abortion, divorce, the teaching of evolution in schools, whether local governments could make religious displays, etc. “Talk radio” became an important bastion of the “Religious Right,” which by the mid-90s had won a sweeping victory in Congress.

When people talk about how no president has ever been so hated as Obama, I wonder if they remember just how much the right hated Clinton.

And what did they hate him for?

Because he represented degenerate, godless atheism. (Never mind that Bill Clinton is probably actually Christian; that doesn’t really matter.)

Reagan and Bush I may have been religious conservatives, but religious conservatism was not a big part of their campaigns. By contrast, Bob Dole, Bush II, and mildly, Mitt Romney, all ran on the religious right platform, with strong planks based on ideas like “ban abortion” and “make sure gay marriage stays illegal.” Bush II even managed to establish an “Office of Faith-Based and Neighborhood Partnerships.”

Meanwhile, though, liberals were changing. The big liberal push of the past 8 years has not been atheism; atheism has largely won already and atheists have wandered off to fight other battles, taking to the streets to protest racism. Thus the campus protests, the Black Lives Matter campaigns, the increasing push for open borders. Today, Germany; tomorrow, the US.  Today’s liberals are, first and foremost, anti-racists.

The Republican establishment–folks like Ted Cruz and Ben Carson–fell so flat with voters precisely because most of them were still harping on religious issues like abortion and war with the Russians, concerns of Reagan’s and Bob Dole’s voters, not today’s.

Today’s conservatives do not exactly want to come out and declare themselves racist bigots–in fact, the vast majority of them don’t see themselves as racists, and many are quite vehemently opposed to racism. This makes people reluctant to say anything negative about blacks, which gets instantly called out as racist. But you can still say things about immigrants, especially illegal immigrants. There’s just enough plausible deniability (both for others and yourself) to claim that you are not opposed to Mexicans, per se, you are just opposed to people breaking the law and think that if the law exists, then it ought to be enforced or else it is unfair to the people who did obey it. And for that matter, many of them really aren’t opposed to Mexicans; they are just broke and unable to find work and have enough brains in their heads to figure out what a massive flood of low-wage workers does to their chances of finding a well-paying job.

Of course, in the backs of people’s minds, it is not just about immigrants; it is also about BLM protestors, the November terrorist attacks in Paris, and the conviction that if elected, Hillary Clinton will follow in Angela Merkel’s footsteps and invite a million Muslims to the US.

This is why they say, “I don’t hate blacks; I don’t hate Mexicans. I just hate liberals.”


Potato Madness

This is a potato.

Bake it, and you have a healthy, nutritious dinner that you can serve your family and feel good about.

That seems simple enough.

However, the potato would like to clarify some potential confusion about its culinary uses:


This is a potato that has been chopped up and deep fried.

Make it from scratch in the morning, and you are not just a good mom, but an excellent mom.

You can’t really eat them for dinner, unless you’re at IHOP or celebrating Chanukah.


This is a potato that has been chopped up, deep fried, and served with a side of pickled tomatoes.

It is never for breakfast, except maybe if you are on a roadtrip and there’s nothing else available. If so, we will pretend it never happened.

Serve it for breakfast any other time, and you are a bad mom.

It is fine for lunch, though.

This is a potato that has been chopped up and baked.

You can never, ever serve it for breakfast. In fact, you are a bad mom if you even think about serving it for breakfast.

It is not a meal at all!

Why, this potato is so unhealthy, you should probably never eat it at all.


There. I hope that clears everything up.


Anthropology Friday: Sacrifice Among the Semites pt. 2

Hello! Today we’re continuing with more excerpts from Smith’s Sacrifice Among the Semites, with all attendant warnings that I don’t necessarily trust Smith’s accuracy.

“Now, if kinship means participation in a common mass of flesh, blood, and bones, it is natural that it should be regarded as dependent, not merely on the fact that a man was born of his mother’s body, and so was from his birth a part of her flesh, but also on the not less significant fact that he was nourished by her milk. And so we find that among the Arabs there is a tie of milk, as well as of blood, which unites the foster-child to his foster-mother and her kin. Again, after the child is weaned, his flesh and blood continue to be nourished and renewed by the food which he shares with his commensals, so that commensality can be thought of (1) as confirming or even (2) as constituting kinship in a very real sense.

“… Primarily the circle of common religion and of common social duties was identical with that of natural kinship, and the god himself was conceived as being of the same stock with his worshipers. It was natural, therefore, that the kinsmen and their kindred god should seal and strengthen their fellowship by meeting together from time to time to nourish their common life by a common meal, to which those outside the kin were not admitted.”

White House Passover Seder, 2011

“… after several clans had begun to frequent the same sanctuary and worship the same god, the worshipers still grouped themselves for sacrificial purposes on the principle of kinship. In the days of Saul and David all the tribes of Israel had long been united in the worship of Jehovah, yet the clans still maintained their annual gentile sacrifice, at which every member of the group was bound to be present. But evidence more decisive comes to us from Arabia, where, as we have seen, men would not eat together at all unless they were united by kinship or by a covenant that had the same effect as natural kinship. Under such a rule the sacrificial feast must have been confined to kinsmen, and the clan was the largest circle that could unite in a sacrificial act. And so, though the great sanctuaries of heathen Arabia were frequented at the pilgrimage feasts by men of different tribes, who met peaceably for a season under the protection of the truce of God, we find that their participation in the worship of the same holy place did not bind alien clans together in any religious unity; they worshiped side by side, but not together.”

EvX: I wish this guy would cite his sources or otherwise back up his claims.

“It is only under Islam that the pilgrimage becomes a bond of religious fellowship, whereas in the times of heathenism it was the correct usage that the different tribes, before they broke up from the feast, should engage in a rivalry of self–exaltation and mutual abuse, which sent them home with all their old jealousies freshly inflamed.”

“…But the notion that the clan is only a larger household is not consistent with the results of modern research. Kinship is an older thing than family life, and in the most primitive societies known to us the family or household group was not a subdivision of a clan, but contained members of more than one kindred. As a rule the savage man may not marry a clanswoman, and the children are of the mother’s kin, and therefore have no communion of blood religion with their father. In such a society there is hardly any family life, and there can be no sacred household meal.

“… The rudest nations have religious rules about food, based on the principle of kinship, viz., that a man may not eat the totem animal of his clan; and they generally have some rites of the nature of the sacrificial feast of kinsmen; but it is not the custom of savages to take their ordinary daily food in a social way, in regular domestic meals. Their habit is to eat irregularly and apart, and this habit is strengthened by the religious rules, which often forbid to one member of a household the food which is permitted to another.”

Frankly, I think he is wrong. Set “meals” may be a modern innovation, but I highly doubt the Bushmen would be so picky as to allow one person in a family to eat a specific animal but forbid it to their spouse; same for the Inuit. There is far too much chance of starvation and hunger in these groups to go turning down good food.

“In Egypt, down to the present day, many persons hardly ever eat with their wives and children, and among the Arabs, boys who are not of full age do not presume to eat in the presence of their parents, but take their meals separately or with the women of the house. No doubt the seclusion of women has retarded the development of family life in Mohammedan countries; but for most purposes this seclusion has never taken much hold on the desert, and yet in northern Arabia no woman will eat before men. … in Arabia the daily family meal has never been an established institution with such a religious significance as attaches to the Roman supper.”

EvX: I don’t know much about Roman suppers, to be honest. I hear the Jews are into their Friday evening meals, though.

“… even among the agricultural Semites there is no trace of a sacrificial character being attached to ordinary household meals. The domestic hearth among the Semites was not an altar as it was at Rome. Almost all varieties of human food were offered to the gods, and any kind of food suffices, according to the laws of Arabian hospitality, to establish that bond between two men which in the last resort rests on the principle that only kinsmen eat together. It may seem, therefore, that in the abstract any sort of meal publicly partaken of by a company of kinsmen may constitute a sacrifice feast. The distinction between the feast and an ordinary meal lies, it may seem, not in the material or the copiousness of the repast, but in its public character. When men eat alone they do not invite the god to share their food, but when the clan eats together as a kindred unity the kindred god must also be of the party.”


EvX: I am reminded here of Elijah’s cup, filled with wine and placed on the Passover table just in case the Prophet Elijah decides to show up for dinner. According to Wikipedia:

In the Talmudic literature, Elijah would visit rabbis to help solve particularly difficult legal problems. Malachi had cited Elijah as the harbinger of the eschaton. Thus, when confronted with reconciling impossibly conflicting laws or rituals, the rabbis would set aside any decision “until Elijah comes.”[24]

One such decision was whether the Passover seder required four or five cups of wine. Each serving of wine corresponds to one of the “four expressions of redemption” in the Book of Exodus: … The next verse, “And I will bring you into the land which I swore to give to Abraham, to Isaac, and to Jacob; I will give it to you for a possession. I am the Lord.” (Exodus 6:8) was not fulfilled until the generation following the Passover story, and the rabbis could not decide whether this verse counted as part of the Passover celebration (thus deserving of another serving of wine). Thus, a cup was left for the arrival of Elijah.

In practice the fifth cup has come to be seen as a celebration of future redemption. Today, a place is reserved at the seder table and a cup of wine is placed there for Elijah. During the seder, the door of the house is opened and Elijah is invited in. Traditionally, the cup is viewed as Elijah’s and is used for no other purpose.[25][26]

Returning to Smith:

“Practically, however, there is no sacrificial feast according to Semitic usage except where a victim is slaughtered. The rule of the Levitical law, that a cereal oblation, when offered alone, belongs wholly to the god and gives no occasion for a feast of worshipers, agrees with the older history, in which we never find a sacrificial meal of which flesh does not form a part. Among the Arabs the usage is the same; a religious banquet implies a victim.”

???

When anyone brings a grain offering to the Lord, their offering is to be of the finest flour. They are to pour olive oil on it, put incense on it and take it to Aaron’s sons the priests. The priest shall take a handful of the flour and oil, together with all the incense, and burn this as a memorial[a] portion on the altar, a food offering, an aroma pleasing to the Lord. The rest of the grain offering belongs to Aaron and his sons; it is a most holy part of the food offerings presented to the Lord.–Leviticus 2:1-3

“‘If the offering is a burnt offering from the herd, you are to offer a male without defect. You must present it at the entrance to the tent of meeting so that it will be acceptable to the Lord. You are to skin the burnt offering and cut it into pieces. The sons of Aaron the priest are to put fire on the altar and arrange wood on the fire. Then Aaron’s sons the priests shall arrange the pieces, including the head and the fat, on the wood that is burning on the altar. You are to wash the internal organs and the legs with water, and the priest is to burn all of it on the altar.–Leviticus 1:3-9

Am I misunderstanding Leviticus, or did Smith mix up the two forms of sacrifice?

Saint Nilus of Sinai

Now Smith draws upon Nilus, “As to the habits of the Arabs of the Sinaitic desert towards the close of the fourth Christian century”

“The ordinary sustenance of these Saracens was derived from pillage or from hunting, to which, no doubt, must be added, as a main element, the milk of their herds. When these supplies failed they fell back on the flesh of their camels, one of which was slain for each clan … or for each group which habitually pitched their tents together… which according to known Arab usage would always be a fraction of a clan–and the flesh was hastily devoured by the kinsmen…”

According to Wikipedia:

About the year 390[2] or perhaps 404,[3] Nilus left his wife and one son and took the other, Theodulos, with him to Mount Sinai to be a monk. They lived here till about the year 410[4] when the Saracens, invading the monastery, took Theodulos prisoner. The Saracens intended to sacrifice him to their gods, but eventually sold him as a slave, so that he came into the possession of the Bishop of Elusa in Palestine. The Bishop received Theodulos among his clergy and made him door-keeper of the church. Meanwhile, Nilus, having left his monastery to find his son, at last met him at Elusa. The bishop then ordained them both priests and allowed them to return to Sinai.

Continuing with Smith: “To grasp the force of this evidence we must remember that, beyond question, there was at this time among the Saracens private property in camels, and that therefore, so far as the law of property went, there could be no reason why a man should not kill a beast for the use of his own family. And though a whole camel might be too much for a single household to eat fresh, the Arabs knew and practiced the art of preserving flesh by cutting it into strips and drying them in the sun. Under these circumstances private slaughter could not have failed to be customary, unless it was absolutely forbidden by tribal usage. In short, it appears that while milk, game, and the fruits of pillage were private food which might be eaten in any way, the camel was not allowed to be killed and eaten except in a public rite, at which all the kinsmen assisted.”

From his monastery at Sinai Nilus was a well known person throughout the Eastern Church; by his writings and correspondence he played an important part in the history of his time. He was known as a theologian, Biblical scholar and ascetic writer, so people of all kinds, from the emperor down, wrote to consult him. His numerous works, including a multitude of letters, consist of denunciations of heresy, paganism, abuses of discipline and crimes, of rules and principles of asceticism, especially maxims about the religious life. He warns and threatens people in high places, abbots and bishops, governors and princes, even the emperor himself, without fear. He kept up a correspondence with Gainas, a leader of the Goths, endeavouring to convert him from Arianism;[6] he denounced vigorously the persecution of St. John Chrysostom both to the Emperor Arcadius[7] and to his courtiers.[8]

Nilus must be counted as one of the leading ascetic writers of the 5th century.–Wikipedia

“This evidence is all the more remarkable because, among the Saracens of whom Nilus speaks, the slaughter of a camel in times of hunger does not seem to have been considered as a sacrifice to the gods. For a couple of pages later he speaks expressly of the sacrifices which these Arabs offered to the morning star, the sole deity they acknowledged. These could be performed only when the star was visible, and the whole victim–flesh, skin, and bones–had to be devoured before the sun rose upon it and the day-star disappeared. As this form of sacrifice was necessarily confined to seasons when the planet Venus was a morning star, while the necessity for slaughtering a camel as food might arise at any season, it is to be inferred that in the latter case the victim was not recognized as having a sacrificial character. … the Saracens of Nilus, like the Arabs generally in the last ages of heathenism, had ceased to do sacrifice to the tribal or clan god with whose worship the feast of kinsmen was originally connected. The planet Venus, or Lucifer, was not a tribal deity, but, as we know from a variety of sources, was worshiped by all the northern Arabs, to whatever kin they belonged. … ”

According to Wikipedia:

Ptolemy‘s Geography (2nd century CE) describes “Sarakene” as a region in the northern Sinai peninsula.[2] Ptolemy also mentions a people called the “Sarakenoi” living in north-western Arabia (near neighbor to the Sinai).[2] Eusebius of Caesarea refers to Saracens in his Ecclesiastical history, in which he narrates an account wherein Dionysius, Bishop of Alexandria, mentions Saracens in a letter while describing the persecution of Christians by the Roman emperor Decius: “Many were, in the Arabian mountain, enslaved by the barbarous ‘sarkenoi’.”[2]

But a few centuries after that, Europeans started using Saracen as a catch-all for Arabs and Muslims.

I have just started reading the Wikipedia page on Religion in pre-Islamic Arabia, but a quick search does not turn up “Venus” or “star.” I’ll be on the lookout for evidence one way or another regarding Smith’s claims.

Is there a correlation between intelligence and taste?

(I am annoyed by the lack of bands between 1200 and 1350)

De gustibus non disputandum est. — Confucius

We’re talking about foods, not whether you prefer Beethoven or Lil’ Wayne.

Certainly there are broad correlations between the foods people enjoy and their ethnicity/social class. If you know whether I chose fried okra, chicken feet, gefilte fish, escargot, or grasshoppers for dinner, you can make a pretty good guess about my background. (Actually, I have eaten all of these things. The grasshoppers were over-salted, but otherwise fine.) The world’s plethora of tasty (and not-so-tasty) cuisines is due primarily to regional variations in what grows well where (not a lot of chili peppers growing up in Nunavut, Canada,) and cost (the rich can always afford fancier fare than the poor,) with a side dish of seemingly random cultural taboos like “don’t eat pork” or “don’t eat cows” or “don’t eat grasshoppers.”

But do people vary in their experience of taste? Does intelligence influence how you perceive your meal, driving smarter (or less-smart) people to seek out particular flavor profiles or combinations? Or could there be other psychological or neurological factors at play in people’s eating decisions?

This post was inspired by a meal my husband, an older relative and I shared recently at McDonald’s. It had been a while since we’d last patronized McDonald’s, but older relative likes their burgers, so we went and ordered some new-to-us variety of meat-on-a-bun. As my husband and I sat there, deconstructing the novel taste experience and comparing it to other burgers, the older relative gave us this look of “Jeez, the idiots are discussing the flavor of a burger! Just eat it already!”

As we dined later that evening at my nemesis, Olive Garden, I began wondering whether we actually experienced the food the same way. Perhaps there is something in people that makes them prefer bland, predictable food. Perhaps some people are better at discerning different flavors, and the people who cannot discern them end up with worse food because they can’t tell?

Unfortunately, it appears that not a lot of people have studied whether there is any sort of correlation between IQ and taste (or smell.) There’s a fair amount of research on taste (and smell,) like “do relatives of schizophrenics have impaired senses of smell?” (More on Schizophrenics and their decreased ability to smell) or “can we get fat kids to eat more vegetables?” Oh, and apparently the nature of auditory hallucinations in epileptics varies with IQ (IIRC.) But not much that directly addresses the question.

I did find two references that, somewhat in passing, noted that they found no relationship between taste and IQ, but these weren’t studies designed to test for that. For example, in A Food Study of Monotony, published in 1958 (you know I am really looking for sources when I have to go back to 1958,) researchers restricted the diets of military personnel employed at an army hospital to only 4 menus to see how quickly and badly they’d get bored of the food. They found no correlation between boredom and IQ, but people employed at an army hospital are probably pre-selected for being pretty bright (and having certain personality traits in common, including ability to stand army food.)

Interestingly, three traits did correlate with (or against) boredom:

Fatter people got bored fastest (the authors speculate that they care the most about their food,) while depressed and feminine men (all subjects in the study were men) got bored the least. Depressed people are already disinterested in food, so it is hard to get less-interested, but no explanation was given of what they meant by “femininity” or how this might affect food preferences. (Also, the hypochondriacs got bored quickly.)

Some foods inspire boredom (or even disgust) quickly, while others are virtually immune. Milk and bread, for example, can be eaten every day without complaint (though you might get bored if bread were your only food.) Potted meat, by contrast, gets old fast.

Likewise, Personality Traits and Eating Habits (warning PDF) notes that:

Although self-reported eating practices were not associated with educational level, intelligence, nor various indices of psychopathology, they were related to the demographic variables of gender and age: older participants reported eating more fiber in their diets than did younger ones, and women reported more avoidance of fats from meats than did men.

Self-reported eating habits may not be all that reliable, though.

Autistic children do seem to be worse at distinguishing flavors (and smells) than non-autistic children, eg Olfaction and Taste Processing in Autism:

Participants with autism were significantly less accurate than control participants in identifying sour tastes and marginally less accurate for bitter tastes, but they were not different in identifying sweet and salty stimuli. … Olfactory identification was significantly worse among participants with autism. … True differences exist in taste and olfactory identification in autism. Impairment in taste identification with normal detection thresholds suggests cortical, rather than brainstem dysfunction.

(Another study of the eating habits of autistic kids found that the pickier ones were rated by their parents as more severely impaired than the less picky ones, but then severe food aversions are a form of life impairment. By the way, do not tell the parents of an autistic kid, “oh, he’ll eat when he’s hungry.” They will probably respond politely, but mentally they are stabbing you.)

On brainstem vs. cortical function–it appears that we do some of our basic flavor identification way down in the most instinctual part of the brain, as Facial Expressions in Response to Taste and Smell Stimulation explores. The authors found that pretty much everyone makes the same faces in response to sweet, sour, and bitter flavors–whites and blacks, old people and newborns, retarded people and blind people, even premature infants, blind infants, and infants born missing most of their brains. All of which is another point in favor of my theory that disgust is real. (And if that is not enough science of taste for you, I recommend Place and Taste Aversion Learning, in which animals with brain lesions lost their fear of new foods.)

Genetics obviously plays a role in taste. If you are one of the 14% or so of people who think cilantro tastes like soap (and I sympathize, because cilantro definitely tastes like soap,) then you’ve already discovered this in a very practical way. Genetics also obviously determines whether you continue producing the enzyme for milk digestion after infancy (lactase persistence). According to Why are you a picky eater? Blame genes, brains, and breastmilk:

In many cases, mom and dad have only themselves to blame for unwittingly passing on the genes that can govern finicky tastes. Studies show that genes play a major role in determining who becomes a picky eater, including recent research on a group of 4- to 7-year-old twins. Part of the pickiness can be attributed to specific genes that govern taste. Variants of the TAS2R38 gene, for example, have been found to encode for taste receptors that determine how strongly someone tastes bitter flavors.

Researchers at Philadelphia’s Monell Chemical Senses Center, a scientific institute dedicated to the study of smell and taste, have found that this same gene also predicts the strength of sweet-tooth cravings among children. Kids who were more sensitive to bitterness preferred sugary foods and drinks. However, adults with the bitter receptor genes remained picky about bitter foods but did not prefer more sweets, the Monell study found. This suggests that sometimes age and experience can override genetics.

I suspect that there is actually a sound biological, evolutionary reason why kids crave sweets more than grownups, and this desire for sweets is somewhat “turned off” as we age.
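
To make the TAS2R38 point a bit more concrete, here is a minimal sketch (Python, purely illustrative) of the genotype-to-phenotype mapping as it is usually described: the PAV haplotype is the “taster” version, AVI the “non-taster” version, and the more PAV copies you carry, the more strongly you tend to taste bitter compounds like PTC and PROP. The category labels are my simplification, not output from any of the studies quoted above.

```python
# Toy illustration of the commonly described TAS2R38 effect: PAV is the
# "taster" haplotype, AVI the "non-taster" haplotype. Real genetics is
# messier (rarer haplotypes, incomplete penetrance, age effects).

def bitter_sensitivity(haplotype_1, haplotype_2):
    """Rough bitter-taste category from a TAS2R38 diplotype."""
    pav_copies = [haplotype_1, haplotype_2].count("PAV")
    if pav_copies == 2:
        return "strong bitter taster"
    elif pav_copies == 1:
        return "intermediate bitter taster"
    else:
        return "weak bitter taster"

for pair in [("PAV", "PAV"), ("PAV", "AVI"), ("AVI", "AVI")]:
    print(pair, "->", bitter_sensitivity(*pair))
```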


From a review of Why some like it hot: Food, Genetics, and Cultural Diversity:

Ethnobotanist Gary Paul Nabhan suggests that diet had a key role in human evolution, specifically, that human genetic diversity is predominately a product of regional differences in ancestral diets. Chemical compounds found within animals and plants varied depending on climate. These compounds induced changes in gene expression, which can vary depending on the amount within the particular food and its availability. The Agricultural Age led to further diet-based genetic diversity. Cultivation of foods led to the development of novel plants and animals that were not available in the ancestral environment. …

There are other fascinating examples of gene-diet interaction. Culturally specific recipes, semi-quantitative blending of locally available foods and herbs, and cooking directions needed in order to reduce toxins present in plants, emerged over time through a process of trial and error and were transmitted through the ages. The effects on genes by foods can be extremely complex given the range of plant-derived compounds available within a given region. The advent of agriculture is suggested to have overridden natural selection by random changes in the environment. The results of human-driven selection can be highly unexpected. …

In sedentary herding societies, drinking water was frequently contaminated by livestock waste. The author suggests in order to avoid contaminated water, beverages made with fermented grains or fruit were drunk instead. Thus, alcohol resistance was selected for in populations that herded animals, such as Europeans. By contrast, those groups which did not practice herding, such as East Asians and Native Americans, did not need to utilize alcohol as a water substitute and are highly sensitive to the effects of alcohol.

Speaking of genetics:

From Eating Green could be in your Genes:

Indians and Africans are much more likely than Europeans and native South Americans to have an allele that lets them eat a vegetarian diet:

The vegetarian allele evolved in populations that have eaten a plant-based diet over hundreds of generations. The adaptation allows these people to efficiently process omega-3 and omega-6 fatty acids and convert them into compounds essential for early brain development and controlling inflammation. In populations that live on plant-based diets, this genetic variation provided an advantage and was positively selected in those groups.

In Inuit populations of Greenland, the researchers uncovered that a previously identified adaptation is opposite to the one found in long-standing vegetarian populations: While the vegetarian allele has an insertion of 22 bases (a base is a building block of DNA) within the gene, this insertion was found to be deleted in the seafood allele.
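
In case it helps to see what a between-population allele comparison like this actually measures, here is a minimal sketch. Genotypes are coded as the number of insertion (“vegetarian”) alleles a person carries; both little samples are invented for illustration and have nothing to do with the study’s real frequencies.

```python
# Toy sketch of a between-population allele-frequency comparison. Genotypes
# are coded as the number of insertion ("vegetarian") alleles carried (0-2).
# Both samples are invented for illustration; they are NOT the study's data.

def insertion_allele_frequency(genotypes):
    """Fraction of all chromosomes in the sample carrying the insertion."""
    return sum(genotypes) / (2 * len(genotypes))

hypothetical_samples = {
    "long-time plant-eating population": [2, 1, 2, 1, 2, 2, 1, 2],
    "long-time meat/fish-eating population": [0, 1, 0, 0, 1, 0, 0, 1],
}

for name, genotypes in hypothetical_samples.items():
    freq = insertion_allele_frequency(genotypes)
    print(f"{name}: insertion-allele frequency = {freq:.2f}")
```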

Of course, this sort of thing inspires a wealth of pop-psych investigations like Dr. Hirsch’s What Flavor is your Personality? (from a review:

Dr. Hirsh, neurological director of the Smell and Taste Research and Treatment Foundation in Chicago, stands by his book that is based on over 24 years of scientific study and tests on more than 18,000 people’s food choices and personalities.)

that nonetheless may have some basis in fact, eg: Personality may predict if you like spicy foods:

Byrnes assessed the group using the Arnett Inventory of Sensation Seeking (AISS), a test for the personality trait of sensation-seeking, defined as desiring novel and intense stimulation and presumed to contribute to risk preferences. Those in the group who score above the mean AISS score are considered more open to risks and new experiences, while those scoring below the mean are considered less open to those things.

The subjects were given 25 micrometers [sic; presumably micrograms or a micromolar concentration] of capsaicin, the active component of chili peppers, and asked to rate how much they liked a spicy meal as the burn from the capsaicin increased in intensity. Those in the group who fell below the mean AISS rapidly disliked the meal as the burn increased. People who were above the mean AISS had a consistently high liking of the meal even as the burn increased. Those in the mean group liked the meal less as the burn increased, but not nearly as rapidly as those below the mean.
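
The design, stripped down, is just: split subjects at the mean AISS score, then compare how each group’s liking ratings change as the burn level rises. A minimal sketch with invented records (not the study’s data):

```python
# Sketch of the above-/below-mean AISS comparison described in the quote.
# Each record is (AISS score, burn intensity level, liking rating).
# The records below are invented for illustration, not the study's data.
from collections import defaultdict
from statistics import mean

records = [
    (3.1, 1, 7), (3.1, 2, 4), (3.1, 3, 2),   # low sensation-seeker
    (2.8, 1, 6), (2.8, 2, 3), (2.8, 3, 1),   # low sensation-seeker
    (4.2, 1, 7), (4.2, 2, 7), (4.2, 3, 6),   # high sensation-seeker
    (4.5, 1, 8), (4.5, 2, 7), (4.5, 3, 7),   # high sensation-seeker
]

mean_aiss = mean(score for score, _, _ in records)

# Average liking at each burn level, separately for the two groups.
liking = defaultdict(list)
for score, burn, rating in records:
    group = "above-mean AISS" if score > mean_aiss else "below-mean AISS"
    liking[(group, burn)].append(rating)

for (group, burn), ratings in sorted(liking.items()):
    print(f"{group}, burn level {burn}: mean liking = {mean(ratings):.1f}")
```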

And then there are the roughly 25% of us who are “supertasters”:

A supertaster is a person who experiences the sense of taste with far greater intensity than average. Women are more likely to be supertasters, as are those from Asia, South America and Africa.[1] The cause of this heightened response is unknown, although it is thought to be related to the presence of the TAS2R38 gene, the ability to taste PROP and PTC, and at least in part, due to an increased number of fungiform papillae.[2]

Perhaps the global distribution of supertasters is related to the distribution of vegetarian-friendly alleles. It’s not surprising that women are more likely to be supertasters, as they have a better sense of smell than men. What may be surprising is that supertasters tend not to be foodies who delight in flavoring their foods with all sorts of new spices, but instead tend toward more restricted, bland diets. Because their sense of taste is essentially on overdrive, flavors that taste “mild” to most people taste “overwhelming” on their tongues. As a result, they tend to prefer a much more subdued palate–which is, of course, perfectly tasty to them.

A French study, Changes in Food Preferences and Food Neophobia during a Weight Reduction Session, measured kids’ ability to taste flavors, then the rate at which they became accustomed to new foods. The more sensitive the kids were to flavors, the less likely they were to adopt a new food; the less adept they were at tasting flavors, the more likely they were to start eating vegetables.

Speaking of pickiness again:

“During research back in the 1980s, we discovered that people are more reluctant to try new foods of animal origin than those of plant origin,” Pelchat says. “That’s ironic in two ways. As far as taste is concerned, the range of flavors in animal meat isn’t that large compared to plants, so there isn’t as much of a difference. And, of course, people are much more likely to be poisoned by eating plants than by animals, as long as the meat is properly cooked.” …

It’s also possible that reward mechanisms in our brain can drive changes in taste. Pelchat’s team once had test subjects sample tiny bits of unfamiliar food with no substantial nutritional value, and accompanied them with pills that contained either nothing or a potent cocktail of caloric sugar and fat. Subjects had no idea what was in the pills they swallowed. They learned to like the unfamiliar flavors more quickly when they were paired with a big caloric impact—suggesting that body and brain combined can alter tastes more easily when unappetizing foods deliver big benefits.

So trying to get people to adopt new foods while losing weight may not be the best idea.
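
Pelchat’s reward-pairing result is basically flavor-nutrient conditioning: liking drifts toward whatever caloric payoff reliably follows a flavor. Here is a toy update rule that captures that general idea; it is an illustration of the mechanism, not the analysis actually used in the experiment.

```python
# Toy model of flavor-nutrient conditioning: liking for a flavor is nudged
# toward the caloric payoff that follows it. This illustrates the general
# mechanism only; it is not Pelchat's actual experiment or analysis.

def condition_liking(pairings, learning_rate=0.2):
    """pairings: sequence of (flavor, payoff) exposures, payoff scaled 0-1."""
    liking = {}
    for flavor, payoff in pairings:
        current = liking.get(flavor, 0.5)                 # start out neutral
        liking[flavor] = current + learning_rate * (payoff - current)
    return liking

# Hypothetical exposures: "flavor A" always paired with a calorie-rich pill,
# "flavor B" always paired with a placebo.
exposures = [("flavor A", 1.0), ("flavor B", 0.0)] * 10
print(condition_liking(exposures))   # liking for A rises, for B falls
```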

(For all that people complain about kids’ pickiness, parents are much pickier. Kids will happily eat playdoh and crayons, but one stray chicken heart in your parents’ soup and suddenly it’s “no more eating at your house.”)

Of course, you can’t talk about food without encountering meddlers who are convinced that people should eat whatever they’re convinced is the perfect diet, like these probably well-meaning folks trying to get Latinos to eat fewer snacks:

Latinos are the largest racial and ethnic minority group in the United States and bear a disproportionate burden of obesity related chronic disease. Despite national efforts to improve dietary habits and prevent obesity among Latinos, obesity rates remain high. …

there is a need for more targeted health promotion and nutrition education efforts on the risks associated with soda and energy-dense food consumption to help improve dietary habits and obesity levels in low-income Latino communities.

Never mind that Latinos are one of the healthiest groups in the country, with longer life expectancies than whites! We’d better make sure they know that their food ways are not approved of!

I have been saving this graph for just such an occasion.
Only now I feel bad because I forgot to write down who made this graph so I can properly credit them. If you know, please tell me!

(Just in case it is not clear already: different people are adapted to and will be healthy on different diets. There is no magical, one-size-fits-all diet.)

And finally, to bring this full circle, it’s hard to miss the folks claiming that Kids Who Eat Fast Food Have Lower IQs:

4,000 Scottish children aged 3-5 years old were examined to compare the intelligence dampening effects of fast food consumption versus “from scratch” fare prepared with only fresh ingredients.

Higher fast food consumption by the children was linked with lower intelligence and this was even after adjustments for wealth and social status were taken into account.

It’d be better if they controlled for parental IQ.
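
Here’s why that matters, as a small simulation (entirely made-up numbers, nothing to do with the Scottish data): if parental IQ drives both how much fast food a family eats and the child’s IQ, then a regression that adjusts only for a social-status proxy will still show a “fast food effect” even when diet has no direct effect at all.

```python
# Toy simulation of confounding: parental IQ affects both fast-food intake
# and child IQ, while diet has NO direct effect on IQ here. A regression
# that omits parental IQ still "finds" a negative fast-food coefficient.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

parental_iq = rng.normal(100, 15, n)
ses = 0.5 * (parental_iq - 100) + rng.normal(0, 10, n)      # crude status proxy
fast_food = 10 - 0.05 * parental_iq + rng.normal(0, 1, n)   # meals per week
child_iq = 60 + 0.4 * parental_iq + rng.normal(0, 10, n)    # no diet term at all

def ols_coefs(predictors, y):
    """Ordinary least squares; returns [intercept, coef_1, coef_2, ...]."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Adjusting only for the status proxy: fast food looks harmful.
print(ols_coefs([fast_food, ses], child_iq))
# Also adjusting for parental IQ: the fast-food coefficient shrinks to ~0.
print(ols_coefs([fast_food, ses, parental_iq], child_iq))
```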

The conclusions of this study confirm previous research which shows long lasting effects on IQ from a child’s diet. An Australian study from the University of Adelaide published in August 2012 showed that toddlers who consume junk food grow less smart as they get older. In that study, 7000 children were examined at the age of 6 months, 15 months, 2 years to examine their diet.

When the children were examined again at age 8, children who were consuming the most unhealthy food had IQs up to 2 points lower than children eating a wholesome diet.