Most of the activities our ancestors spent the majority of their time on have been automated or largely replaced by technology. Chances are good that the majority of your great-great grandparents were farmers, but few of us today hunt, gather, plant, harvest, or otherwise spend our days physically producing food; few of us will ever build our own houses or even sew our own clothes.
Evolution has (probably) equipped us with neurofeedback loops that reward us for doing the sorts of things we need to do to survive, like hunt down prey or build shelters (even chimps build nests to sleep in,) but these are precisely the activities that we have largely automated and replaced. The closest analogues to these activities are now shopping, cooking, exercising, working on cars, and arts and crafts. (Even warfare has been largely replaced with professional sports fandom.)
Society has invented vicarious thrills: Books, movies, video games, even roller coasters. Our ability to administer vicarious emotions appears to be getting better and better.
And yet, it’s all kind of fake.
Exercising, for example, is in many ways a pointless activity–people literally buy machines so they can run in place. But if you have a job that requires you to be sedentary for most of the day and don’t fancy jogging around your neighborhood after dark, running in place inside your own home may be the best option you have for getting the endorphin hit of running down prey that evolution designed you to crave.
A sedentary lifestyle with supermarkets and restaurants deprives us of that successful-hunting endorphin hit and offers us no logical reason to go out and get it. But without that exercise, not only our physical health, but our mental health appears to suffer. According to the Mayo Clinic, exercise effectively decreases depression and anxiety–in other words, depression and anxiety may be caused in part by lack of exercise.
So what do we do? We have to make up some excuse and substitute faux exercise for the active farming/gardening/hunting/gathering lifestyles our ancestors lived.
Overall, the number of Americans on medications used to treat psychological and behavioral disorders has substantially increased since 2001; more than one‐in‐five adults was on at least one
of these medications in 2010, up 22 percent from ten years earlier. Women are far more likely to take a drug to treat a mental health condition than men, with more than a quarter of the adult female population on these drugs in 2010 as compared to 15 percent of men.
Women ages 45 and older showed the highest use of these drugs overall. …
The trends among children are opposite those of adults: boys are the higher utilizers of these medications overall but girls’ use has been increasing at a faster rate.
This is mind-boggling. 1 in 5 of us is mentally ill, (supposedly,) and the percent for young women in the “prime of their life” years is even higher. (The rates for Native Americans are astronomical.)
Lack of exercise isn’t the only problem, but I wager a decent chunk of it is that our lives have changed so radically over the past 100 years that we are critically lacking various activities that used to make us happy and provide meaning.
Take the rise of atheism. Irrespective of whether God exists or not, many functions–community events, socializing, charity, morality lessons, etc–have historically been done by religious groups. Atheists are working on replacements, but developing a full system that works without the compulsion of religious belief may take a long while.
Sports and video games replace war and personal competition. TV sitcoms replace friendship. Twitter replaces real life conversation. Politics replace friendship, conversation, and religion.
There’s something silly about most of these activities, and yet they seem to make us happy. I don’t think there’s anything wrong with enjoying knitting, even if you’re making toy octopuses instead of sweaters. Nor does there seem to be anything wrong with enjoying a movie or a game. The problem comes when people get addicted to these activities, which may be increasingly likely as our ability to make fake activities–like hyper-realistic special effects in movies–increases.
Given modernity, should we indulge? Or can we develop something better?
From the evolutionist point of view, the point of marriage is the production of children.
Let’s quickly analogize to food. Humans have a tremendous variety of customs, habits, traditions, and taboos surrounding foods. Foods enjoyed in one culture, like pork, crickets, and dog, are regarded as disgusting, immoral, or forbidden in another. Cheese is, at heart, rotten vomit–rennet, the enzyme used to make cheese coagulate, is actually extracted from a calf’s stomach lining–and yet the average American eats it eagerly.
Food can remind you of your childhood, the best day of your life, the worst day of your life. It can comfort the sick and the mourning, and it accompanies our biggest celebrations of life.
We eat comfort food, holiday food, even sacrificial food. We have decadent luxuries and everyday staples. Some people, like vegans and ascetics, avoid large classes of food generally eaten by their own society for moral reasons.
People enjoy soda because it has water and calories, but some of us purposefully trick our taste buds by drinking Diet Coke, which delivers the sensation of drinking calories without the calories themselves. We enjoy the taste of calories even when we don’t need any more.
But the evolutionary purpose of eating is to get enough calories and nutrients to survive. If tomorrow we all stopped needing to eat–say, we were all hooked into a Matrix-style click-farm in which all nutrients were delivered automatically via IV–all of the symbolic and emotional content attached to food would wither away.
The extended helplessness of human infants is unique in the animal kingdom. Even elephants, who gestate for an incredible two years and become mature at 18, can stand and begin walking around shortly after birth. Baby elephants are not raised solely by their mothers, as baby rats are, but by an entire herd of related female elephants.
Elephants are remarkable animals, clever, communicative, and caring, who mourn their dead and create art.
But from the evolutionist point of view, the point of elephants’ family systems is still the production of elephant children.
Love is a wonderful, sweet, many-splendored thing, but the purpose of marriage, in all its myriad forms–polygamy, monogamy, polyandry, serial monogamy–is still the production of children.
In the Southwest United States, the Apache tribe practices a form of avunculate child-rearing, in which the uncle is responsible for teaching the children social values and proper behavior while inheritance and ancestry are reckoned through the mother’s family alone. (Modern day influences have somewhat but not completely erased this tradition.)
Despite the long public argument over the validity of gay marriage, very few gay people actually want to get married. Gallup reports that after the Obergefell v. Hodges ruling, the percent of married gay people jumped quickly from 7.9% to 9.5%, but then leveled off, rising to only 9.6% by June 2016.
In Japan, between 1990 and 2010, the percentage of 50-year-old people who had never married roughly quadrupled for men to 20.1% and doubled for women to 10.6%. The Welfare Ministry predicts these numbers to rise to 29% of men and 19.2% of women by 2035. The government’s population institute estimated in 2014 that women in their early 20s had a one-in-four chance of never marrying, and a two-in-five chance of remaining childless.
Recent media coverage has sensationalized surveys from the Japan Family Planning Association and the Cabinet Office that show a declining interest in dating and sexual relationships among young people, especially among men. However, changes in sexuality and fertility are more likely an outcome of the decline in family formation than its cause. Since the usual purpose of dating in Japan is marriage, the reluctance to marry often translates to a reluctance to engage in more casual relationships.
In other words, marriage is functionally about providing a supportive way of raising children. In a society where birth control does not exist, children born out of wedlock tend not to survive, and people can easily get jobs to support their families, people tend to get married and have children. In a society where people do not want children, cannot afford them, are purposefully delaying childbearing as long as possible, or have found ways to provide for them without getting married, people simply see no need for marriage.
“Marriage” ceases to mean what it once did, reserved for old-fashioned romantics and the few lucky enough to afford it.
Mass acceptance of gay marriage did change how people think of marriage, but it’s downstream from what the massive, societal-wide decrease in child-bearing and increase in illegitimacy have done to our ideas about marriage.
There are three categories of superstars who seem to attract excessive female interest. The first is actors, who of course are selected for being abnormally attractive and put into romantic and exciting narratives that our brains subconsciously interpret as real. The second is sports stars and other athletes, whose ritualized combat and displays of strength obviously indicate their genetic “fitness” for siring and providing for children.
The third and strangest category is professional musicians, especially rock stars.
I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?
There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display musical tastes on dating profiles in order to meet musically-like-minded people.
Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as, “Anything but rap,” or “Anything but country”?
At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids and the aforementioned rap and country fans.
Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along with other slow jazz fans better than fans of classical Indian music? Or is this all confounded by different ethnic groups identifying with specific musical styles?
Obviously country correlates with Amerikaner ancestry; rap with African American. I’m not sure which ancestry claims the biggest fans of Die Antwoord. Heavy Metal is popular in Finno-Scandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.
While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.
Here are two excerpts on the subject of music and dance in African societies:
Both of these h/t HBD Chick and my apologies in advance if I got the sources reversed.
One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.
If traditional African dancing involves more variety in rhythmic expression than traditional European, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people chose their own mates, it seems like they display a preference for athletic or musically talented mates (“sexy” mates;) when parents chose mates, they seem to prefer hard-working, devout, “good providers.”
Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.
It is undoubtedly true that music stirs the soul and speaks to the heart, but why?
If everyone in the world exhibits a particular behavior, chances are it’s innate. But I have been informed–by Harvard-educated people, no less–that humans do not have instincts. We are so smart, you see, that we don’t need instincts anymore.
This is nonsense, of course.
One amusing and well-documented human instinct is the nesting instinct, experienced by pregnant women shortly before going into labor. (As my father put it, “When she starts rearranging the furniture, get ready to head to the hospital.”) Having personally experienced this sudden, overwhelming urge to CLEAN ALL THE THINGS multiple times, I can testify that it is a real phenomenon.
Humans have other instincts–babies will not only pick up and try to eat pretty much anything they run across, to every parent’s consternation, but they will also crawl right up to puddles and attempt to drink out of them.
But we’re getting ahead of ourselves: What, exactly, is an instinct? According to Wikipedia:
Instinct or innate behavior is the inherent inclination of a living organism towards a particular complex behavior. The simplest example of an instinctive behavior is a fixed action pattern (FAP), in which a very short to medium length sequence of actions, without variation, are carried out in response to a clearly defined stimulus.
Any behavior is instinctive if it is performed without being based upon prior experience (that is, in the absence of learning), and is therefore an expression of innate biological factors. …
Instincts are inborn complex patterns of behavior that exist in most members of the species, and should be distinguished from reflexes, which are simple responses of an organism to a specific stimulus, such as the contraction of the pupil in response to bright light or the spasmodic movement of the lower leg when the knee is tapped.
The go-to example of an instinct is the gosling’s imprinting instinct. Typically, goslings imprint on their mothers, but a baby gosling doesn’t actually know what its mother is supposed to look like, and can accidentally imprint on other random objects, provided they are moving slowly near the nest around the time the gosling hatches.
Here we come to something I think may be useful for distinguishing an instinct from other behaviors: an instinct, once triggered, tends to keep going even if it has been accidentally or incorrectly triggered. Goslings look like they have an instinct to follow their mothers, but they actually have an instinct to imprint on the first large, slowly moving object near their nest when they hatch.
So if you find people strangely compelled to do something that makes no sense but which everyone else seems to think makes perfect sense, you may be dealing with an instinct. For example, women enjoy celebrity gossip because humans have an instinct to keep track of social ranks and dynamics within their own tribe; men enjoy watching other men play sports because it conveys the vicarious feeling of defeating a neighboring tribe at war.
So what about racism? Is it an instinct?
Strictly speaking–and I know I have to define racism, just a moment–I don’t see how we could have evolved such an instinct. Races exist because major human groups were geographically separated for thousands of years–prior to 1492, the average person never even met a person of another race in their entire life. So how could we evolve an instinct in response to something our ancestors never encountered?
Unfortunately, “racism” is a chimera, always changing whenever we attempt to pin it down, but the Urban Dictionary gives a reasonable definition:
An irrational bias towards members of a racial background. The bias can be positive (e.g. one race can prefer the company of its own race or even another) or it can be negative (e.g. one race can hate another). To qualify as racism, the bias must be irrational. That is, it cannot have a factual basis for preference.
Of course, instincts exist because they ensured our ancestors’ survival, so if racism is an instinct, it can’t exactly be “irrational.” We might call a gosling who follows a scientist instead of its mother “irrational,” but this is a misunderstanding of the gosling’s motivation. Since “racist” is a term of moral judgment, people are prone to defending their actions/beliefs towards others on the grounds that it can’t possibly be immoral to believe something that is actually true.
The claim that people are “racist” against members of other races implies, in converse, that they exhibit no similar behaviors toward members of their own race. But even the most perfunctory overview of history reveals people acting in extremely “racist” ways toward members of their own race. During the Anglo-Boer wars, the English committed genocide against the Dutch South Africans (Afrikaners.) During WWII, Germans allied with the Japanese and slaughtered their neighbors, Poles and Jews. (Ashkenazim are genetically Caucasian and half Italian.) If Hitler were really racist, he’d have teamed up with Stalin and Einstein–his fellow whites–and dropped atomic bombs on Hiroshima. (And for their part, the Japanese would have allied with the Chinese against the Germans.)
The murder victim, a West African chimpanzee called Foudouko, had been beaten with rocks and sticks, stomped on and then cannibalised by his own community. …
“When you reverse that and have almost two males per every female — that really intensifies the competition for reproduction. That seems to be a key factor here,” says Wilson.
Jill Pruetz at Iowa State University, who has been studying this group of chimpanzees in south-eastern Senegal since 2001, agrees. She suggests that human influence may have caused this skewed gender ratio that is likely to have been behind this attack. In Senegal, female chimpanzees are poached to provide infants for the pet trade. …
Early one morning, Pruetz and her team heard loud screams and hoots from the chimps’ nearby sleep nest. At dawn, they found Foudouko dead, bleeding profusely from a bite to his right foot. He also had a large gash in his back and a ripped anus. Later he was found to have cracked ribs. Pruetz says Foudouko probably died of internal injuries or bled out from his foot wound.
Foudouko also had wounds on his fingers. These were likely to have been caused by chimps clamping them in their teeth to stretch his arms out and hold him down during the attack, says Pruetz.
After his death, the gang continued to abuse Foudouko’s body, throwing rocks and poking it with sticks, breaking its limbs, biting it and eventually eating some of the flesh.
“It was striking. The female that cannibalised the body the most, she’s the mother of the top two high-ranking males. Her sons were the only ones that really didn’t attack the body aggressively,” Pruetz says …
Historically, the vast majority of wars and genocides were waged by one group of people against their neighbors–people they were likely to be closely related to in the grand scheme of things–not against distant peoples they’d never met. If you’re a chimp, the chimp most likely to steal your banana is the one standing right in front of you, not some strange chimp you’ve never met before who lives in another forest.
Indeed, in Jane Goodall’s account of the Gombe Chimpanzee War, the combatants were not members of two unrelated communities that had recently encountered each other, but members of a single community that had split in two. Chimps who had formerly lived peacefully together, groomed each other, shared bananas, etc., now bashed each other’s brains out and cannibalized their young. Poor Jane was traumatized.
I think there is an instinct to form in-groups and out-groups. People often have multiple defined in-groups (“I am a progressive, a Christian, a baker, and a Swede,”) but one of these identities generally trumps the others in importance. Ethnicity and gender are major groups most people seem to have, but I don’t see a lot of evidence suggesting that the grouping of “race” is uniquely special, globally, in people’s ideas of in- and out-.
For example, as I am writing today, people are concerned that Donald Trump is enacting racist policies toward Muslims, even though “Muslim” is not a race and most of the countries targeted by Trump’s travel/immigration ban are filled with fellow Caucasians, not Sub-Saharan Africans or Asians.
Race is a largely American obsession, because our nation (like the other North and South American nations,) has always had whites, blacks, and Asians (Native Americans). But many countries don’t have this arrangement. Certainly Ireland didn’t have an historical black community, nor Japan a white one. Irish identity was formed in contrast to English identity; Japanese in contrast to Chinese and Korean.
Only in the context where different races live in close proximity to each other does it seem that people develop strong racial identities; otherwise people don’t think much about race.
Napoleon Chagnon, a white man, has spent years living among the Yanomamo, one of the world’s most murderous tribes, folks who go and slaughter their neighbors and neighbors’ children all the time, and they still haven’t murdered him.
Why do people insist on claiming that Trump’s “Muslim ban” is racist when Muslims aren’t a race? Because Islam is an identity group that appears to function similarly to race, even though Muslims come in white, black, and Asian.
If you’ve read any of the comments on my old post about Turkic DNA, Turkey: Not very Turkic, you’ll have noted that Turks are quite passionate about their Turkic identity, even though “Turkic” clearly doesn’t correspond to any particular ethnic groups. (It’s even more mixed up than Jewish, and that’s a pretty mixed up one after thousands of years of inter-breeding with non-Jews.)
Group identities are fluid. When threatened, groups merge; when resources are abundant and times are good, they split.
What about evidence that infants notice–stare longer at–faces of races different from their parents’? This may be true, but all it really tells us is that babies are attuned to novelty. It certainly doesn’t tell us that babies are racist just because they find people interesting who look different from the people they’re used to.
What happens when people encounter others of a different race for the first time?
We have many accounts of “first contacts” between different races during the Age of Exploration. For example, when escaped English convict William Buckley wandered into an uncontacted Aborigine tribe, they assumed he was a ghost, adopted him, taught him to survive, and protected him for 30 years. By contrast, the last guy who landed on North Sentinel Island and tried to chat with the natives there got a spear to the chest and a shallow grave for his efforts. (But I am not certain the North Sentinelese haven’t encountered outsiders at some point.)
But what about the lunchroom seating habits of the wild American teenager?
If people have an instinct to form in-groups and out-groups, then races (or religions?) may represent the furthest bounds of this, at least until we encounter aliens. All else held equal, perhaps we are most inclined to like the people most like ourselves, and least inclined to like the people least like ourselves–racism would thus be the strongest manifestation of this broader instinct. But what about people who have a great dislike for one race, but seem just fine with another, eg, a white person who likes Asians but not blacks, or a black person who likes Asians but not whites? And can we say–per our definition above–that these preferences are irrational, or are they born of some lived experience of positive or negative interactions?
Again, we are only likely to have strong opinions about members of other races if we are in direct conflict or competition with them. Most of the time, people are in competition with their neighbors, not people on the other side of the world. I certainly don’t sit here thinking negative thoughts about Pygmies or Aborigines, even though we are very genetically distant from each other, and I doubt they spend their free time thinking negatively about me.
Just because flamingos prefer to flock with other flamingos doesn’t mean they dislike horses; for the most part, I think people are largely indifferent to folks outside their own lives.
A recent article in Stanford Magazine highlighted the work of sociologist Richard LaPiere. Back in 1931, LaPiere, a Chinese student of his, and the student’s Chinese wife drove cross-country, visiting 250 hotels and restaurants.
One business refused them service, presumably because of race.
Then LaPiere sent surveys to the businesses they’d visited (plus controls) asking if they served Chinese people. The businesses responded:
235 said NO,
18 said maybe,
and only 2 said YES.
Basically the complete opposite of reality.
Social signalling is cheap; losing actual customers on the ground is expensive.
People today still say whatever they think will gain them approval, though our politics have changed a lot since 1931. For example, 89% of people these days report being willing to marry someone of another race, but of marriages conducted in 2013, only 12% actually were. By contrast, a similar number of people said they would be unhappy about a cross-political marriage in their family.
One of my relatives died this week, so I’m going to go be sad, now. Please, if you have any fights with your relatives, try to make up if you can before they die. Sometimes people die a lot younger than you think they will.
And don’t let all of this election bullshittery drive you apart. Just don’t.
(Warning: this post is based on personal, entirely anecdotal observations of other humans.)
I interact, on a fairly regular basis, with people from a wide range of backgrounds: folks who’ve spent decades living on the streets; emotionally disabled folks and folks who were emotionally traumatized but recovered; working, middle, and upper class folks.
“Functionality” may not be the easiest term to define, but you know it when you see it: people who manage to pick up the pieces when bad shit happens and continue on with their lives. Non-functionality does not automatically make you poor, (nor does functionality make you rich,) but it is often a major contributing factor.
I’m not going to claim that we all go through equal amounts of trauma; certainly some of us, like infants who were dropped on their heads, have truly shitty lives. Still, almost all of us endure at least some trauma, and there is great variation in our responses to the tragedies we endure.
Among the people I know personally, I’ve noticed that the less-functional tend to have “sticky brains.” When trauma happens, they glom onto it and get stuck. Years, sometimes decades later, you hear these people still talking about things other people did to them.
For example: two people I know (we’ll call them Foxtrot and Golf, following my alias convention,) had rough childhoods. Foxtrot is still quite bitter over things that happened over 50 years ago, committed by relatives who are long dead. He is also bitter about things that happened recently; I often hear about very minor conflicts–the sort normal people would be angry about for a day or two–that Foxtrot is still losing sleep over a month later. Unsurprisingly, he is an unstable emotional wreck with no job, a string of divorces, and virtually no contact with his family.
Golf’s childhood was, by all objective measures, far worse than Foxtrot’s. But Golf doesn’t talk much about his childhood and is today a functional person. When bad things happen to Golf, he deals with them, he might get angry, and then he finishes with them and puts them aside. He has his bad spells–times when things are going badly and he gets really depressed. He also has his good times. But he has managed to keep himself together well enough, even through these bad times, to stay married and employed (to the same person and at the same job, for decades,) remain in contact with most of his family, and enjoy a decent reputation in the community.
The homeless people I interact with also have “sticky brains.” When bad things happen to them (and, yes, being homeless is like a permanent bad thing happening to you,) they get really focused on that bad thing. For example, one homeless woman I know has worried for decades about a possible indiscretion she might have committed back in high school–it is a very minor thing of less importance than copying a few answers on a math test, but she is still worried that she is a cheater and dishonest member of society. Another is fixated on a bad interaction with an aid worker that happened over a year ago. Most people would say, “yeah, that guy was a jerk,” and then stop worrying about it after a week or so; in this case, the hurt is reviewed and re-felt almost every day.
And, of course, I have many personal friends who’ve endured or dealt with traumas in their own more or less useful ways. (Not to mention the various ups and downs of my own life.)
Because trauma is common–some, like the death of a loved one, strike almost everyone who makes it to adulthood–societies tend to adopt guidelines for trauma response, such as a funeral for the dead followed by a six-month mourning period for widows, official days of mourning or remembrance for people who died in wars, therapy and anti-depressants, confession and forgiveness, head-hunting (among head-hunters), or sympathy cards among the less violently inclined. My own family has a tradition of visiting the graveyard where many of our older relatives are buried once a year and cleaning the gravestones. (The children have a tradition of pretending to be zombies.)
Anthropologists like to call these things “rituals” and “customs.” Different societies have different customs, but all of the ones listed exist for the purpose of helping people cope with trauma and grief. (Or at least, that’s what the head-hunters claimed.)
Watching people attempt to cope with life has made me appreciate (most of) these customs. “Six months of mourning” may seem arbitrary, but it is also pretty useful: it dictates that yes, it is very normal to feel terrible for a while and everyone will be understanding of that, but now the time has passed and it is time to get on with life.
Christianity and Judaism (and probably other religions) command forgiveness:
Do not seek revenge or bear a grudge against anyone among your people, but love your neighbor as yourself. I am the LORD. — Leviticus 19:18
Then Peter came to Jesus and asked, “Lord, how many times shall I forgive my brother when he sins against me? Up to seven times?” Jesus answered, “I tell you, not seven times, but seventy-seven times.” — Matthew 18:21-22
This is ostensibly for practical reasons:
For if you forgive men when they sin against you, your heavenly Father will also forgive you. But if you do not forgive men their sins, your Father will not forgive your sins. — Matthew 6:14-15
On Yom Kippur, Jews observe a tradition of forgiving others and asking forgiveness for themselves. (It is not surprising that forgiveness should be handled similarly in two religions that share much of their scriptures; Christianity seems to differ primarily in making the institution of forgiveness a more personal matter rather than an annual ritual.)
I’m pretty sure forgiveness is a big deal in Buddhism, as well, but I don’t know much about Hinduism and other belief systems, so I can’t comment on them.
But why should God require forgiveness? It seems rather unfair to say to someone who was raped as a child and has done nothing worse than tell a few lies in their life, “If you don’t forgive your rapist, God won’t forgive you for lying.”
But this assumes that forgiveness exists for the forgiven. In some cases, of course, it does. But forgiveness also serves a function for the forgiver. I shall leave the concept of spiritual purity to the spiritual; as a practical matter, forgiveness allows the hurt party to stop focusing on their pain and resume life. Most people do this fairly naturally, but some of us need a bit of encouragement–and perhaps ritual focus and faith–to heal.
“3 Blessed are the poor in spirit, for theirs is the kingdom of heaven. 4 Blessed are those who mourn, for they will be comforted. 5 Blessed are the meek, for they will inherit the earth. 6 Blessed are those who hunger and thirst for righteousness, for they will be filled. 7 Blessed are the merciful, for they will be shown mercy. 8 Blessed are the pure in heart, for they will see God. 9 Blessed are the peacemakers, for they will be called children of God. 10 Blessed are those who are persecuted because of righteousness, for theirs is the kingdom of heaven.
11 “Blessed are you when people insult you, persecute you and falsely say all kinds of evil against you because of me. 12 Rejoice and be glad, because great is your reward in heaven, for in the same way they persecuted the prophets who were before you. — Matthew 5:3-12
I don’t think the point of this is that it is morally superior to be insulted or hurt or poor, but to reassure and comfort those who have been.
So I was thinking about taste (flavor) and disgust (emotion).
As I mentioned about a month ago, 25% of people are “supertasters,” that is, better at tasting than the other 75% of people. Supertasters experience flavors more intensely than ordinary tasters, resulting in a preference for “bland” food (food with too much flavor is “overwhelming” to them.) They also have a more difficult time getting used to new foods.
One of my work acquaintances of many years–we’ll call her Echo–is obese, constantly on a diet, and constantly eating sweets. She knows she should eat vegetables and tries to do so, but finds them bitter and unpleasant, and so the general outcome is as you’d expect: she doesn’t eat them.
Since I find most vegetables quite tasty, I find this attitude very strange–but I am willing to admit that I may be the one with unusual attitudes toward food.
Echo is also quite conservative.
This got me thinking about vegetarians vs. people who think vegetarians are crazy. Why (aside from novelty of the idea) should vegetarians be liberals? Why aren’t vegetarians just people who happen to really like vegetables?
What if there were something in preference for vegetables themselves that correlated with political ideology?
Certainly we can theorize that “supertaster” => “vegetables taste bitter” => “dislike of vegetables” => “thinks vegetarians are crazy.” (Some supertasters might think meat tastes bad, but anecdotal evidence doesn’t support this.) See also Wikipedia, where supertasting is clearly associated with responses to plants:
Any evolutionary advantage to supertasting is unclear. In some environments, heightened taste response, particularly to bitterness, would represent an important advantage in avoiding potentially toxic plant alkaloids. In other environments, increased response to bitterness may have limited the range of palatable foods. …
Although individual food preference for supertasters cannot be typified, documented examples for either lessened preference or consumption include:
Mushrooms? Echo was just complaining about mushrooms.
Let’s talk about disgust. Disgust is an important reaction to things that might infect or poison you, triggering reactions from scrunching up your face to vomiting (i.e., expelling the poison). We process disgust in our amygdalas, and some people appear to have bigger or smaller amygdalas than others, with the result that the folks with bigger amygdalas feel more disgust.
Humans also route a variety of social situations through their amygdalas, resulting in the feeling of “disgust” in response to things that are not rotten food, like other people’s sexual behaviors, criminals, or particularly unattractive people. People with larger amygdalas also tend to find more human behaviors disgusting, and this disgust correlates with social conservatism.
To what extent are “taste” and “disgust” independent of each other? I don’t know; perhaps they are intimately linked into a single feedback system, where disgust and taste sensitivity cause each other, or perhaps they are relatively independent, so that a few unlucky people are both super-sensitive to taste and easily disgusted.
People who find other people’s behavior disgusting and off-putting may also be people who find flavors overwhelming, prefer bland or sweet foods over bitter ones, think vegetables are icky and vegetarians are crazy, and struggle to stay on diets.
What’s that, you say, I’ve just constructed a just-so story?
Michael Shin and William McCarthy, researchers from UCLA, have found an association between counties with higher levels of support for the 2012 Republican presidential candidate and higher levels of obesity in those counties.
Looks like the Mormons and Southern blacks are outliers.
(I don’t really like maps like this for displaying data; I would much prefer a simple graph showing orientation on one axis and obesity on the other, with each county as a datapoint.)
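The kind of chart described above is simple to mock up. Here is a minimal pure-Python sketch of the idea–each county as one (orientation, obesity) datapoint, plus the correlation coefficient such a graph would make visible at a glance. All numbers below are invented for illustration; they are not the UCLA data.

```python
# Sketch: counties as datapoints on two axes, rather than a choropleth map.
# The county figures here are hypothetical, made up purely for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Each tuple: (Republican vote share %, obesity rate %) for a fictional county.
counties = [(35, 24), (42, 22), (48, 29), (55, 27), (61, 33), (67, 31)]

gop_share = [c[0] for c in counties]
obesity = [c[1] for c in counties]

print(f"Pearson r = {pearson(gop_share, obesity):.2f}")  # → Pearson r = 0.84
```

With real county data you would feed the same two lists to a scatter plot (e.g., matplotlib’s `pyplot.scatter`), which would also make outlier clusters–like the Mormon and Southern black counties mentioned above–immediately visible instead of hiding them in map shading.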
(Unsurprisingly, the first 49 hits I got when searching for correlations between political orientation and obesity were almost all about what other people think of fat people, not what fat people think. This is probably because researchers tend to be skinny people who want to fight “fat phobia” but aren’t actually interested in the opinions of fat people.)
Liberals are 28 percent more likely than conservatives to eat fresh fruit daily, and 17 percent more likely to eat toast or a bagel in the morning, while conservatives are 20 percent more likely to skip breakfast.
Ten percent of liberals surveyed indicated they are vegetarians, compared with 3 percent of conservatives.
Liberals are 28 percent more likely than conservatives to enjoy beer, with 60 percent of liberals indicating they like beer.
(See above where Wikipedia noted that supertasters dislike beer.) I will also note that coffee, which supertasters tend to dislike because it is too bitter, is very popular in the ultra-liberal cities of Portland and Seattle, whereas heavily sweetened iced tea is practically the official beverage of the South.
The only remaining question is whether supertasters are conservative. That may take some research.
Update: I have not found, to my disappointment, a simple study that just looks at correlation between ideology and supertasting (or nontasting.) However, I have found a couple of useful items.
Standard tests of disgust sensitivity, a questionnaire developed for this research assessing different types of moral transgressions (nonvisceral, implied-visceral, visceral) with the terms “angry” and “grossed-out,” and a taste sensitivity test of 6-n-propylthiouracil (PROP) were administered to 102 participants. [PROP is commonly used to test for “supertasters.”] Results confirmed past findings that the more sensitive to PROP a participant was the more disgusted they were by visceral, but not moral, disgust elicitors. Importantly, the findings newly revealed that taste sensitivity had no bearing on evaluations of moral transgressions, regardless of their visceral nature, when “angry” was the emotion primed. However, when “grossed-out” was primed for evaluating moral violations, the more intense PROP tasted to a participant the more “grossed-out” they were by all transgressions. Women were generally more disgust sensitive and morally condemning than men, … The present findings support the proposition that moral and visceral disgust do not share a common oral origin, but show that linguistic priming can transform a moral transgression into a viscerally repulsive event and that susceptibility to this priming varies as a function of an individual’s sensitivity to the origins of visceral disgust—bitter taste. [bold mine.]
In other words, supertasters are more easily disgusted, and with verbal priming will transfer that disgust to moral transgressions. (And easily disgusted people tend to be conservatives.)
If you aren’t familiar with the “replication crisis” in social psychology, start here, here, and here.
I consider the courses I took in college on quantitative and qualitative methods the most important of my undergraduate years. I learned thereby a great many important things about how not to conduct an experiment and how to think about experimental methodology (not to mention statistics.)
If I were putting together a list of “general education” requirements I wanted all students to take in order to declare them well-educated and ready to go out into the world, it’d be a course on Quantitative and Qualitative Methods. (Much like current “gen ed” and “distribution requirements,” the level of mathematical ability required would likely vary by field, though no one should be obtaining a college degree without some degree of numerical competence.)
But the real problem with the social science fields is not lack of rigorous statistical background, but overwhelming ideological conformity, enforced by the elders of the fields–advisers, hiring committees, textbook writers, journal editors, etc., who all believe in the same ideology and so have come to see their field as “proving” their ideology.
Ideology drives both the publication biases and the wishful thinking that underlie this crisis. For example, everyone in “Women’s studies” is a feminist who believes that “science” proves that women are oppressed because everyone they know has done studies “proving” it. You’re not going to find a lot of Women’s Studies professors aiming for tenure on the basis of their successful publication of a bunch of studies that failed to find any evidence of bias against women. Findings like that => no publication => no tenure. And besides, feminist professors see it as their moral duty to prove that discrimination exists, not to waste their time on studies that just happened not to be good enough to find the effect.
In the Social Sciences more generally, we get this “post modern” mish-mash of everything from Marxists to Freudians to folks who like Foucault and Said, where the goal is to mush up long-winded descriptions of otherwise simple phenomena into endless Chomsky Sentences.
(Just reading the Wikipedia pages on a variety of Social Science oriented topics reveals how very little real research or knowledge is generated in these fields, and how much is based on individual theorists’ personal views. It is often obvious that virtually anyone not long steeped in the academic literature of these fields would not come up with these theories, but with something far more mundane and sensible. Economists, for all their political bias, at least provide a counterpoint to many of these theories.)
Obviously different fields study different aspects of phenomena, but entire fields should not become reduced to trying to prove one political ideology or another. If they are, they should label themselves explicitly, rather than make a pretense of neutrality.
When ideology rather than correctness becomes the standard for publication (not to mention hiring and tenure), the natural result is incorrectness.
More statistical knowledge is not, by itself, going to resolve the problem. The fields must first recognize that they have an ideological bias problem, and then work to remedy it by letting in and publishing work by researchers outside the social science ideological mainstream. It is very easy to think your ideas sound rigorous when you are only debating with people who already agree with you; it is much more difficult to defend your views against people who disagree, or come from very different intellectual backgrounds.
They could start with–hahahaha–letting in a Republican.
Continuing with yesterday’s discussion (in response to a reader’s question):
Why are people snobs about intelligence?
Is math ability better than verbal?
Do people only care about intelligence in the context of making money?
1. People are snobs. Not all of them, obviously–just a lot of them.
So we’re going to have to back this up a step and ask why are people snobs, period.
Paying attention to social status–both one’s own and others’–is probably instinctual. We process social status in our prefrontal cortexes–the part of our brain generally involved in complex thought, imagination, long-term planning, personality, not being a psychopath, etc. Our brains respond positively to images of high-status items–activating reward-feedback loops that make us feel good–and negatively to images of low-status items–activating feedback loops that make us feel bad.
…researchers asked a person if the following statement was an accurate description of themselves: “I wouldn’t hesitate to go out of my way to help someone in trouble.” Some of the participants answered the question without anyone else seeing their response. Others knowingly revealed their answer to two strangers who were watching in a room next to them via video feed. The result? When the test subjects revealed an affirmative answer to an audience, their [medial prefrontal cortexes] lit up more strongly than when they kept their answers to themselves. Furthermore, when the participants revealed their positive answers not to strangers, but to those they personally held in high regard, their MPFCs and reward striatums activated even more strongly. This confirms something you’ve assuredly noticed in your own life: while we generally care about the opinions of others, we particularly care about the opinions of people who really matter to us.
(Note what constitutes a high-status activity.)
But this alone does not prove that paying attention to social status is instinctual. After all, I can also point to the part of your brain that processes written words (the Visual Word Form Area,) and yet I don’t assert that literacy is an instinct. For that matter, anything we think about has to be processed in our brains somewhere, whether instinct or not.
Better evidence comes from anthropology and zoology. According to Wikipedia, “All societies have a form of social status,” even hunter-gatherers. If something shows up in every single human society, that’s a pretty good sign that it is probably instinctual–and if it isn’t, it is so useful a thing that no society exists without it.
Among animals, social status is generally determined by a combination of physical dominance, age, relationship, and intelligence. Killer whale pods, for example, are led by the eldest female in the family; leadership in elephant herds is passed down from a deceased matriarch to her eldest daughter, even if the matriarch has surviving sisters. Male lions assert dominance by being larger and stronger than other lions.
In all of these cases, the social structure exists because it benefits the group, even if it harms some of the individuals in it. If having no social structure were beneficial for wolves, then wolf packs without alpha wolves would out-compete packs with alphas. This is the essence of natural selection.
Among humans, social status comes in two main forms, which I will call “earned” and “background.”
“Earned” social status stems from things you do, like rescuing people from burning buildings, inventing quantum physics, or stealing wallets. High status activities are generally things that benefit others, and low-status activities are generally those that harm others. This is why teachers are praised and thieves are put in prison.
Earned social status is a good thing, because it rewards people for being helpful.
“Background” social status is basically stuff you were born into or have no effect over, like your race, gender, the part of the country you grew up in, your accent, name, family reputation, health/disability, etc.
Americans generally believe that you should not judge people based on background social status, but they do it, anyway.
Interestingly, high-status people are not generally violent. (Just compare crime rates by neighborhood SES.) Outside of military conquest, violence is the domain of the low-class and those afraid they are slipping in social class, not the high class. Compare Angela Merkel to the average German far-right protester. Obviously the protester would win in a fist-fight, but Merkel is still in charge. High class people go out of their way to donate to charity, do volunteer work, and talk about how much they love refugees. In the traditional societies of the Pacific Northwest, they held potlatches at which they distributed accumulated wealth to their neighbors; in our society, the wealthy donate millions to education. Ideally, in a well-functioning system, status is the thanks rich people get for doing things that benefit the community instead of spending their billions on gold-plated toilets.
The Arabian babbler … spends most of its life in small groups of three to 20 members. These groups lay their eggs in a communal nest and defend a small territory of trees and shrubs that provide much-needed safety from predators.
When it’s living as part of a group, a babbler does fairly well for itself. But babblers who get kicked out of a group have much bleaker prospects. These “non-territorials” are typically badgered away from other territories and forced out into the open, where they often fall prey to hawks, falcons, and other raptors. So it really pays to be part of a group. … Within a group, babblers assort themselves into a linear and fairly rigid dominance hierarchy, i.e., a pecking order. When push comes to shove, adult males always dominate adult females — but mostly males compete with males and females with females. Very occasionally, an intense “all-out” fight will erupt between two babblers of adjacent rank, typically the two highest-ranked males or the two highest-ranked females. …
Most of the time, however, babblers get along pretty well with each other. In fact, they spend a lot of effort actively helping one another and taking risks for the benefit of the group. They’ll often donate food to other group members, for example, or to the communal nestlings. They’ll also attack foreign babblers and predators who have intruded on the group’s territory, assuming personal risk in an effort to keep others safe. One particularly helpful activity is “guard duty,” in which one babbler stands sentinel at the top of a tree, watching for predators while the rest of the group scrounges for food. The babbler on guard duty not only foregoes food, but also assumes a greater risk of being preyed upon, e.g., by a hawk or falcon. …
Unlike chickens, who compete to secure more food and better roosting sites for themselves, babblers compete to give food away and to take the worst roosting sites. Each tries to be more helpful than the next. And because it’s a competition, higher-ranked (more dominant) babblers typically win, i.e., by using their dominance to interfere with the helpful activities of lower-ranked babblers. This competition is fiercest between babblers of adjacent rank. So the alpha male, for example, is especially eager to be more helpful than the beta male, but doesn’t compete nearly as much with the gamma male. Similar dynamics occur within the female ranks.
In the eighteenth and early nineteenth centuries, wealthy private individuals substantially supported the military, with a particular wealthy man buying supplies for a particular regiment or fort.
Noblemen paid high prices for military commands, and these posts were no sinecure. You got the obligation to substantially supply the logistics for your men, the duty to obey stupid orders that would very likely lead to your death, the duty to lead your men from in front while wearing a costume designed to make you particularly conspicuous, and the duty to engage in honorable personal combat, man to man, with your opposite number who was also leading his troops from in front.
A vestige of this tradition remains in that every English prince has been sent to war and has placed himself very much in harm’s way.
It seems obvious to me that a soldier being led by a member of the ruling class who is soaking up the bullets from in front is a lot more likely to be loyal and brave than a soldier sent into battle by distant rulers safely in Washington who despise him as a sexist homophobic racist murderer. A soldier who sees his commander, a member of the ruling classes, fighting right in front of him is reflexively likely to fight.
(Note, however, that magnanimity is not the same as niceness. The only people who are nice to everyone are store clerks and waitresses, and they’re only nice because they have to be or they’ll get fired.)
Most people are generally aware of each others’ social statuses, using contextual clues like clothing and accents to make quick, rough estimates. These contextual clues are generally completely neutral–they just happen to correlate with other behaviors.
For example, there is nothing objectively good or bad for society about wearing your pants belted beneath your buttocks, aside from it being an awkward way to wear your pants. But the style correlates with other behaviors, like crime, drug use, and aggression, low paternal investment, and unemployment, all of which are detrimental to society, and so the mere sight of underwear spilling out of a man’s pants automatically assigns him low status. There is nothing causal in this relationship–being a criminal does not make you bad at buckling your pants, nor does wearing your pants around your knees somehow inspire you to do drugs. But these things correlate, and humans are very good at learning patterns.
Likewise, there is nothing objectively better about operas than Disney movies, no real difference between a cup of coffee brewed in the microwave and one from Starbucks; a Harley Davidson and a Vespa are both motorcycles; and you can carry stuff around in just about any bag or backpack, but only the hoity-toity can afford something as objectively hideous as a $26,000 Louis Vuitton backpack.
All of these things are fairly arbitrary and culturally dependent–the way you belt your pants can’t convey social status in a society where people don’t wear pants; your taste in movies couldn’t matter before movies were invented. Among hunter-gatherers, social status is based on things like one’s skills at hunting, and if I showed up to the next PTA meeting wearing a top hat and monocle, I wouldn’t get any status points at all.
We tend to aggregate the different social status markers into three broad classes (middle, upper, and lower.) As Scott Alexander says in his post about Siderea’s essay on class in America, which divides the US into 10% Underclass, 65% Working Class, 23.5% Gentry Class, and 1.5% Elite:
Siderea notes that Church’s analysis independently reached about the same conclusion as Paul Fussell’s famous guide. I’m not entirely sure how you’d judge this (everybody’s going to include lower, middle, and upper classes), but eyeballing Fussell it does look a lot like Church, so let’s grant this.
It also doesn’t sound too different from Marx. Elites sound like capitalists, Gentry like bourgeoisie, Labor like the proletariat, and the Underclass like the lumpenproletariat. Or maybe I’m making up patterns where they don’t exist; why should the class system of 21st century America be the same as that of 19th century industrial Europe?
There’s one more discussion of class I remember being influenced by, and that’s Unqualified Reservations’ Castes of the United States. Another one that you should read but that I’ll summarize in case you don’t:
1. Dalits are the underclass, … 2. Vaisyas are standard middle-class people … 3. Brahmins are very educated people … 4. Optimates are very rich WASPs … now they’re either extinct or endangered, having been pretty much absorbed into the Brahmins. …
Michael Church’s system (henceforth MC) and the Unqualified Reservation system (henceforth UR) are similar in some ways. MC’s Underclass matches Dalits, MC’s Labor matches Vaisyas, MC’s Gentry matches Brahmins, and MC’s Elite matches Optimates. This is a promising start. It’s a fourth independent pair of eyes that’s found the same thing as all the others. (commenters bring up Joel Kotkin and Archdruid Report as similar convergent perspectives).
I suspect the tendency to try to describe society as consisting of three broad classes (with the admission that other, perhaps tiny classes that don’t exactly fit into the others might exist) is actually just an artifact of being a three-biased society that likes to group things in threes (the Trinity, three-beat joke structure, three bears, Three Musketeers, three notes in a chord, etc.) This three-bias isn’t a human universal (or so I have read) but has probably been handed down to us from the Indo-Europeans, (“Many Indo-European societies know a threefold division of priests, a warrior class, and a class of peasants or husbandmen. Georges Dumézil has suggested such a division for Proto-Indo-European society,”) so we’re so used to it that we don’t even notice ourselves doing it.
(For more information on our culture’s three-bias and different number biases in other cultures, see Alan Dundes’s Interpreting Folklore, though I should note that I read it back in highschool and so my memory of it is fuzzy.)
(Also, everyone is probably at least subconsciously cribbing Marx, who was probably cribbing from some earlier guy who cribbed from another earlier guy, who set out with the intention of demonstrating that society–divided into nobles, serfs, and villagers–reflected the Trinity, just like those Medieval maps that show the world divided into three parts or the conception of Heaven, Hell, and Purgatory.)
At any rate, I am skeptical of any system that lumps 65% of people into one social class and 0.5% of people into a different social class as being too finely grained at one end of the scale and not finely grained enough at the other. Determining the exact number of social classes in American society may ultimately be futile–perhaps there really are three (or four) highly distinct groups, or perhaps social classes transition smoothly from one to the next with no sharp divisions.
I lean toward the latter theory, with broad social classes as merely a convenient shorthand for extremely broad generalizations about society. If you look any closer, you tend to find that people do draw finer-grained distinctions between themselves and others than “65% Working Class” would imply. For example, a friend who works in agriculture in Greater Appalachia once referred dismissively to other people they had to deal with as “red necks.” I might not be able to tell what differentiates them, but clearly my friend could. Similarly, I am informed that there are different sorts of homelessness, from true street living to surviving in shelters, and that lifetime homeless people are a different breed altogether. I might call them all “homeless,” but to the homeless, these distinctions are important.
Is social class evil?
This question was suggested by a different friend.
I suspect that social class is basically, for the most part, neutral-to-useful. I base this on the fact that most people do not work very hard to erase markers of class distinction, but instead actively embrace particular class markers. (Besides, you can’t get rid of it, anyway.)
It is not all that hard to learn the norms and values of a different social class and strategically employ them. Black people frequently switch between speaking African American Vernacular English at home and standard English at work; I can discuss religion with Christian conservatives and malevolent AI risk with nerds; you can purchase a Harley Davidson t-shirt as easily as a French beret and scarf.
(I am reminded here of an experiment in which researchers were looking to document cab drivers refusing to pick up black passengers; they found that when the black passengers were dressed nicely, drivers would pick them up, but when they wore “ghetto” clothes, the cabs wouldn’t. Cabbies: responding more to perceived class than race.)
And yet, people don’t–for the most part–mass adopt the social markers of the upper class just to fool them. They love their motorcycle t-shirts, their pumpkin lattes, even their regional accents. Class markers are an important part of peoples’ cultural / tribal identities.
But what about class conflicts?
Because every class has its own norms and values, every class is, to some degree, disagreeing with the other classes. People for whom frugality and thrift are virtues will naturally think that people who drink overpriced coffee are lacking in moral character. People for whom anti-racism is the highest virtue will naturally think that Trump voters are despicable racists. A Southern Baptist sees atheists as morally depraved fetus murderers; nerds and jocks are famously opposed to each other; and people who believe that you should graduate from college, become established in your career, get married, and then have 0-1.5 children disapprove of people who drop out of highschool, have a bunch of children with a bunch of different people, and go on welfare.
A moderate sense of pride in one’s own culture is probably good and healthy, but spending too much energy hating other groups is probably negative–you may end up needlessly hurting people whose cooperation you would have benefited from, reducing everyone’s well-being.
(A good chunk of our political system’s dysfunctions are probably due to some social classes believing that other social classes despise them and are voting against their interests, and so counter-voting to screw over the first social class. I know at least one person who switched allegiance from Hillary to Trump almost entirely to stick it to liberals they think look down on them for classist reasons.)
Ultimately, though, social class is with us whether we like it or not. Even if a full generation of orphan children were raised with no knowledge of their origins and completely equal treatment by society at large, each would end up marrying/associating with people who have personalities similar to themselves (and remember that genetics plays a large role in personality.) Just as current social classes in America are ethnically different (Southern whites are drawn from different European populations than Northern whites, for example), so would the society resulting from our orphanage experiment differentiate into genetically and personality-similar groups.
Why do Americans generally proclaim their opposition to judging others based on background status, and then act classist, anyway? There are two main reasons.
As already discussed, different classes have real disagreements with each other. Even if I think I shouldn’t judge others, I can’t put aside my moral disgust at certain behaviors just because they happen to correlate with different classes.
It sounds good to say nice, magnanimous things that make you sound more socially sensitive and aware than others, like, “I wouldn’t hesitate to go out of my way to help someone in trouble.” So people like to say these things whether they really mean them or not.
In reality, people are far less magnanimous than they like to claim they are in front of their friends. People like to say that we should help the homeless and save the whales and feed all of the starving children in Africa, but few people actually go out of their way to do such things.
There is a reason Mother Teresa is considered a saint, not an archetype.
In real life, not only does magnanimity have a cost (which the rich can better afford,) but if you don’t live up to your claims, people will notice. If you talk a good talk about loving others but actually mistreat them, people will decide that you’re a hypocrite. On the internet, you can post memes for free without having to back them up with real action, causing discussions to descend into competitive virtue-signaling in which no one wants to be the first person to admit that they actually are occasionally self-interested. (Cory Doctorow has a relevant discussion about how “reputation economies”–especially internet-based ones–can go horribly wrong.)
Unfortunately, people often confuse background and achieved status.
American society officially has no hereditary social classes–no nobility, no professions limited legally to certain ethnicities, no serfs, no Dalits, no castes, etc. Officially, if you can do the job, you are supposed to get it.
Most of us believe, at least abstractly, that you shouldn’t judge or discriminate against others for background status factors they have no control over, like where they were born, the accent they speak with, or their skin tone. If I have two resumes, one from someone named Lakeesha, and the other from someone named Ian William Esquire III, I am supposed to consider each on their merits, rather than the connotations their names invoke.
But because “status” is complicated, people often go beyond advocating against “background” status and also advocate that we shouldn’t accord social status for any reason at all. That is, full social equality.
This is not possible and would be deeply immoral in practice.
When you need heart surgery, you really hope that the guy cutting you open is a top-notch heart surgeon. When you’re flying in an airplane, you hope that both the pilot and the guys who built the plane are highly skilled. Chefs must be good at cooking and authors good at writing.
These are all forms of earned status, and they are good.
Smart people are valuable to society because they do nice things like save you from heart attacks or invent cell-phones. This is not “winning at capitalism;” this is benefiting everyone around them. In this context, I’m happy to let smart people have high status.
In a hunter-gatherer society, smart people are the ones who know the most about where animals live and how to track them, how to get water during a drought, and where the 1-inch stem they spotted last season marks a tasty underground tuber. Among nomads, smart people are the ones with the biggest mental maps of the territory, the folks who know the safest and quickest routes from good summer pasture to good winter pasture, how to save an animal from dying, and how to heal a sick person. Among pre-literate people, smart people composed epic poems that entertained their neighbors for many winters’ nights, and among literate ones, the smart people became scribes and accountants. Even the communists valued smart people, when they weren’t chopping their heads off for being bourgeois scum.
So even if we say, abstractly, “I value all people, no matter how smart they are,” the smart people do more of the stuff that benefits society than the dumb people, which means they end up with higher social status.
So, yes, high IQ is a high social status marker, and low IQ is a low social status marker, and thus at least some people will be snobs about signaling their IQ and their disdain for dumb people.
I am speaking here very abstractly. There are plenty of “high status” people who are not benefiting society at all–plenty of people who use their status to destroy society while simultaneously enriching themselves. And yes, someone can come into a community, strip out all of its resources, leave behind pollution and unemployment, happily call it “capitalism,” and enjoy high status as a result.
I would be very happy if we could stop engaging in competitive holiness spirals and stop lionizing people who became wealthy by destroying communities. I don’t want capitalism at the expense of having a pleasant place to live in.