The results were striking. Various combinations of height, weight, and head shape were significantly related to 90% of the negative C-BARQ behavioral traits. Further, in nearly all cases, the smaller the dogs, the more problematic behaviors their owners reported. Here are some examples.
Height – Short breeds were more prone to beg for food, have serious attachment problems, be afraid of other dogs, roll in feces, be overly sensitive to touch, defecate and urinate when left alone, and be harder to train. They also were more inclined to hump people’s legs.
So what’s up with small dogs? Let’s run through the obvious factors first:
Culling: Behavioral and psychological problems obviously get bred out of large dogs more quickly. An anxious pug is cute; an anxious doberman is a problem. A chihuahua who snaps at children is manageable; a rottweiler who snaps at children gets put down.
Training: Since behavioral problems are more serious in larger dogs, their owners (who chose them in the first place) are stricter from the beginning about problematic behaviors. No one cares if a corgi begs at the dinner table; a St. Bernard who thinks he’s going to eat off your plate gets unmanageable fast.
Rational behaviors: Since small dogs are small, some of the behaviors listed in the article make sense. They pee indoors by accident more often because they have tiny bladders and just need to pee more often than large dogs (and they have to drink more often). They are more fearful because being smaller than everything around them actually is frightening.
Accident of Breeding: Breeding for one trait can cause other traits to appear by accident. For example, breeding for tameness causes changes to animals’ pelt colors, for reasons we don’t yet know. Breeding for small dogs simultaneously breeds for tiny brains, and dogs with tiny brains are stupider than dogs with bigger brains. Stupider dogs are harder to train and may just have more behavioral issues. They may also attempt behaviors (guarding, hunting, herding, etc.) that are now very difficult for them due to their size.
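This "breeding for one trait drags another along" mechanism is easy to see in a toy simulation. The sketch below is purely illustrative (no real dog data; all the numbers are invented): two traits share a genetic factor, breeders select hard on one, and the unselected trait drifts along for the ride.

```python
import random

random.seed(0)

def simulate(generations=20, pop_size=1000, correlation=0.7):
    """Select hard for small size; watch a correlated trait follow."""
    # Each animal has a heritable "size" value (arbitrary units).
    sizes = [random.gauss(0, 1) for _ in range(pop_size)]
    for _ in range(generations):
        # Breeders keep only the smallest 20% each generation.
        parents = sorted(sizes)[: pop_size // 5]
        # Offspring inherit a parent's size plus some noise.
        sizes = [random.gauss(random.choice(parents), 0.3)
                 for _ in range(pop_size)]
    # A second trait (say, brain volume) was never selected on
    # directly, but shares a genetic factor with size.
    brains = [correlation * s +
              random.gauss(0, (1 - correlation ** 2) ** 0.5)
              for s in sizes]
    return sum(sizes) / pop_size, sum(brains) / pop_size

mean_size, mean_brain = simulate()
print(f"mean size after selection: {mean_size:.2f}")
print(f"mean unselected trait:     {mean_brain:.2f}")
```

After twenty generations both means sit far below their starting point of zero, even though only size was ever selected on, which is the sense in which breeding for tiny dogs could drag brain size (and whatever depends on it) along by accident.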
Accident of Training: People get small dogs and then stick them in doggy carriages, dress them in doggy clothes, and otherwise baby them, preventing them from being properly trained. No wonder such dogs are neurotic.
And finally, That’s not a Bug, it’s a Feature: Small dogs have issues because people want them to.
Small dogs are bred to be companions to people, usually women (often lonely, older women whose children have moved out of the house and don’t call as often as they should). As such, these dogs are bred to have amusing, human-like personalities–including psychological problems.
Lonely people desire dogs that will stay by them, and so favor anxious dogs. Energetic people favor hyperactive dogs. Anti-social people who don’t want to bond emotionally with others get a snake.
There’s an analogy here with other ways people meet their emotional/psychological needs, like Real Dolls and fake babies (aka “reborns”). The “reborn” doll community contains plenty of ordinary collectors and many grieving parents whose babies died or were stillborn and some older folks with Alzheimer’s, as well as some folks who clearly take it too far and enter the creepy territory.
Both puppies and babydolls are, in their way, stand-ins for the real thing (children), but dogs are also actually alive, so people don’t feel stupid taking care of dogs. Putting your dog in a stroller or dressing it up in a cute outfit might be a bit silly, but certainly much less silly than paying thousands of dollars to do the same thing to a doll.
And unlike dolls, dogs actually respond to our emotions and have real personalities. As Jon Katz argues, we now use dogs, in effect, for their emotional work:
In an increasingly fragmented and disconnected society, dogs are often treated not as pets, but as family members and human surrogates. The New Work of Dogs profiles a dozen such relationships in a New Jersey town, like the story of Harry, a Welsh corgi who provides sustaining emotional strength for a woman battling terminal breast cancer; Cherokee, companion of a man who has few friends and doesn’t know how to talk to his family; the Divorced Dogs Club, whose funny, acerbic, and sometimes angry women turn to their dogs to help them rebuild their lives; and Betty Jean, the frantic founder of a tiny rescue group that has saved five hundred dogs from abuse or abandonment in recent years.
Normally we’d call this “bonding,” “loving your dog,” or “having a friend,” but we moderns have to overthink everything and give it fussy labels like “emotional work.” We’re silly, but thankfully our dogs put up with us.
The ancestors of horses–small, multi-toed quadrupeds–emerged around 50 million years ago, but horses as we know them (and their wild cousins) evolved from a common ancestor around 6 million years ago. Horses in those days were concentrated in North America, but spread via the Bering land bridge to Eurasia and Africa, where they differentiated into zebras, asses, and “wild” horses.
When humans first encountered horses, we ate them. American horses became extinct around 14,000-10,000 years ago, first in Beringia and then in the rest of the continent–coincidentally about the time humans arrived here. The first known transition from hunting horses to herding and ranching them occurred around 6,000 years ago among the Botai of ancient Kazakhstan, not far from the proto-Indo-European homeland (though the Botai themselves do not appear to have been PIEs). These herds were still managed for meat, of which the Botai ate tons, until some idiot teenager decided to impress his friends by riding one of the gol-dang things. Soon after, the proto-Indo-Europeans got the idea and went on a rampage, conquering Europe, Iran, and the Indian subcontinent (and then a little later North and South America, Africa, Australia, and India again). Those horses were useful.
Oddly, though, it appears that those Botai horses are not the ancestors of the modern horses people ride today–but instead are the ancestors of the Przewalski “wild” horse. The Przewalski was thought to be a truly wild, undomesticated species, but it appears to have been a kind of domesticated horse* that went feral, much like the mustangs of the Wild West. Unlike the mustang, though, the Przewalski is genetically distinct, with 66 chromosomes to the domesticated horse’s 64 (oddly, hybrids between the two, with 65 chromosomes, are still fertile). When exactly the Przewalski obtained their extra chromosomes, I don’t know.
*This, of course, depends on the assumption that the Botai horses were “domesticated” in the first place.
Instead, modern, domesticated horses are believed to have descended from the wild Tarpan, though as far as I know, genetic studies proving this have not yet been done. The Tarpan is extinct, but survived up to the cusp of the twentieth century. (Personally, I’m not putting odds on any major tarpan herds in the past couple thousand years having had 100% wild DNA, but I wouldn’t classify them as “feral” just because of a few escaped domestics.)
Thus the horse was domesticated multiple times–especially if we include that other useful member of the genus Equus, the ass (or donkey, if you’d prefer). The hardworking little donkey does not enjoy its cousin’s glamorous reputation, and Wikipedia reports,
Throughout the world, working donkeys are associated with the very poor, with those living at or below subsistence level. Few receive adequate food, and in general donkeys throughout the Third World are under-nourished and over-worked.
The donkey is believed to have been domesticated from the wild African ass, probably in ancient Nubia (southern Egypt/northern Sudan). From there it spread up the river to the rest of Egypt, where it became an important work animal, and from there to Mesopotamia and the rest of the world.
Wild African asses still exist, but they are critically endangered.
I have no idea why equines have so much chromosomal diversity; dogs have been domesticated for much longer than horses, but are still interfertile with wolves and even coyotes (tbf, maybe horses could breed with tarpans).
Interestingly, domestication causes a suite of changes to a species’ appearance that are not obviously useful. Recently-domesticated foxes exhibit pelt colors and patterns similar to those of domesticated dogs, not wild foxes. We humans have long hair, unlike our chimp-like ancestors. Horses also have long manes, unlike wild zebras, asses, and tarpans. Horses have evolved, then, to look rather like humans.
Also like humans, horses have different male and female histories. Male horses were quite difficult to tame, and so early domesticators only obtained a few male horses. Females, by contrast, were relatively easy to gentle, so breeders often restocked their herds with wild females. As a result, domesticated horses show far more variation in their mitochondrial DNA than their Y chromosomes. The stocking of herds from different groups of wild horses most likely gave rise to 17 major genetic clusters:
From these sequences, a phylogenetic network was constructed that showed that most of the 93 different mitochondrial (mt)DNA types grouped into 17 distinct phylogenetic clusters. Several of the clusters correspond to breeds and/or geographic areas, notably cluster A2, which is specific to Przewalski’s horses, cluster C1, which is distinctive for northern European ponies, and cluster D1, which is well represented in Iberian and northwest African breeds. A consideration of the horse mtDNA mutation rate together with the archaeological timeframe for domestication requires at least 77 successfully breeding mares recruited from the wild. The extensive genetic diversity of these 77 ancestral mares leads us to conclude that several distinct horse populations were involved in the domestication of the horse.
The wild mustangs of North America might have even more interesting DNA:
The researchers said four family groups (13.8%) with 31 animals fell into haplogroup B, with distinct differences to the two haplogroup L lineages identified.
The closest mitochondrial DNA sequence was found in a Thoroughbred racing horse from China, but its sequence was still distinct in several areas.
The testing also revealed links to the mitochondrial DNA of an Italian horse of unspecific breed, the Yunnan horse from China, and the Yakutia horse from central Siberia, Russia.
Haplogroup B seems to be most frequent in North America (23.1%), with lower frequencies in South America (12.68%) and the Middle East (10.94%) and Europe (9.38%).
“Although the frequency of this lineage is low (1.7%) in the Asian sample of 587 horses, this lineage was found in the Bronze Age horses from China and South Siberia.”
Westhunter suggests that this haplogroup could have originated from some surviving remnant of American wild horses that hadn’t actually been completely killed off before the Spanish mustangs arrived and bred with them. I’d offer a more prosaic possibility: the Russians brought them while colonizing Alaska and the coast down to northern California. Either way, it’s an intriguing finding.
The horse has been man’s companion for thousands of years and helped him conquer most of the Earth, but the recent invention of internal and external combustion engines (e.g., the Iron Horse) has put most horses out to pasture. In effect, they have become obsolete. Modern horses have much easier lives than their hard-working plow and wagon-pulling ancestors, but their populations have shrunk enormously. They’re not going to go extinct, because rich people still like them (and they are still useful in parts of the world where cars cannot easily go), but they may suffer some of the problems of inbreeding found in genetically narrow dog breeds.
Maybe someday, significant herds of wild horses will roam free again.
Well, there’s a clickbaity title if ever I wrote one.
Nevertheless, human breasts are strange. Sure, all females of the class Mammalia are equipped with mammary glands for producing milk, but humans alone possess permanent, non-functional breasts.
Yes, non-functional: the breast tissue that develops during puberty and that you see on women all around you is primarily fat. Fat does not produce milk. Milk ducts produce milk. They are totally different things.
As in all other mammals, the milk-producing parts of the breasts only activate–make milk–immediately after a baby is born. At any other time, milk production is a useless waste of calories. And when mothers begin to lactate, breasts noticeably increase in size due to the sudden production of milk.
A number of factors associated with low milk supply have been identified, such as nipple pain, ineffective nursing, hormonal disorders, breast surgery, certain medications, and maternal obesity. … Research into breast size and milk production shows that milk supply is not dependent on breast size, but rather on the amount of epithelial tissue contained in a breast that is capable of making milk …
However, in addition to baby attachment issues, accumulating evidence shows that a major factor preventing overweight and obese mothers from breastfeeding is the inability of their breast epithelial cells to start producing copious amounts of milk after birth. This is often referred to as unsuccessful initiation of lactation. …
a recent study took advantage of breast epithelial cells non-invasively isolated from human milk. In these cells, certain genes are turned on, which enable the cells to gradually make milk as the breast matures during pregnancy, and then deliver it to the baby during breastfeeding.
The study reported a negative association between maternal BMI (body mass index), and the function of a gene that represents the milk-producing cells. This suggested that the breast epithelial tissue is not as mature and ready to make copious amounts of milk in mothers with higher BMI. Most likely, the large breasts of overweight or obese mothers contain more fat cells than milk-making cells, which can explain the low milk supply of many of these mothers.
Therefore, breast size does not necessarily translate to more milk-producing cells or higher ability to make milk.
More fat=less room for milk production.
Interestingly, average cup size varies by country. Of course the data may not be 100% accurate, and lumping everyone together at the national level obscures many smaller groups, like Siberians, but it still indicates some general trends that we can probably trust.
If breasts don’t actually make milk, then why on Earth do we have them? Why are women cursed with lumpy fat blobs hanging off their chests that have to be carefully smushed into specialized clothing just so we can run without them flopping around painfully?
And for that matter, why do we think they look nice?
One reasonable theory holds that breasts are really just front-butts. Our apish ancestors, like modern chimpanzees, most likely did not copulate ad libitum like we do, but only when females were fertile. Female fertility among our chimpish relatives is signaled via a significant swelling and reddening of their rear ends, a clear signal in a species that wears no clothes and often walks on four limbs.
When humans began walking consistently on two legs, wearing clothes, and looking at each other’s faces, this obvious signal of female fertility was lost, but not our desire to look at rear-ends. So we simply transferred this desire to women’s fronts and selectively had more children with the women who piqued our interests by having more butt-shaped cleavage.
In support of this theory, many women go to fair lengths to increase the resemblance between their ample bosoms and an impressive behind; against this theory is the fact that no other bottom-obsessed species has accidentally evolved a front-butt.
I realized yesterday that there is an even simpler potential explanation: humans are just smart enough to be stupid.
Most of us know that breasts produce milk. Few of us really understand the mechanism of how they produce milk. I had to explain that fat lumps don’t produce milk at the beginning of this post because so few people actually understand this. Far more people think “Big breasts=lots of milk” than think “big breasts=lactation problems.” Humans have probably just been accidentally selecting for big breasts for millennia while trying to select for milk production.
Of course there are smart people who are insane, and dumb people who are completely rational. But if we define intelligence as having something to do with accurately understanding and interpreting the information we constantly receive from the world, necessary to make accurate predictions about the future and how one’s interactions with others will go, there’s a clear correlation between accurately understanding the world and being sane.
In other words, a sufficiently dumb person, even a very sane one, will be unable to distinguish between accurate and inaccurate depictions of reality and so can easily espouse beliefs that sound, to others, completely insane.
Is there any way to distinguish between a dumb person who believes wrong things by accident and a smart person who believes wrong things because they are insane?
Digression: I have a friend who was homeless for many years. Eventually he was diagnosed as mentally ill and given a disability check.
“Why?” he asked, but received no answer. He struggled (and failed) for years to prove that he was not disabled.
Eventually he started hearing voices, was diagnosed with schizophrenia, and put on medication. Today he is not homeless, due at least in part to the positive effects of anti-psychotics.
The Last Psychiatrist has an interesting post (deleted from his blog, but re-posted elsewhere) on how SSI is determined:
Say you’re poor and have never worked. You apply for Welfare/cash payments and state Medicaid. You are obligated to try and find work or be enrolled in a jobs program in order to receive these benefits. But who needs that? Have a doctor fill out a form saying you are Temporarily Incapacitated due to Medical Illness. Yes, just like 3rd grade. The doc will note the diagnosis, however, it doesn’t matter what your diagnosis is, it only matters that a doctor says you are Temporarily Incapacitated. So cancer and depression both get you the same benefits.
Nor does it matter if he medicates you, or even believes you, so long as he signs the form and writes “depression.”(1) The doc can give you as much time off as he wants (6 months is typical) and you can return, repeatedly, to get another filled out. You can be on state medicaid and receive cash payments for up to 5 years. So as long as you show up to your psych appointments, you can receive benefits with no work obligation.
“That’s not how it works for me”
you might say, which brings us to the whole point: it’s not for you. It is for the entire class of people we label as poor, about whom comic Greg Geraldo joked: “it’s easy to forget there’s so much poverty in the United States, because the poor people look just like black people.” Include inner city whites and hispanics, and this is how the government fights the War On Poverty.
In the inner cities, the system is completely automated. Poor person rolls in to the clinic, fills out the paperwork (doc signs a stack of them at the end of the day), he sees a therapist, a doctor, +/- medications, and gets his benefits.
There’s no accountability, at all. I have never once been asked by the government whether the person deserved the money, the basis for my diagnosis– they don’t audit the charts, all that exists is my sig on a two page form. The system just is.
Enter SSI, Supplemental Security Income. You can earn lifetime SSI benefits (about $600/mo + medical insurance) if “you” can “show” you are “Permanently Disabled” due to a “medical illness.”
“You“= your doc who fills out a packet with specific questions; and maybe a lawyer who processes the massive amounts of other paperwork, and argues your case, and charges about 20% of a year’s award.
“show” has a very specific legal definition: whatever the judge feels like that day. I have been involved in thousands of these SSI cases, and to describe the system as arbitrary is to describe Blake Lively as “ordinary.”
“Permanently disabled” means the illness prevents you from ever working. “But what happens when you get cured?” What is this, the future? You can’t cure bipolar.
“Medical illness” means anything. The diagnosis doesn’t matter, only that “you” show how the diagnosis makes it impossible for you to work. Some diagnoses are easier than others, but none are impossible. “Unable to work” has specific meaning, and specific questions are asked: ability to concentrate, ability to complete a workweek, work around others, take criticism from supervisors, remember and execute simple/moderately difficult/complex requests and tasks, etc.
Fortunately, your chances of being awarded SSI are 100%…
It’s a good post. You should read the whole thing.
TLP’s point is not that the poor are uniformly mentally ill, but that our country is using the disability system as a means of routing money to poor people in order to pacify them (and maybe make their lives better).
I’ve been playing a bit of sleight of hand, here, subbing in “poor” and “dumb.” But they are categories that highly overlap, given that dumb people have trouble getting jobs that pay well. Despite TLP’s point, many of the extremely poor are, by the standards of the middle class and above, mentally disabled. We know because they can’t keep a job and pay their bills on time.
“Disabled” is a harsh word to some ears. Who’s to say they aren’t equally able, just in different ways?
Living under a bridge isn’t being differently-abled. It just sucks.
Normativity bias happens when you assume that everyone else is just like you. Middle and upper-middle class people tend to assume that everyone else thinks like they do, and the exceptions, like guys who think the CIA is trying to communicate with them via the fillings in their teeth, are few and far between.
As for the vast legions of America’s unfortunates, they assume that these folks are basically just like themselves. If they aren’t very bright, this only means they do their mental calculations a little slower–nothing a little hard work, grit, mindfulness, and dedication can’t make up for. The fact that anyone remains poor, then, has to be the fault of either personal failure (immorality) or outside forces like racism keeping people down.
These same people often express the notion that academia or Mensa are crawling with high-IQ weirdos who can barely tie their shoes and are incapable of socializing with normal humans, to which I always respond that furries exist.
These people need to get out more if they think a guy successfully holding down a job that took 25 years of work in the same field to obtain and that requires daily interaction with peers and students is a “weirdo.” Maybe he wears more interesting t-shirts than a middle manager at BigCorp, but you should see what the Black Hebrew Israelites wear.
I strongly suspect that what we would essentially call “mental illness” among the middle and upper classes is far more common than people realize among the lower classes.
As I’ve mentioned before, there are multiple kinds of intellectual retardation. Some people suffer physical injuries (like shaken baby syndrome or encephalitis), some have genetic defects like Down’s Syndrome, and some are simply dull people born to dull parents. Intelligence is part genetic, so just as some people are gifted with lucky smart genes, some people are visited by the stupid fairy, who only leaves dumb ones. Life isn’t fair.
Different kinds of retardation manifest differently, with different levels of overall impairment in life skills. There are whole communities where the average person tests as mentally retarded, yet people in these communities go about providing for themselves, building homes, raising their children, etc. They do not do so in the same ways as we would–and there is an eternal chicken-and-egg debate about whether the environment they are raised in causes their scores, or their scores cause their environment–but nevertheless, they do.
All of us humans are descended from people who were significantly less intelligent than ourselves. Australopithecines were little smarter than chimps, after all. The smartest adult pygmy chimps (bonobos), like Kanzi, only know about 3,000 words, which is about the same as a 3 or 4 year old human. (We marvel that chimps can do things a kindergartener finds trivial, like turn on the TV.) Over the past few million years, our ancestors got a lot smarter.
How do chimps think about the world? We have no particular reason to assume that they think about it in ways that substantially resemble our own. While they can make tools and immediately use them, they cannot plan for tomorrow (dolphins probably beat them at planning). They do not make sentences of more than a few words, much less express complex ideas.
Different humans (and groups of humans) also think about the world in very different ways from each other–which is horrifyingly obvious if you’ve spent any time talking to criminals. (The same people who think nerds are weird and bad at socializing ignore the existence of criminals, despite strategically moving to neighborhoods with fewer of them.)
Even non-criminal communities have all sorts of strange practices, including cannibalism, human sacrifice, wife burning, genital mutilation, coprophagy, etc. Anthropologists (and economists) have devoted a lot of effort to trying to understand and explain these practices as logical within their particular contexts–but a different explanation is possible: that different people sometimes think in very different ways.
For example, some people think there used to be Twa Pygmies in Ireland, before that nefarious St. Patrick got there and drove out all of the snakes. (Note: Ireland didn’t have snakes when Patrick arrived.)
(My apologies for this being a bit of a ramble, but I’m hoping for feedback from other people on what they’ve observed.)
For the past three days, I have been seized with a passion for cleaning and organizing the house that my husband describes as “a little scary.” So far I’ve found a missing hairbrush, the video camera, (it was in a lunchbox under some papers under some toys), and the floor; reorganized the bedroom, built a mini-chest of drawers out of cardboard, and returned my mother’s plates–and I’m not even pregnant.
A mere week ago, my limbs hurt whenever I moved. I wasn’t sad or depressed, but it simply felt like pushing boulders every time I needed to walk over to the kitchen.
I woke up this morning with high spirits, sore arms from carrying laundry and a question: is spring cleaning an instinct?
You don’t hear much about fall cleaning or winter cleaning. No one bothers with night cleaning or rainy day cleaning. Only Spring receives special mention for its burst of cleaning.
But the drops in estrogen and serotonin aren’t the only things that spur the desire to clean up. Before your period, your progesterone levels also drop, which combines the impulse to clean with an instinct to “nest.” We see this tendency manifest itself more dramatically in pregnant women, who in their later months of pregnancy have low progesterone levels — which often lead them to go into a frenzy of cleaning house and nesting in order to prepare for the baby.
The PMS-related drop in progesterone is a less-intense version of the same phenomenon.
Well, it’s no myth; winter causes us to be inherently less active and motivated. That’s right; your brain creates melatonin when there is less sunlight on cold dreary days, making you sleepy! Come spring, Mother Nature provides us a natural energy boost by giving us warmer weather and extra sunlight. The dreary days of snow are (hopefully) over and our natural instinct is to explore and interact with others. Although it may seem like a western tradition, cultures from all over the world have been spring cleaning for thousands of years.
Hopefully I can use this newfound energy to write more, because my posting has been deficient of late.
Window Genie (which I suspect is really a window-cleaning service) also notes that spring-cleaning is a cross-cultural phenomenon. I was just commenting on this myself, in a flurry of dish-washing. Do the Jews not clean thoroughly before Passover? Don’t they go through the house, removing all of the bits of old bread, vacuuming and sweeping and dusting to get out even the slightest bit of crumbs or stray yeast? Some even purchase a special feather and spoon kit to dust up the last few crumbs from the corners of the cupboards, then burn them. Burning seems a bit extreme, yet enjoyable–your cleaning is thoroughly done when you’ve burned the last of it.
I would be surprised if “spring cleaning” exists in places that effectively don’t have spring because their weather is warm all-year-long. Likely they have some other traditions, like “Dry season dusting” or “annual migration.” (I find moving an especially effective way to motivate oneself to throw out excess belongings.)
It’s no secret that sales of cleaning and organizing products ramp up in spring, but the claim that our seasonal affection for washing is merely “cultural” is highly suspect–mere “culture” is an extremely ineffective way of getting me to do the laundry.
The claim that Spring Cleaning started in ancient Iran is even more nonsensical. This is simply mistaking the presence of written records in one place and not another for evidence that a tradition is older there. There is no cultural connection between modern American housewives vacuuming their carpets and ancient Iranian cleaning habits.
I do wish people wouldn’t say such idiotic things; I certainly didn’t work through dinner last night because of a love of Zoroaster. It is far more likely that I and the Persians–and millions of other people–simply find ourselves motivated by the same instincts, for we are both humans, and humans, like all higher animals, make and arrange our shelters to suit our needs and convenience. The spider has her web, the snake his hole, the bee her hive. Chimps build nests and humans, even in the warmest of climates, build homes.
These homes must be kept clean, occasionally refreshed and rid of dust and disease-bearing parasites.
Like the circle of the seasons, let us end with the beginning, from The Wind in the Willows:
The Mole had been working very hard all morning, spring-cleaning his little home. First with brooms, then with dusters; then on ladders and steps and chairs, with a brush and a pail of whitewash; till he had dust in his throat and eyes and splashes of whitewash all over his black fur, and an aching back and weary arms. Spring was moving in the air above and in the earth below and around him, penetrating even his dark and lowly little house with its spirit of divine discontent and longing. It was small wonder, then, that he suddenly flung down his brush on the floor, said “Bother!” and “Oh blow!” and also “Hang spring cleaning!” and bolted out of the house without even waiting to put on his coat. Something up above was calling to him…
A “social construct”–in the context of groups of people–is just a stereotype. We’ll call it an “idealized version.” We learn this idealized version by interacting with many individual instances of a particular type of thing and learning to predict its typical behaviors and characteristics.
Suppose I asked you to draw a picture of a man and woman. Go ahead, if you want; then you can compare it to the draw-a-man test.
Out in reality, there are about 7 billion men and women; there is no way you drew someone who looks like all of them. Chances are you drew the man somewhat taller than the woman, even though in reality, there are millions of men and women who are the same height. You might have even drawn hair on the figures–long hair for the woman, short for the man–and some typical clothing, even though you know there are many men with long hair and women with short.
In other words, you drew an idealized version of the pair in order to make it clear to someone else what, exactly, you were drawing.
Our idealized pictures work because they are true on average. The average woman is shorter than the average man, so we draw the woman shorter than the man–even though we know perfectly well that short men exist.
Once an ideal exists, people (it seems) start using artificial means to try to achieve it (like wearing makeup,) which shifts the average, which in turn prompts people to take more extreme measures to meet that ideal.
This may lead to run-away beauty or masculinity trends that look completely absurd from the outside, like foot binding, adult circumcision rituals, or peacocks’ tails. Or breasts–goodness knows why we have them while not nursing.
Our idealized images work less well for people far from the average, or who don’t want to do the activities society has determined are necessary to meet the ideal.
Here’s an interesting survey of whether people (in this case, whites) consider themselves masculine or feminine, broken down by political orientation.
The same trend holds for women–conservative women are much more likely to consider themselves to be very feminine than liberal women. Of course, ideology has an effect on people’s views, but the opposite is probably also true–people who don’t feel like they meet gender ideals are more likely to think those ideals are problematic, while people who do meet them are more likely to think they are perfectly sensible.
And this sort of thinking applies to all sorts of groups–not just men and women. Conservatives probably see themselves as better encapsulating the ideal of their race, religion, nationality (not just American conservatives, but conservatives of all stripes,) while liberals are probably more likely to see themselves as further from these ideals. The chief exceptions are groups where membership is already pre-determined as liberal, like vegetarians.
This may also account for the tendency people have, especially of late, to fight over certain representations. An idealized representation of “Americans” may default to white, since whites are still the majority in this country, but our growing population of non-whites would also like to be represented. This leads to pushback against what would be otherwise uncontroversial depictions (and the people who fit the ideal are not likely to appreciate someone else trying to change it on them.)
There is strength in numbers, but is there wisdom?
I’ve heard from multiple sources the claim that parenting, paradoxically, gets easier after the fourth child. There are several simple explanations for this phenomenon: people get more skilled at parenting after lots of practice; the older kids start helping out with the younger ones, etc.
But what if the phenomenon rests on something much more basic about human psychology–our desire to imitate others?
(Perhaps you don’t, dear reader. There are always exceptions.)
As Aristotle put it, man is a political animal–by which he meant that we are inherently social and prone to building communities (polities) together, not that we are inherently prone to arguing about who should govern North Carolina, though that may be political, too. In Aristotle’s words, a man who lives entirely alone is either a beast (living like an animal) or a god (able to fulfill all of his own needs without recourse to other humans.) Normal humans depend in many ways on other humans.
Compared to our pathetic ability to learn math (just look at most people’s SAT-math scores) and inability to read without direct instruction, humans learn socially-imparted skills with ease–speaking multiple languages, playing games, asserting dominance over each other, knowing which clothes are fashionable, and cracking a socially-appropriate joke.
Social learning comes so naturally to people that we only notice it in cases of extreme deficit–like autism–or when parents protest that their children are becoming horribly corrupted by their peers.
So perhaps households with more than 4 children have hit a threshold beyond which social learning takes over and the younger children simply seem to “absorb” knowledge from their older siblings instead of having to be explicitly taught.
Consider learning to eat, a hopefully simple task. We are born with instincts to nurse, put random things in our mouths, and swallow. Preventing babies from eating random non-food objects is a bit of a problem for new parents. But learning things like “how to get this squishy food into your mouth with a spoon without also getting it everywhere else in the room” is much more complicated–and humans take food rituals to much more complicated heights than strained peas and carrots.
Parents of a first child put a great deal of effort into teaching them to eat (something that ought to be an instinct.) Those with means puree fresh veggies, chop bits of meat, show a sudden interest in organics, and sit down to spoon every single last bit into their infants’ mouths. It is as if they are convinced that kids cannot learn to eat without at least as much instruction as a student learning to wield a welding torch. (And based on my own experience, they’re probably right.)
By contrast, parents of multiple children have–by necessity–relaxed. As a popular comic once depicted (though I can’t find it now,) feeding at this point becomes throwing Cheerios at the highchair as you run by.
Yet I’ve never seen any evidence that the younger children in large families are likely to be malnourished–they seem to catch the Cheerios on the fly and do just fine.
What if imitation is a strong factor in larger families, allowing infants and young children to learn skills like “how to eat” without needing direct parental instruction just by watching their older siblings? You might object that even infants in single parent households could learn to eat by imitating their parents (and they probably do,) but having more people around probably enforces the behavior more strongly, and having younger children around gives an example that is much more similar to the infant. We adults are massive compared to children, after all.
If basic learning of life skills proceeds more easily in an environment with more peers (for infants or adults,) then what effects should we expect from our current trend toward extreme atomization?
To me, growing up in that trailer park meant playing until dark with neighborhood kids, building tree houses and snow forts. Listening out my bedroom window for the sound of my dad’s pickup truck leaving for work in the early morning. Riding my bike down the big hill at the top of the lot, avoiding potholes and feeling safe because there wasn’t much traffic and if I fell and skinned my knee, someone would come out on their front porch and ask if I was okay.
Some of the only happy memories I have of my childhood were from that time in my life, before my parents were thrust into insurmountable debt, before my mother was hospitalized, before I had to go live with my grandmother. Nana had a real house. She didn’t live in a trailer. But when she would scream at me or try to attack me as I squeezed by her and fled upstairs, I wished I had neighbors close by to hear her — to believe me, and to perhaps even help.
The most dysfunctional and unstable years of my life were spent in a real house, with four walls and a slanted roof — where fences went up between the houses so that no one ever had to feel responsible for what went on behind their neighbor’s front door.
This is more about atomization than learning, but still interesting. Is it good for humans to be so far apart? To live far from relatives, in houses with thick walls, as single children or single adults, working and commuting every day among strangers?
Certainly the downsides of being among relatives are well-documented. Many tribal societies have downright cruel customs directed at relatives, like sati or adult circumcision. But that doesn’t mean that the extreme opposite–total atomization–is perfect. Atomization carries other risks. Among them, staying indoors and not socializing with our neighbors may cause us to lose some of our social knowledge, our ability to learn how to exist together.
We might expect that physical atomization due to technological change (sturdier houses, more entertaining TV, comfier climate control systems,) could cause symptoms in people similar to those caused by medical deficits in social learning, like autism. A recent study on the subject found an interesting variation between the brains of normies and autists:
So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.
This might explain the high rates of body dysmorphias in autism. It might also explain the high rates in society.
I remember another study which I read ages ago which found that people basically thought about “God” in the same parts of their brain where they thought about themselves. This explains why God tends to have the same morals as His believers. If autists have trouble imagining themselves, then they may also have trouble imagining God–and this might explain rising atheism rates.
Even our rising autism rates, though probably driven primarily by shifts in diagnostic fads, might be influenced by shrinking families and greater atomization, as kids with borderline conditions might show more severe symptoms if they are also more isolated.
On the other hand, social media is allowing people to come together and behave socially in new and ever larger groups.
For all their weaknesses, autists are probably better than normies at certain kinds of tasks, like abstract reasoning where you don’t want to think too much about yourself. I have long suspected that normies balk at philosophical dilemmas such as the trolley problem because they over-empathize with the subjects. Imagining themselves as one of the victims of the runaway trolley causes them distress, and distress causes them to attack the person causing them distress–the philosopher.
And so the citizens of Athens condemned Socrates to death.
But just as people can overcome their natural and very sensible fear of heights in order to work on skyscrapers, perhaps they can train themselves not to empathize with the subjects of trolley problems. Spending time on problems with no human subjects (such as mathematics or engineering) may also help people practice ways of approaching problems that don’t immediately resort to imagining themselves as the subject. Conversely, perhaps a bit of atomization (as seen historically in countries like Britain and France, and recently AFAIK in Japan,) helps equip people to think about difficult, non-human-related mathematical or engineering problems.
This is a little quote from E. O. Wilson’s Sociobiology that I deleted from the previous post for being a bit tangential, but it is still interesting:
Guppies (Lebistes reticulatus) are well known for the stabilization of their populations in aquaria by the consumption of their excess young.
So that’s what happened to my pet fish! I always wondered why they seemed to appear and disappear at random. It wasn’t a big enough bowl to logically be losing them in.
Um. Poor guppies.
“Cannibalism is commonplace in the social insects, where it serves as a means of conserving nutrients as well as a precise mechanism for regulating colony size. The colonies of all termite species so far investigated promptly eat their own dead and injured. Cannibalism is in fact so pervasive in termites that it can be said to be a way of life in these insects. …
The eating of immature stages is common in the social Hymenoptera.
Hymenoptera is an order of insects with over 150,000 species, including ants and bees. (Termites, despite also being social, are not members of hymenoptera, and are more closely related to cockroaches.)
Among most or all hymenopterans, sex is determined by the number of chromosomes an individual possesses. Fertilized eggs get two sets of chromosomes (one from each parent’s respective gametes) and develop into diploid females, while unfertilized eggs only contain one set (from the mother) and develop into haploid males. The act of fertilization is under the voluntary control of the egg-laying female, giving her control of the sex of her offspring. This phenomenon is called haplodiploidy.
However, the actual genetic mechanisms of haplodiploid sex determination may be more complex than simple chromosome number. In many Hymenoptera, sex is actually determined by a single gene locus with many alleles. In these species, haploids are male and diploids heterozygous at the sex locus are female, but occasionally a diploid will be homozygous at the sex locus and develop as a male, instead. This is especially likely to occur in an individual whose parents were siblings or other close relatives. Diploid males are known to be produced by inbreeding in many ant, bee, and wasp species. Diploid biparental males are usually sterile but a few species that have fertile diploid males are known.
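The single-locus mechanism described above is simple enough to sketch in a few lines of code. This is a toy model, not a simulation of any real species–the allele names and function names are made up for illustration:

```python
import random

def csd_sex(alleles):
    """Single-locus complementary sex determination (CSD).

    `alleles` is a tuple of sex-locus alleles: one entry for a haploid
    (unfertilized) egg, two for a diploid (fertilized) egg.
    """
    if len(alleles) == 1:
        return "haploid male"      # unfertilized egg
    if alleles[0] == alleles[1]:
        return "diploid male"      # homozygous at the sex locus (usually sterile)
    return "female"                # heterozygous diploid

def offspring(mother, father, fertilize):
    """The egg-laying female controls fertilization, and hence offspring sex."""
    egg = random.choice(mother)    # one of the mother's two sex-locus alleles
    if not fertilize:
        return csd_sex((egg,))
    return csd_sex((egg, father))  # a haploid father contributes his single allele

# Inbreeding raises the odds that both parents carry the same allele,
# which is how fertilized eggs end up as diploid males:
print(offspring(mother=("A", "B"), father="A", fertilize=True))
```

Run repeatedly, the last line yields "female" or "diploid male" with equal probability, since the mother passes allele A half the time.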
One consequence of haplodiploidy is that females on average actually have more genes in common with their sisters than they do with their own daughters. Because of this, cooperation among kindred females may be unusually advantageous, and has been hypothesized to contribute to the multiple origins of eusociality within this order. In many colonies of bees, ants, and wasps, worker females will remove eggs laid by other workers due to increased relatedness to direct siblings, a phenomenon known as worker policing.
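The arithmetic behind that claim is short enough to spell out. A female’s genome is half paternal and half maternal; her haploid father had only one genome to give, so two full sisters always share his entire contribution, while each maternal gene is shared with probability 1/2:

```python
# Expected identical-by-descent relatedness between two full sisters under
# haplodiploidy: the paternal half is always shared, the maternal half
# is shared half the time.
r_sisters = 0.5 * 1.0 + 0.5 * 0.5   # = 0.75

# A mother passes each daughter half of her own genome:
r_mother_daughter = 0.5

# Hence a female is more related to her full sisters than to her own daughters:
assert r_sisters > r_mother_daughter
print(r_sisters)  # 0.75
```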
Another consequence is that hymenopterans may be more resistant to the deleterious effects of inbreeding. As males are haploid, any recessive genes will automatically be expressed, exposing them to natural selection. Thus, the genetic load of deleterious genes is purged relatively quickly.
Back to Wilson:
In ant colonies, all injured eggs, larvae, and pupae are quickly consumed. When colonies are starved, workers begin attacking healthy brood as well. In fact, there exists a direct relation between colony hunger and the amount of brood cannibalism that is precise enough to warrant the suggestion that the brood functions normally as a last-ditch food supply to keep the queen and workers alive. In the army ants of the genus Eciton, cannibalism has apparently been further adapted to the purposes of caste determination. According to Schneirla (1971), most of the female larvae in the sexual generation (the generation destined to transform into males and queens) are consumed by workers. The protein is converted into hundreds or thousands of males and several of the very large virgin queens. It seems to follow, but is far from proved, that female larvae are determined as queens by this special protein-rich diet. Other groups of ants, bees, and wasps show equally intricate patterns of specialized cannibalism…
Nomadic male lions of the Serengeti plains frequently invade the territories of prides and drive away or kill the resident males. The cubs are also sometimes killed and eaten during territorial disputes. … Infant mortality is much higher as a result of the disturbances [in the social order of langurs.] In the case of P. entellus, [a langur species,] the young are actually murdered by the usurper…
The main character of the first 4 chapters of Harry Potter isn’t Harry: it’s the Dursleys:
Mr and Mrs Dursley, of number four, Privet Drive, were proud to say that they were perfectly normal, thank you very much. They were the last people you’d expect to be involved in anything strange or mysterious, because they just didn’t hold with such nonsense.
The Dursleys are awful and abusive in an over-the-top, Roald Dahl way that somehow manages not to cause Harry any serious emotional problems, which even I, a hard-core hereditarian, would find improbable if Harry were a real boy. But Harry isn’t the point: watching the Dursleys get their comeuppance is the point.
JRR Tolkien and JK Rowling both focused on the same group of people–common English peasants–but Tolkien’s depiction of the Hobbits are much more sympathetic than Rowling’s Muggles, even if they don’t like adventures:
This hobbit was a very well-to-do hobbit, and his name was Baggins. The Bagginses had lived in the neighborhood of The Hill for time out of mind and people considered them very respectable, not only because most of them were rich, but also because they never had any adventures or did anything unexpected: you could tell what a Baggins would say on any question without the bother of asking him.
We could wax philosophical (or political) about why Tolkien sees common folk as essentially good, despite their provinciality, and why Rowling sees them as essentially bad, for precisely the same reasons, but in the end both writers are correct, for there is good and bad in all groups.
Why are the Dursleys effective villains? Why is their buffoonish abuse believable, and why do so many people identify with young Harry? Is he not the Dursleys’ kin, if not their son, their nephew? Shouldn’t they look out for him?
One of the great ironies of life is that the people who are closest to us are also the most likely to abuse us. Despite fears of “stranger danger” (or perhaps because of it) children are most likely to be harmed by parents, step-parents, guardians, or other close relatives/friends of the family, not strangers lurking in alleys or internet chatrooms.
…there were an estimated 57 000 deaths attributed to homicide among children under 15 years of age in 2000. Global estimates of child homicide suggest that infants and very young children are at greatest risk, with rates for the 0–4-year-old age group more than double those of 5–14-year-olds…
The risk of fatal abuse for children varies according to the income level of a country and region of the world. For children under 5 years of age living in high-income countries, the rate of homicide is 2.2 per 100 000 for boys and 1.8 per 100 000 for girls. In low- to middle-income countries the rates are 2–3 times higher – 6.1 per 100 000 for boys and 5.1 per 100 000 for girls. The highest homicide rates for children under 5 years of age are found in the WHO African Region – 17.9 per 100 000 for boys and 12.7 per 100 000 for girls.
(Aside: in every single region, baby boys were more likely to be murdered than baby girls–how’s that “male privilege” for you?)
Estimates of physical abuse of children derived from population-based surveys vary considerably. A 1995 survey in the United States asked parents how they disciplined their children (12). An estimated rate of physical abuse of 49 per 1000 children was obtained from this survey when the following behaviours were included: hitting the child with an object, other than on the buttocks; kicking the child; beating the child; and threatening the child with a knife or gun. …
In a cross-sectional survey of children in Egypt, 37% reported being beaten or tied up by their parents and 26% reported physical injuries such as fractures, loss of consciousness or permanent disability as a result of being beaten or tied up (17).
In a recent study in the Republic of Korea, parents were questioned about their behaviour towards their children. Two-thirds of the parents reported whipping their children and 45% confirmed that they had hit, kicked or beaten them (26).
A survey of households in Romania found that 4.6% of children reported suffering severe and frequent physical abuse, including being hit with an object, being burned or being deprived of food. Nearly half of Romanian parents admitted to beating their children “regularly” and 16% to beating their children with objects (34).
In Ethiopia, 21% of urban schoolchildren and 64% of rural schoolchildren reported bruises or swellings on their bodies resulting from parental punishment (14).
Ugh. The Dursleys are looking almost decent right now.
In most ways, the Dursleys do not fit the pattern characteristic of most abuse cases–severe abuse and neglect are concentrated among drug-addicted single mothers with more children than they can feed and an unstable rotation of unrelated men in and out of the household. The Dursleys’ case is far milder, but we may still ask: why would anyone mistreat their kin? Wouldn’t natural selection–selfish genes and all that–select against such behavior?
There are a number of facile explanations for the Dursleys’ behavior. The first, suggested obliquely by Rowling, is that Mrs. Dursley was jealous of her sister, Lily, Harry’s mother, for being more talented (and prettier) than she was. This is the old “they’re only bullying you because they’re jealous” canard, and it’s usually wrong. We may discard this explanation immediately, as it is simply too big a leap from “I was jealous of my sister” to “therefore I abused her orphaned child for 11 years.” Most of us endured some form of childhood hardship–including sibling rivalry–without turning into abusive assholes who lock little kids in cupboards.
The superior explanation is that there is something about Harry that they just can’t stand. He’s not like them. This is expressed in Harry’s appearance–the Dursleys are described as tall, fat, pink skinned, and blue eyed with straight, blond hair, while Harry is described as short, skinny, pale skinned, and green-eyed with wavy, dark hair.
More importantly, Harry can do magic. The Dursleys can’t.
It’s never explained in the books why some people can do magic and not others, but the trait looks strongly like a genetic one–not much more complicated than blue eyes. Magic users normally give birth to magical children, and non-magic users (the term “muggle” is an ethnic slur and should be treated as such,) normally have non-magical children. Occasionally magical children are born to regular families, just as occasionally two brown-eyed parents have a blue-eyed child because both parents carried a recessive blue eyed gene that they both happened to pass on to their offspring, and occasionally magical parents have regular children, just as smart people sometimes have dumb offspring. On the whole, however, magical ability is stable enough across generations that there are whole magical families that have been around for hundreds of years and non-magical families that have done the same.
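The blue-eyes analogy can be made concrete with a quick Punnett-square calculation. A minimal sketch (the function is ours, and eye color is of course not literally a one-gene trait):

```python
from itertools import product
from collections import Counter

def punnett(parent1, parent2):
    """Offspring genotype frequencies for a single-locus cross.

    Each parent is a two-character genotype string, e.g. 'Bb'.
    """
    # Every combination of one allele from each parent, normalized so
    # 'Bb' and 'bB' count as the same genotype.
    crosses = Counter("".join(sorted(pair)) for pair in product(parent1, parent2))
    total = sum(crosses.values())
    return {genotype: n / total for genotype, n in crosses.items()}

# Two brown-eyed carriers ('B' = brown, dominant; 'b' = blue, recessive):
print(punnett("Bb", "Bb"))  # {'BB': 0.25, 'Bb': 0.5, 'bb': 0.25}
```

A quarter of the offspring are 'bb'–the blue-eyed child of two brown-eyed parents, or, in the analogy, the magical child of two regular parents who each carried the trait.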
Any other factor–environmental, magical–could have been figured out by now and used to turn kids like Neville into competent wizards, so we conclude that such a factor does not exist.
Magic is a tricky thing to map, metaphorically, onto everyday existence, because nothing like it really exists in our world. We can vaguely imagine that Elsa hiding her ice powers is kind of like a gay person hiding the fact that they are gay, but being gay doesn’t let you build palaces or create sentient snowmen. Likewise, the Dursleys’ anger at Harry being “one of them,” and their adamant claims that magic and wizardry don’t exist despite knowing very well that Mrs. Dursley’s sister could turn teacups into frogs, does resemble the habit of certain very conservative people of pretending that homosexuality doesn’t exist, or that if their children never hear that homosexuality exists, they’ll never become gay.
The other difficulty with this metaphor is that gay people, left to their own devices, don’t produce children.
But putting together these two factors, we arrive at the conclusion that wizards are a distinct, mostly endogamous ethnic group that the Dursleys react to as though they were flaming homosexuals.
How many generations of endogamy would it take to produce two genetically distinct populations from one? Not many–take, for example, the Irish Travellers:
Researchers led by the Royal College of Surgeons in Ireland (RCSI) and the University of Edinburgh analysed genetic information from 42 people who identified as Irish Travellers.
The team compared variations in their DNA code with that of 143 European Roma, 2,232 settled Irish, 2,039 British and 6,255 European or worldwide individuals. …
They found that Travellers are of Irish ancestral origin but have significant differences in their genetic make-up compared with the settled community.
These differences have arisen because of hundreds of years of isolation combined with a decreasing Traveller population, the researchers say. …
The team estimates the group began to separate from the settled population at least 360 years ago.
That’s a fair bit of separation for a mere 360 years or so–and certainly enough for your relatives to act rather funny about it if you decided to run off with Travellers and then your orphaned child turned up on their doorstep.
How old are the wizarding families? Ollivander’s Fine Wands has been in business since 382 BC, and Merlin, Agrippa, and Ptolemy are mentioned as ancient Wizards, so we can probably assume a good 2,000 years of split between the two groups, with perhaps a 10% in-migration of non-magical spouses.
Harry is, based on his parents, 50% magical and 50% non-magical, though of course both Lily and Petunia Dursley probably carry some Wizard DNA.
In The Blank Slate, Pinker has some interesting observations on the subject of sociobiology:
As the notoriety of Sociobiology grew in the ensuing years, Hamilton and Trivers, who had thought up many of the ideas, also became targets of picketers… Trivers had argued that sociobiology is, if anything, a force for political progress. It is rooted in the insight that organisms did not evolve to benefit their family, group, or species, because the individuals making up those groups have genetic conflicts of interest with one another and would be selected to defend those interests. This immediately subverts the comfortable belief that those in power rule for the good of all, and it throws a spotlight on hidden actors in the social world, such as females and the younger generation.
Further in the book, Pinker continues:
Tolstoy’s famous remark that happy families are all alike but every unhappy family is unhappy in its own way is not true at the level of ultimate (evolutionary) causation. Trivers showed how the seeds of unhappiness in every family have the same underlying source. Though relatives have common interests because of their common genes, the degree of overlap is not identical within all their permutations and combinations of family members. Parents are related to all of their offspring by an equal factor, 50 percent, but each child is related to himself or herself by a factor of 100 percent. …
Parental investment is a limited resource. A day has only twenty-four hours … At one end of the lifespan, children learn that a mother cannot pump out an unlimited stream of milk; at the other, they learn that parents do not leave behind infinite inheritances.
To the extent that emotions among people reflect their typical genetic relatedness, Trivers argued, the members of a family should disagree on how parental investment should be divvied up.
And to the extent that one of the children in a household is actually a mixed-ethnicity nephew and no close kin at all to the father, the genetic relationship is even more distant between Harry and the Dursleys than between most children and the people raising them.
Parents should want to split their investment equitably among the children… But each child should want the parent to dole out twice as much of the investment to himself or herself as to a sibling, because children share half their genes with each full sibling but share all their genes with themselves. Given a family with two children and one pie, each child should want to split it in a ratio of two thirds to one third, while parents should want it to be split fifty fifty.
A person normally shares about 50% of their genes with their child and 25% of their genes with a niece or nephew, but we also share a certain amount of genes just by being distantly related to each other in the same species, race, or ethnic group.
Harry is, then, somewhat less genetically similar than the average nephew, so we can expect Mrs. Dursley to split any pies a bit more lopsidedly than 2/3 for Dudley and 1/3 for Harry, with Mr. Dursley grumbling that Harry doesn’t deserve any pie at all because he’s not their kid. (In a more extreme environment, if the Dursleys didn’t have enough pie to go around, it would be in their interest to give all of the pie to Dudley, but the Dursleys have plenty of food and they can afford to grudgingly keep Harry alive.)
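Trivers’s pie arithmetic can be written down directly: let each party weight every recipient (including themselves) by their coefficient of relatedness to that recipient, and prefer the proportional split. A rough sketch under that assumption–the function and the labels are our illustration, not anything from the books:

```python
def preferred_split(relatedness):
    """Each evaluator prefers to divide the pie in proportion to
    their relatedness to each recipient (a crude Hamilton-style weighting)."""
    total = sum(relatedness.values())
    return {name: r / total for name, r in relatedness.items()}

# A child's view of himself (r = 1) vs. a full sibling (r = 1/2): 2/3 vs. 1/3,
# while the parent (r = 1/2 to each child) prefers an even split.
print(preferred_split({"self": 1.0, "sibling": 0.5}))

# Mrs. Dursley's view of her son (r = 1/2) vs. an average nephew (r = 1/4);
# Harry being less related than an average nephew skews this further.
print(preferred_split({"Dudley": 0.5, "Harry": 0.25}))
```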
Most kinds of social behavior, including perhaps all of the most complex forms, are based in one way or another on kinship. As a rule, the closer the genetic relationship of the members of a group, the more stable and intricate the social bonds of its members. …
Parent-offspring conflict and its obverse, sibling-sibling conflict, can be seen throughout the animal kingdom. Littermates or nestmates fight among themselves sometimes lethally, and fight with their mothers over access to milk, food, and care…. The conflict also plays out in the physiology of prenatal human development. Fetuses tap their mothers’ bloodstreams to mine the most nutrients possible from their body, while the mother’s body resists to keep it in good shape for future children. …
Trivers touted the liberatory nature of sociobiology by invoking an “underlying symmetry in our social relationships” and “submerged actors in the social world.” He was referring to women, as we will see in the chapter on gender, and to children. The theory of parent-offspring conflict says that families do not contain all-powerful, all-knowing parents and their passive, grateful children. …
Sometimes families contain Dursleys and Potters.
Most profoundly, children do not allow their personalities to be shaped by their parents’ nagging, blandishments, or attempts to serve as role models.
Quite lucky for Harry!
The offspring cannot rely on its parents for disinterested guidance. One expects the offspring to be preprogrammed to resist some parental manipulation while being open to other forms. When the parent imposes an arbitrary system of reinforcement (punishment and reward) in order to manipulate the offspring to act against its own best interests, selection will favor offspring that resist such schedules of reinforcement.
(Are mixed-race kids more likely to be abused than single-race kids? Well, they’re more likely to be abused than White, Asian, or Hispanic kids, but less likely to be abused than Black or Native American children [Native American children have the highest rates of abuse]. It seems likely that the important factor here isn’t degree of relatedness, but how many of your parents hail from a group with high rates of child abuse. The Dursleys are not from a group with high child abuse rates.)
Let us return to E. O. Wilson’s Sociobiology:
Mammalogists have commonly dealt with conflict as if it were a nonadaptive consequence of the rupture of the parent-offspring bond. Or, in the case of macaques, it has been interpreted as a mechanism by which the female forces the offspring into independence, a step designed ultimately to benefit both generations. …
A wholly different approach to the subject has been taken by Trivers (1974). … Trivers interprets it as the outcome of natural selection operating in opposite directions on the two generations. How is it possible for a mother and her child to be in conflict and both remain adaptive? We must remember that the two share only one half their genes by common descent. There comes a time when it is more profitable for the mother to send the older juvenile on its way and to devote her efforts exclusively to the production of a new one. To the extent that the first offspring stands a chance to achieve an independent life, the mother is likely to increase (and at most, double,) her genetic representation in the next breeding generation by such an act. But the youngster cannot be expected to view the matter in this way at all. …
If the mother’s inclusive fitness suffers first from the relationship, conflict will ensue.
At some point, of course, the child is grown and therefore no longer benefits from the mother’s care; at this point the child and mother are no longer in conflict, but the roles may reverse as the parents become the ones in need of care.
As for humans:
Consider the offspring that behaves altruistically toward a full sibling. If it were the only active agent, its behavior would be selected when the benefit to the sibling exceeds two times the cost to itself. From the mother’s point of view, however, inclusive fitness is gained whenever the benefit to the sibling simply exceeds the cost to the altruist. Consequently, there is likely to evolve a conflict between parents and offspring in the attitudes toward siblings: the parent will encourage more altruism than the youngster is prepared to give. The converse argument also holds: the parent will tolerate less selfishness and spite among siblings than they have a tendency to display…
Indeed, Dudley is, in his way, crueler (more likely to punch Harry) and greedier than even his parents.
Altruistic acts toward a first cousin are ordinarily selected if the benefit to the cousin exceeds 8 times the cost to the altruist, since the coefficient of relationship of first cousins is 1/8. However, the parent is related to its nieces and nephews by r=1/4, and it should prefer to see altruistic acts by its children toward their cousins whenever the benefit-to-cost ratio exceeds 2. Parental conscientiousness will also extend to interactions with unrelated individuals. From a child’s point of view, an act of selfishness or spite can provide a gain so long as its own inclusive fitness is enhanced… In human terms, the asymmetries in relationship and the differences in responses they imply will lead in evolution to an array of conflicts between parents and their children. In general, offspring will try to push their own socialization in a more egoistic fashion, while the parents will repeatedly attempt to discipline the children back to a higher level of altruism. There is a limit to the amount of altruism [healthy, normal] parents want to see; the difference is in the levels that selection causes the two generations to view as optimum.
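Wilson’s thresholds all fall out of Hamilton’s rule. A minimal sketch of the arithmetic, from both the child’s and the parent’s point of view (the function and the benefit/cost numbers are mine, chosen for illustration):

```python
# Hamilton's rule: an altruistic act is favored when r * B > C, where
# r is the actor's relatedness to the beneficiary, B the benefit to
# the beneficiary, and C the cost to the actor.

def act_favored(r: float, benefit: float, cost: float) -> bool:
    """True if selection favors the altruistic act under Hamilton's rule."""
    return r * benefit > cost

# Relatedness coefficients (outbred, diploid):
R_FULL_SIBLING = 0.5    # child's relatedness to a full sibling
R_FIRST_COUSIN = 0.125  # child's relatedness to a first cousin

# A hypothetical act costing the child 1 unit and benefiting a first
# cousin by 4 units. The child requires B > 8C (r = 1/8):
cost, benefit = 1.0, 4.0
child_view = act_favored(R_FIRST_COUSIN, benefit, cost)

# The parent is related to its own child (the actor) by 1/2 and to the
# niece/nephew by 1/4, so its inclusive fitness rises whenever
# B * 1/4 > C * 1/2, i.e. B > 2C -- an effective r of (1/4)/(1/2):
parent_view = act_favored(0.25 / 0.5, benefit, cost)

print(child_view, parent_view)  # → False True
```

Any benefit-to-cost ratio between 2 and 8 lands in exactly the conflict zone Wilson describes: the parent wants the act performed, the child does not. (For full siblings the same logic gives the 2× vs. 1× thresholds from the previous paragraph, since the parent weighs both siblings equally at r = 1/2.)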
To return to Pinker:
As if the bed weren’t crowded enough, every child of a man and a woman is also the grandchild of two other men and two other women. Parents take an interest in their children’s reproduction because in the long run it is their reproduction, too. Worse, the preciousness of female reproductive capacity makes it a valuable resource for the men who control her in traditional patriarchal societies, namely her father and brothers. They can trade a daughter or sister for additional wives or resources for themselves and thus they have an interest in protecting their investment by keeping her from becoming pregnant by men other than the ones they want to sell her to. It is not just the husband or boyfriend who takes a proprietary interest in a woman’s sexual activity, then, but also her father and brothers. Westerners were horrified by the treatment of women under the regime of the Taliban in Afghanistan from 1995 to 2001…
[ah, such an optimistic time Pinker wrote in]
Like many children, Harry is rescued from a bad family situation by that most modern institution, the boarding school.
The weakening of parents’ hold over their older children is also not just a recent casualty of destructive forces. It is part of a long-running expansion of freedom in the West that has granted children their always-present desire for more autonomy than parents are willing to cede. In traditional societies, children were shackled to the family’s land, betrothed in arranged marriages, and under the thumb of the family patriarch. That began to change in Medieval Europe, and some historians argue it was the first steppingstone in the expansion of rights that we associate with the Enlightenment and that culminated in the abolition of feudalism and slavery. Today it is no doubt true that some children are led astray by a bad crowd or popular culture. But some children are rescued from abusive or manipulative families by peers, neighbors, and teachers. Many children have profited from laws, such as compulsory schooling and the ban on forced marriages, that may override the preferences of their parents.
The sad truth, for Harry–and many others–is that their interests and their relatives’ interests are not always the same. Sometimes humans are greedy, self-centered, or just plain evil. Small children are completely dependent on their parents and other adults, unable to fend for themselves–so the death of Harry’s parents, followed by abuse and neglect from his aunt and uncle, constitutes true betrayal.
But there is hope, even for an abused kid like Harry, because we live in a society that is much larger than families or tribal groups. We live in a place where honor killings aren’t common and even kids who aren’t useful to their families can find a way to be useful in the greater society. We live in a civilization.
In addition to the reported Neanderthal and Denisovan introgressions, our results support a third introgression in all Asian and Oceanian populations from an archaic population. This population is either related to the Neanderthal-Denisova clade or diverged early from the Denisova lineage.
(Congratulations to the authors, Mondal, Bertranpetit, and Lao.)
Here we report an analysis comparing cultural and genetic data from 13 populations from in and around Northeast Asia spanning 10 different language families/isolates. We construct distance matrices for language (grammar, phonology, lexicon), music (song structure, performance style), and genomes (genome-wide SNPs) and test for correlations among them. … robust correlations emerge between genetic and grammatical distances. Our results suggest that grammatical structure might be one of the strongest cultural indicators of human population history, while also demonstrating differences among cultural and genetic relationships that highlight the complex nature of human cultural and genetic evolution.
I feel like there’s a joke about grammar Nazis in here.
While humans average seven hours, other primates range from just under nine hours (blue-eyed black lemurs) to 17 (owl monkeys). Chimps, our closest living evolutionary relatives, average about nine and a half hours. And although humans doze for less time, a greater proportion is rapid eye movement sleep (REM), the deepest phase, when vivid dreams unfold.
Sleep is pretty much universal in the animal kingdom, but different species vary greatly in their habits. Elephants sleep about two hours out of 24; sloths more than 15. Individual humans vary in their sleep needs, but interestingly, different cultures vary greatly in the timing of their sleep, e.g., the Spanish siesta. Our modern notion that people “should” sleep in a solid, 7-9 hour chunk (going so far as to “train” children to do it) is more a result of electricity and industrial work schedules than anything inherent or healthy about human sleep. So if you find yourself stressed out because you keep taking a nap in the afternoon instead of sleeping through the night, take heart: you may be completely normal. (Unless you’re tired because of some illness, of course.)
Within any culture, people also prefer to rest and rise at different times: In most populations, individuals range from night owls to morning larks in a near bell curve distribution. Where someone falls along this continuum often depends on sex (women tend to rise earlier) and age (young adults tend to be night owls, while children and older adults typically go to bed before the wee hours).
Genes matter, too. Recent studies have identified about a dozen genetic variations that predict sleep habits, some of which are located in genes known to influence circadian rhythms.
While this variation can cause conflict today … it may be the vestige of a crucial adaptation. According to the sentinel hypothesis, staggered sleep evolved to ensure that there was always some portion of a group awake and able to detect threats.
So they gave sleep trackers to some Hadza, who must by now think Westerners are very strange, and found that at any particular period of the night, about 40% of people were awake; over 20 nights, there were “only 18 one-minute periods” when everyone was asleep. That doesn’t prove anything, but it does suggest that it’s perfectly normal for some people to be up in the middle of the night–and maybe even useful.
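The sentinel logic is just a probability calculation: if wakefulness were independent across individuals, the chance of the whole group being asleep at the same moment shrinks geometrically with group size. A toy sketch (only the ~40% figure comes from the study; the group sizes and the independence assumption are mine):

```python
# If each of n adults is independently awake at a given minute with
# probability p, the chance that nobody is on "sentinel duty" at that
# minute is (1 - p) ** n.

p_awake = 0.4  # roughly the fraction the Hadza trackers showed awake

for n in (5, 10, 20, 30):
    p_all_asleep = (1 - p_awake) ** n
    print(f"group of {n:2d}: P(no one awake) = {p_all_asleep:.2e}")
```

With 20 adults, the chance that everyone is asleep at any given minute is under one in ten thousand, which is consistent with the study finding only 18 fully-asleep minutes across 20 nights of data. Real wakefulness isn’t independent, of course (chronotypes cluster by age and sex), but the staggered-chronotype variation described above makes full-group sleep even rarer, not more common.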
In May, a pair of papers published by separate teams in the journal Cell focused on the NOTCH family of genes, found in all animals and critical to an embryo’s development: They produce the proteins that tell stem cells what to turn into, such as neurons in the brain. The researchers looked at relatives of the NOTCH2 gene that are present today only in humans.
In a distant ancestor 8 million to 14 million years ago, they found, a copying error resulted in an “extra hunk of DNA,” says David Haussler of the University of California, Santa Cruz, a senior author of one of the new studies.
This non-functioning extra piece of NOTCH2 code is still present in chimps and gorillas, but not in orangutans, which went off on their own evolutionary path 14 million years ago.
About 3 million to 4 million years ago, a few million years after our own lineage split from other apes, a second mutation activated the once non-functional code. This human-specific gene, called NOTCH2NL, began producing proteins involved in turning neural stem cells into cortical neurons. NOTCH2NL pumped up the number of neurons in the neocortex, the seat of advanced cognitive function. Over time, this led to bigger, more powerful brains. …
The researchers also found NOTCH2NL in the ancient genomes of our closest evolutionary kin: the Denisovans and the Neanderthals, who had brain volumes similar to our own.
“Genomes that evolve in different geographic locations without intermixing can end up being different from each other,” said Kateryna Makova, Pentz Professor of Biology at Penn State and an author of the paper. “… This variation has a lot of advantages; for example, increased variation in immune genes can provide enhanced protection from diseases. However, variation in geographic origin within the genome could also potentially lead to communication issues between genes, for example between mitochondrial and nuclear genes that work together to regulate mitochondrial function.”
Researchers looked at recently (by evolutionary standards) mixed populations like Puerto Ricans and African Americans, comparing the parts of their DNA that interact with mitochondria to the parts that don’t. Mitochondria are inherited from your mother, and these populations have different ethnic DNA contributions along their maternal and paternal lines. If all of the DNA were equally compatible with their mitochondria, then we’d expect to see equal contributions to the specifically mitochondria-interacting genes. If some ethnic origins interact better with the mitochondria, then we’d expect to see more of that DNA in these specific places.
The latter is, in fact, what we find. Puerto Ricans hail more from the Taino Indians along their mtDNA, and have relatively more Taino DNA in the genes that affect their mitochondria–indicating that over the years, individuals with more balanced contributions were selected against in Puerto Rico. (“Selection” is such a sanitized way of saying they died/had fewer children.)
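The shape of the test can be sketched in a few lines. The ancestry fractions below are invented for illustration, not taken from the paper; the real analysis works from local-ancestry calls across many individuals and loci:

```python
# Toy version of the mito-nuclear compatibility test: compare the
# maternal-line ancestry fraction at mitochondria-interacting nuclear
# loci against the genome-wide baseline. Fractions are hypothetical.

genome_wide_taino = 0.14        # assumed genome-wide Taino fraction
mito_interacting_taino = 0.19   # assumed fraction at mito-interacting loci

enrichment = mito_interacting_taino / genome_wide_taino

# enrichment > 1 means the maternal-line ancestry is over-represented
# exactly where nuclear genes must cooperate with the mitochondria --
# the signature of selection against mismatched combinations.
print(f"enrichment at mito-interacting loci: {enrichment:.2f}x")
```

The paper’s actual finding is the direction of this comparison, not these numbers: mtDNA-side ancestry is enriched at mitochondria-interacting nuclear genes relative to the rest of the genome.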
This indicates that a recently admixed population may have more health issues than its parent populations, but that those issues will work themselves out over time.