I read recently (my apologies, I can’t find the link) that in every country where we have reliable testing data, a consistent pattern emerges: girls tend to do slightly better on reading/writing tasks than mathematical tasks, and boys slightly better on mathematical tasks than language tasks.
This is an interesting dynamic because it creates different “optimal” outcomes depending on what you are trying to optimize for.
If you optimize for individual achievement–that is, get each student to go into the field where they, personally, can do the best–the vast majority of girls will go into language-related fields and the vast majority of boys will go into math-based fields. This leaves us with a strongly gender-divided workforce.
But if we optimize instead for getting talented people into a particular field, the gender divide would be narrower. Most smart students are good at both math and language, and could excel in either domain. You could easily have a case where the best mathematician in a class is even more talented in language, or where the most verbally talented person is even more talented at mathematical tasks (but not both at once).
If we let people choose the careers that best suit them, some fields may end up sub-optimally filled because talented people go elsewhere. If we push people into particular fields, some people will end up sub-optimally employed, because they could have done a better job elsewhere.
Relatedly, we find that people show more gendered job preferences in developed countries, and less gendered preferences in undeveloped countries. In Norway, women show a pretty strong preference, on average, for careers involving people or language skills, while in the third world, they show a stronger preference for “masculine” jobs involving math, science, or technical skills. This finding is potentially explained by different countries offering different job opportunities. In Norway, there are lots of cushy jobs, and people feel comfortable pursuing whatever makes them happy or they’re good at. In the third world, technical skills are valued and thus these jobs pay well and people strive to get them.
People often ascribe the gender balance in different jobs to nefarious social forces (ie, sexism,) but it is possible that it is an entirely mundane side effect of people simply having the wealth and opportunity to pursue careers in the things they are best at.
There has been a lot of chatter lately about whether the development of human musical abilities can be explained via some form of sexual selection. Most of this debate has been needlessly heated and involved more insults than it warrants, so I don’t want to pick on any particular people, but all of it seems to have overlooked some basic facts:
Musical success–at least as expressed in our culture–is strongly dimorphic in favor of men.
Music groupies–that is, fans who want to have sex with musicians–are strongly dimorphic in favor of women (especially teens).
Successful musicians have tons of sex.
Let’s run through a little evidence on each of these points. First, talent:
Wikipedia has a nice list of musicians/bands by # of albums sold. It probably doesn’t include folks like Beethoven, but that’s for the best since it would muck up the data to have artists whose work has been for sale for so long.
The top selling artists, with 250 million or more record sales, are:
If this list surprises you, you might want to listen to more music.
Men dominate women here 3:1.
I’m not going to list the rest of the top-selling artists on the page, but if we total them up, I count 27 women/female bands (including two bands that are half women) and 83 male (including the two half-male bands).
Remarkably, 83:27 (and 89:29) is almost exactly 3:1.
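For the skeptical, the arithmetic is trivial to check (a quick sketch; the counts are just the Wikipedia tallies quoted above):

```python
# Male:female ratios among top-selling artists, using the
# counts tallied above from Wikipedia's list.
male, female = 83, 27
print(round(male / female, 2))   # 3.07 -- roughly 3:1

male_alt, female_alt = 89, 29
print(round(male_alt / female_alt, 2))   # 3.07 again
```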
Now, some people object that “people liking their music enough to fork over money for it” is not a good measure of “musical talent,” but it is definitely a measure of musical success. If someone is super talented but no one wants to listen to them, well, I am a bit skeptical of the claim that they are talented.
The other common response I get to this runs along the lines of “But we tested musical ability in a lab, and in our experiments, men and women did equally well.”
All that shows is that you got different results; it doesn’t explain why the dimorphism exists in the real world. There are exceedingly few top-selling musicians in the world (118 on Wikipedia’s list, plus or minus a few deaths,) and it’s highly doubtful that anyone of this caliber wandered into a university music lab. It may be that musicians of average quality show no dimorphism at all (or are even biased toward women) while exceptional musicians are disproportionately male, just because there is no particular reason to assume that two different groups of people have the same range of abilities even if they have the same average. In fact, men have a greater range than women in many documented areas, like height and IQ–that is, while there are more men than women in Mensa, there are also more boys than girls in Special Ed.
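The same-mean, different-variance point is easy to demonstrate with a toy simulation (the numbers here are made up purely for illustration, not real test data): give two populations an identical average but different spreads, and the wider one dominates the extreme tail.

```python
import random

random.seed(42)
N = 100_000

# Two hypothetical populations: same mean, different standard
# deviations. The parameters are invented for illustration only.
narrow = [random.gauss(100, 13) for _ in range(N)]
wide   = [random.gauss(100, 15) for _ in range(N)]

cutoff = 145  # an arbitrary "exceptional talent" threshold

tail_narrow = sum(x >= cutoff for x in narrow)
tail_wide   = sum(x >= cutoff for x in wide)

# Despite identical means, the wider distribution supplies far
# more individuals above the cutoff (and below its mirror image).
print(tail_narrow, tail_wide)
```

The same logic runs in both directions: a wider distribution also supplies more individuals in the bottom tail, which is the Mensa/Special Ed point above.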
The first time Scottish concert promoter Andi Lothian booked the Beatles, in the frozen January of 1963, only 15 people showed up. The next time he brought them north of the border… it was as if a hurricane had blown into town.
The night almost unravelled when nervous local police insisted Lothian bring the Beatles on early to satisfy rowdily impatient fans, even though his bouncers were still in the pub. “The girls were beginning to overwhelm us,” remembers Lothian, now 73 and a business consultant. “I saw one of them almost getting to Ringo’s drumkit and then I saw 40 drunk bouncers tearing down the aisles. It was like the Relief of Mafeking! It was absolute pandemonium. Girls fainting, screaming, wet seats. The whole hall went into some kind of state, almost like collective hypnotism. I’d never seen anything like it.”
Gone are all the jerky body movements that once earned Elvis Presley the nickname of ‘The Pelvis’. Gone are all the actions that were dubbed vulgar by his critics. Presley’s stage performance is now restrained. But that did not stop 5,500 wildly excited spectators at the Bloch Arena, Pearl Harbour, Hawaii from going outrageously wild with unreserved enthusiasm last Saturday night. Never have I heard anything like it. Their enthusiasm was fever-pitch, and they were screaming non-stop from start to finish, making it impossible to identify some of the songs he sang. Whether he was talking, singing, raising his eyebrows or just breathing, it was a signal for the volume of excitement to rise higher and higher throughout this fantastic concert.
Hundreds of naval police at this U.S. Navy fortress were detailed to restrain fanatical fans from invading the stage, and they were kept busy for the entire show. …
The climax came when he closed with the all-out rocker ‘Hound Dog’, the signal for the greatest bout of unlimited pandemonium, many of the younger girls going completely berserk! Then came the trickiest part of all – ‘Operation Exit Elvis’ – to get Presley out of the building before the crowd could tear him apart from sheer adoration.
“Screaming girls”—that was a recurring theme in newspaper reviews of Elvis’s stage shows in 1956 and 1957. At almost every stop, the girls screamed so loud that no one could hear Elvis sing. Even the musicians on stage had trouble hearing each other. … Elvis himself explained that at times in 1957 he had to cover his ears with his hands so that he could hear himself sing. …
When I spoke with some women who had attended an Elvis concert back in 1957, most of them admitted they had screamed. …
“We screamed when he came out. I didn’t know I was going to yell and scream. I’d never done that in my whole life. It was spontaneous. … He could excite you with his music so much. My mom’s gone; I guess she wouldn’t care if I said it now … it was like a sexual experience. It went through your body kind of like that.”
A rumor went around in ninth grade English class. We went home and turned on MTV to find out for sure. I remember girls crying in the hallway. …
I was watching the news when I heard, and cried. It was believable and unbelievable, all at the same time. It’s our generation’s “Where were you…?” moment. My husband, our friends, all remember where we were when we heard the news and how devastated we were. …
I was in the bathroom getting ready for school, and my dad yelled “Hey, some guy from that band you like is dead.”
I walked into the living room and saw them playing footage from one of their performances on the TV. And then they said his name. I immediately started bawling. I don’t think my mom made me go to school that day.
Seattle bid goodbye to Kurt Cobain on April 10 in true grunge-rock style, bursting the ranks of a quickly organized public vigil and leaping into the nearby international fountain, a giant, water-spouting structure some 50 yards wide and ten feet deep that flanks the Flag Pavilion. … Weeping girls wore beauty pageant banners around their middles, made out of the plastic yellow, “POLICE LINE DO NOT CROSS” tape, the same kind of tape which, three days earlier, had criss-crossed the driveway to Cobain and Courtney Love’s home.
At this point, denying that women (especially teen girls) seem to have some sort of thing for rock stars is right up there with denying that men have a thing for fertile young women with hourglass figures.
Third, the sex:
Groupie sex, oh groupie sex. How many groupies have rockstars actually boned?
Cracked has a pretty good overview if you’ve never heard of groupies before:
We’ve already written about the sex tents that Van Halen’s Sammy Hagar had installed wherever he performed so that he could disappear mid-solo and indulge himself in a groupie or nine. But that’s not the only way Van Halen was entrepreneurial with his young fans. Let’s take a minute and discuss how original frontman David Lee Roth amused his roadies by sending them out on groupie scavenger hunts.
From his lofty position on the stage, Roth would instruct his roadies to dive into the crowd and collect very specific girls for him to have sex on. The lucky girl would be given a special backstage pass with the initials of the roadie who approached her written in the top corner. If that pass was then among the ones strewn on his floor the next morning, Roth would reward the roadie with a $100 bonus at breakfast the next morning, because exchanging money for sex works up an appetite.
Motley Crue came up with the, uh, creative solution of rubbing burritos on their crotches so their girlfriends wouldn’t smell the scents of groupie sex on them:
He tells Hustler magazine, “We were always f**king other chicks at the studio and backstage… We would take Tommy’s (Lee) van to a restaurant called Noggles to buy these egg burritos and then rub them on our crotches to cover the smell of the girls we had just f**ked.
Before they became a quartet of endless punchlines, Van Halen used to be one of the coolest bands in the world, and they demonstrated their status by having sex with every female who wandered within one mile of their powerful aura. Their career is a filthy memorial to how being in a band is a more powerful aphrodisiac than things like “not looking completely ridiculous,” …
One tour saw the band build a tent directly beneath the stage specifically for Sammy Hagar’s erection. During the mid-show 20-minute guitar solos Eddie Van Halen would launch into each night, Hagar would disappear to the tent and discover a group of naked fans waiting to swallow his penis.
Mick Jagger, by the way, has (at least) eight children via five different women.
Look, I feel a little silly having to spell out in great detail the fact that rock stars get laid a lot. You probably feel a little silly reading it, yet there are people who seem hellbent on arguing that there’s no particular evidence in favor of sexual selection for musical talent.
And no, you can’t explain this away by saying that musicians are “famous” and that women want to have sex with all sorts of famous people. Donald Trump is famous, but he doesn’t have sex tents. Leonardo DiCaprio is famous and has legions of fans, but as far as I know, he also doesn’t have sex tents.
I agree that we can’t definitively prove how musical talent evolved among the first humans, (because we don’t have time machines,) but the correlation between sex and music today, in our own society, is overwhelming. A claim that it didn’t have similar effects on our ancestors needs to explain what changed so radically between then and now.
Likewise, we can’t assume that just because music works like this in our own society, it must also work this way in every other society. But conversely, just because something doesn’t work in one society doesn’t mean it doesn’t work in any other society. There are a lot of groups out there, and some of them are obviously weird in ways that aren’t relevant to everyone else. Some people, for example, like to dress up like anthropomorphic animals and go to conventions. We should be cautious about over-generalizing from small examples. Sure, there might be a random tribe somewhere with weird traditions like killing any woman who sees a musical instrument being played, but these tribes generally have fewer people in them than one concert’s worth of screaming Elvis fans.
The results were striking. Various combinations of height, weight, and head shape were significantly related to 90% of the negative C-BARQ behavioral traits. Further, in nearly all cases, the smaller the dogs, the more problematic behaviors their owners reported. Here are some examples.
Height – Short breeds were more prone to beg for food, have serious attachment problems, be afraid of other dogs, roll in feces, be overly sensitive to touch, defecate and urinate when left alone, and be harder to train. They also were more inclined to hump people’s legs.
So what’s up with small dogs? Let’s run through the obvious factors first:
Culling: Behavioral and psychological problems obviously get bred out of large dogs more quickly. An anxious pug is cute; an anxious doberman is a problem. A chihuahua who snaps at children is manageable; a rottweiler who snaps at children gets put down.
Training: Since behavioral problems are more problematic in larger dogs, their owners (who chose them in the first place,) are stricter from the beginning about problematic behaviors. No one cares if a corgi begs at the dinner table; a St. Bernard who thinks he’s going to eat off your plate gets unmanageable fast.
Rational behaviors: Since small dogs are small, some of the behaviors listed in the article make sense. They pee indoors by accident more often because they have tiny bladders and just need to pee more often than large dogs (and they have to drink more often). They are more fearful because being smaller than everything around them actually is frightening.
Accident of Breeding: Breeding for one trait can cause other traits to appear by accident. For example, breeding for tameness causes changes to animals’ pelt colors, for reasons we don’t yet know. Breeding for small dogs simultaneously breeds for tiny brains, and dogs with tiny brains are stupider than dogs with bigger brains. Stupider dogs are harder to train and may just have more behavioral issues. They may also attempt behaviors (guarding, hunting, herding, etc) that are now very difficult for them due to their size.
Accident of training: people get small dogs and then stick them in doggy carriages, dress them in doggy clothes, and otherwise baby them, preventing them from being properly trained. No wonder such dogs are neurotic.
And finally, That’s not a Bug, it’s a Feature: Small dogs have issues because people want them to.
Small dogs are bred to be companions to people, usually women (often lonely, older women whose children have moved out of the house and don’t call as often as they should). As such, these dogs are bred to have amusing, human-like personalities–including psychological problems.
Lonely people desire dogs that will stay by them, and so favor anxious dogs. Energetic people favor hyperactive dogs. Anti-social people who don’t want to bond emotionally with others get a snake.
There’s an analogy here with other ways people meet their emotional/psychological needs, like Real Dolls and fake babies (aka “reborns”). The “reborn” doll community contains plenty of ordinary collectors and many grieving parents whose babies died or were stillborn and some older folks with Alzheimer’s, as well as some folks who clearly take it too far and enter the creepy territory.
Both puppies and babydolls are, in their way, stand-ins for the real thing (children,) but dogs are also actually alive, so people don’t feel stupid taking care of dogs. Putting your dog in a stroller or dressing it up in a cute outfit might be a bit silly, but certainly much less silly than paying thousands of dollars to do the same thing to a doll.
And unlike dolls, dogs actually respond to our emotions and have real personalities. As Jon Katz argues, we now use dogs, in effect, for their emotional work:
In an increasingly fragmented and disconnected society, dogs are often treated not as pets, but as family members and human surrogates. The New Work of Dogs profiles a dozen such relationships in a New Jersey town, like the story of Harry, a Welsh corgi who provides sustaining emotional strength for a woman battling terminal breast cancer; Cherokee, companion of a man who has few friends and doesn’t know how to talk to his family; the Divorced Dogs Club, whose funny, acerbic, and sometimes angry women turn to their dogs to help them rebuild their lives; and Betty Jean, the frantic founder of a tiny rescue group that has saved five hundred dogs from abuse or abandonment in recent years.
Normally we’d call this “bonding,” “loving your dog,” or “having a friend,” but we moderns have to overthink everything and give it fussy labels like “emotional work.” We’re silly, but thankfully our dogs put up with us.
The ancestors of horses–small, multi-toed quadrupeds–emerged around 50 million years ago, but horses as we know them (and their wild cousins) evolved from a common ancestor around 6 million years ago. Horses in those days were concentrated in North America, but spread via the Bering land bridge to Eurasia and Africa, where they differentiated into zebras, asses, and “wild” horses.
When humans first encountered horses, we ate them. American horses became extinct around 14,000-10,000 years ago, first in Beringia and then in the rest of the continent–coincidentally about the time humans arrived here. The first known transition from hunting horses to herding and ranching them occurred around 6,000 years ago among the Botai of ancient Kazakhstan, not far from the proto-Indo-European homeland (though the Botai themselves do not appear to have been proto-Indo-Europeans). These herds were still managed for meat, of which the Botai ate tons, until some idiot teenager decided to impress his friends by riding one of the gol-dang things. Soon after, the proto-Indo-Europeans got the idea and went on a rampage, conquering Europe, Iran, and the Indian subcontinent (and then, a little later, North and South America, Africa, Australia, and India again). Those horses were useful.
Oddly, though, it appears that those Botai horses are not the ancestors of the modern horses people ride today–but instead are the ancestors of the Przewalski “wild” horse. The Przewalski was thought to be a truly wild, undomesticated species, but it appears to have been a kind of domesticated horse* that went feral, much like the mustangs of the Wild West. Unlike the mustang, though, the Przewalski is genetically distinct, with 66 chromosomes; domesticated horses have 64. (The two can still hybridize, producing offspring with 65 chromosomes–which, unusually for such crosses, are reportedly fertile.) When exactly the Przewalski obtained their extra chromosomes, I don’t know.
*This, of course, depends on the assumption that the Botai horses were “domesticated” in the first place.
Instead, modern, domesticated horses are believed to have descended from the wild Tarpan, though as far as I know, genetic studies proving this have not yet been done. The Tarpan is extinct, but survived up to the cusp of the twentieth century. (Personally, I’m not putting odds on any major tarpan herds in the past couple thousand years having had 100% wild DNA, but I wouldn’t classify them as “feral” just because of a few escaped domestics.)
Thus the horse was domesticated multiple times–especially if we include that other useful member of the equus family, the ass (or donkey, if you’d prefer). The hardworking little donkey does not enjoy its cousin’s glamorous reputation, and Wikipedia reports,
Throughout the world, working donkeys are associated with the very poor, with those living at or below subsistence level. Few receive adequate food, and in general donkeys throughout the Third World are under-nourished and over-worked.
The donkey is believed to have been domesticated from the wild African ass, probably in ancient Nubia (southern Egypt/northern Sudan). From there it spread up the river to the rest of Egypt, where it became an important work animal, and from there to Mesopotamia and the rest of the world.
Wild African asses still exist, but they are critically endangered.
I have no idea why equines have so much chromosomal diversity; dogs have been domesticated for much longer than horses, but are still interfertile with wolves and even coyotes (tbf, maybe horses could breed with tarpans.)
Interestingly, domestication causes a suite of changes to a species’ appearance that are not obviously useful. Recently-domesticated foxes exhibit pelt colors and patterns similar to those of domesticated dogs, not wild foxes. We humans have long hair, unlike our chimp-like ancestors. Horses also have long manes, unlike wild zebras, asses, and tarpans. Horses have evolved, then, to look rather like humans.
Also like humans, horses have different male and female histories. Male horses were quite difficult to tame, and so early domesticators only obtained a few male horses. Females, by contrast, were relatively easy to gentle, so breeders often restocked their herds with wild females. As a result, domesticated horses show far more variation in their mitochondrial DNA than their Y chromosomes. The stocking of herds from different groups of wild horses most likely gave rise to 17 major genetic clusters:
From these sequences, a phylogenetic network was constructed that showed that most of the 93 different mitochondrial (mt)DNA types grouped into 17 distinct phylogenetic clusters. Several of the clusters correspond to breeds and/or geographic areas, notably cluster A2, which is specific to Przewalski’s horses, cluster C1, which is distinctive for northern European ponies, and cluster D1, which is well represented in Iberian and northwest African breeds. A consideration of the horse mtDNA mutation rate together with the archaeological timeframe for domestication requires at least 77 successfully breeding mares recruited from the wild. The extensive genetic diversity of these 77 ancestral mares leads us to conclude that several distinct horse populations were involved in the domestication of the horse.
The wild mustangs of North America might have even more interesting DNA:
The researchers said four family groups (13.8%) with 31 animals fell into haplogroup B, with distinct differences to the two haplogroup L lineages identified.
The closest mitochondrial DNA sequence was found in a Thoroughbred racing horse from China, but its sequence was still distinct in several areas.
The testing also revealed links to the mitochondrial DNA of an Italian horse of unspecific breed, the Yunnan horse from China, and the Yakutia horse from central Siberia, Russia.
Haplogroup B seems to be most frequent in North America (23.1%), with lower frequencies in South America (12.68%) and the Middle East (10.94%) and Europe (9.38%).
“Although the frequency of this lineage is low (1.7%) in the Asian sample of 587 horses, this lineage was found in the Bronze Age horses from China and South Siberia.”
Westhunter suggests that this haplogroup could have originated from some surviving remnant of American wild horses that hadn’t actually been completely killed off before the Spanish mustangs arrived and bred with them. I’d offer a more prosaic possibility: the Russians brought these horses while colonizing Alaska and the coast down to northern California. Either way, it’s an intriguing finding.
The horse has been man’s companion for thousands of years and helped him conquer most of the Earth, but the recent invention of internal and external combustion engines (eg, the Iron Horse) has put most horses out to pasture. In effect, they have become obsolete. Modern horses have much easier lives than their hard-working plow and wagon-pulling ancestors, but their populations have shrunk enormously. They’re not going to go extinct, because rich people still like them (and they are still useful in parts of the world where cars cannot easily go,) but they may suffer some of the problems of inbreeding found in genetically narrow dog breeds.
Maybe someday, significant herds of wild horses will roam free again.
Well, there’s a clickbaity title if ever I wrote one.
Nevertheless, human breasts are strange. Sure, all females of the class Mammalia are equipped with mammary glands for producing milk, but humans alone possess permanent, non-functional breasts.
Yes, non-functional: the breast tissue that develops during puberty and that you see on women all around you is primarily fat. Fat does not produce milk. Milk ducts produce milk. They are totally different things.
Like all other mammals, the milk-producing parts of the breasts only activate–make milk–immediately after a baby is born. At any other time, milk production is a useless waste of calories. And when mothers begin to lactate, breasts noticeably increase in size due to the sudden production of milk.
A number of factors associated with low milk supply have been identified, such as nipple pain, ineffective nursing, hormonal disorders, breast surgery, certain medications, and maternal obesity. … Research into breast size and milk production shows that milk supply is not dependent on breast size, but rather on the amount of epithelial tissue contained in a breast that is capable of making milk …
However, in addition to baby attachment issues, accumulating evidence shows that a major factor preventing overweight and obese mothers to breastfeed is the inability of their breast epithelial cells to start producing copious amounts of milk after birth. This is often referred to as unsuccessful initiation of lactation. …
a recent study took advantage of breast epithelial cells non-invasively isolated from human milk. In these cells, certain genes are turned on, which enable the cells to gradually make milk as the breast matures during pregnancy, and then deliver it to the baby during breastfeeding.
The study reported a negative association between maternal BMI (body mass index), and the function of a gene that represents the milk-producing cells. This suggested that the breast epithelial tissue is not as mature and ready to make copious amounts of milk in mothers with higher BMI. Most likely, the large breasts of overweight or obese mothers contain more fat cells than milk-making cells, which can explain the low milk supply of many of these mothers.
Therefore, breast size does not necessarily translate to more milk-producing cells or higher ability to make milk.
More fat = less room for milk production.
Interestingly, average cup size varies by country. Of course the data may not be 100% accurate, and the lumping of everyone together at the national level obscures many smaller groups, like Siberians, but it otherwise still indicates some general trends that we can probably trust.
If breasts don’t actually make milk, then why on Earth do we have them? Why are women cursed with lumpy fat blobs hanging off their chests that have to be carefully smushed into specialized clothing just so we can run without them flopping around painfully?
And for that matter, why do we think they look nice?
One reasonable theory holds that breasts are really just front-butts. Our apish ancestors, like modern chimpanzees, most likely did not copulate ad libitum like we do, but only when females were fertile. Female fertility among our chimpish relatives is signaled via a significant swelling and reddening of their rear ends, a clear signal in a species that wears no clothes and often walks on four limbs.
When humans began walking consistently on two legs, wearing clothes, and looking at each other’s faces, this obvious signal of female fertility was lost, but not our desire to look at rear-ends. So we simply transferred this desire to women’s fronts and selectively had more children with the women who piqued our interests by having more butt-shaped cleavage.
In support of this theory, many women go to fair lengths to increase the resemblance between their ample bosoms and an impressive behind; against this theory is the fact that no other bottom-obsessed species has accidentally evolved a front-butt.
I realized yesterday that there is an even simpler potential explanation: humans are just smart enough to be stupid.
Most of us know that breasts produce milk. Few of us really understand the mechanism of how they produce milk. I had to explain that fat lumps don’t produce milk at the beginning of this post because so few people actually understand this. Far more people think “Big breasts=lots of milk” than think “big breasts=lactation problems.” Humans have probably just been accidentally selecting for big breasts for millennia while trying to select for milk production.
Of course there are smart people who are insane, and dumb people who are completely rational. But if we define intelligence as having something to do with accurately understanding and interpreting the information we constantly receive from the world, necessary to make accurate predictions about the future and how one’s interactions with others will go, there’s a clear correlation between accurately understanding the world and being sane.
In other words, a sufficiently dumb person, even a very sane one, will be unable to distinguish between accurate and inaccurate depictions of reality and so can easily espouse beliefs that sound, to others, completely insane.
Is there any way to distinguish between a dumb person who believes wrong things by accident and a smart person who believes wrong things because they are insane?
Digression: I have a friend who was homeless for many years. Eventually he was diagnosed as mentally ill and given a disability check.
“Why?” he asked, but received no answer. He struggled (and failed) for years to prove that he was not disabled.
Eventually he started hearing voices, was diagnosed with schizophrenia, and put on medication. Today he is not homeless, due at least in part to the positive effects of anti-psychotics.
The Last Psychiatrist has an interesting post (deleted from his blog, but re-posted elsewhere,) on how SSI is determined:
Say you’re poor and have never worked. You apply for Welfare/cash payments and state Medicaid. You are obligated to try and find work or be enrolled in a jobs program in order to receive these benefits. But who needs that? Have a doctor fill out a form saying you are Temporarily Incapacitated due to Medical Illness. Yes, just like 3rd grade. The doc will note the diagnosis, however, it doesn’t matter what your diagnosis is, it only matters that a doctor says you are Temporarily Incapacitated. So cancer and depression both get you the same benefits.
Nor does it matter if he medicates you, or even believes you, so long as he signs the form and writes “depression.”(1) The doc can give you as much time off as he wants (6 months is typical) and you can return, repeatedly, to get another filled out. You can be on state medicaid and receive cash payments for up to 5 years. So as long as you show up to your psych appointments, you can receive benefits with no work obligation.
“That’s not how it works for me”
you might say, which brings us to the whole point: it’s not for you. It is for the entire class of people we label as poor, about whom comic Greg Giraldo joked: “it’s easy to forget there’s so much poverty in the United States, because the poor people look just like black people.” Include inner city whites and hispanics, and this is how the government fights the War On Poverty.
In the inner cities, the system is completely automated. Poor person rolls in to the clinic, fills out the paperwork (doc signs a stack of them at the end of the day), he sees a therapist, a doctor, +/- medications, and gets his benefits.
There’s no accountability, at all. I have never once been asked by the government whether the person deserved the money, the basis for my diagnosis– they don’t audit the charts, all that exists is my sig on a two page form. The system just is.
Enter SSI, Supplemental Security Income. You can earn lifetime SSI benefits (about $600/mo + medical insurance) if “you” can “show” you are “Permanently Disabled” due to a “medical illness.”
“You” = your doc who fills out a packet with specific questions; and maybe a lawyer who processes the massive amounts of other paperwork, and argues your case, and charges about 20% of a year’s award.
“show” has a very specific legal definition: whatever the judge feels like that day. I have been involved in thousands of these SSI cases, and to describe the system as arbitrary is to describe Blake Lively as “ordinary.”
“Permanently disabled” means the illness prevents you from ever working. “But what happens when you get cured?” What is this, the future? You can’t cure bipolar.
“Medical illness” means anything. The diagnosis doesn’t matter, only that “you” show how the diagnosis makes it impossible for you to work. Some diagnoses are easier than others, but none are impossible. “Unable to work” has specific meaning, and specific questions are asked: ability to concentrate, ability to complete a workweek, work around others, take criticism from supervisors, remember and execute simple/moderately difficult/complex requests and tasks, etc.
Fortunately, your chances of being awarded SSI are 100%…
It’s a good post. You should read the whole thing.
TLP’s point is not that the poor are uniformly mentally ill, but that our country is using the disability system as a means of routing money to poor people in order to pacify them (and maybe make their lives better.)
I’ve been playing a bit of sleight of hand, here, subbing in “poor” and “dumb.” But they are categories that highly overlap, given that dumb people have trouble getting jobs that pay well. Despite TLP’s point, many of the extremely poor are, by the standards of the middle class and above, mentally disabled. We know because they can’t keep a job and pay their bills on time.
“Disabled” is a harsh word to some ears. Who’s to say they aren’t equally able, just in different ways?
Living under a bridge isn’t being differently-abled. It just sucks.
Normativity bias happens when you assume that everyone else is just like you. Middle and upper-middle class people tend to assume that everyone else thinks like they do, and the exceptions, like guys who think the CIA is trying to communicate with them via the fillings in their teeth, are few and far between.
As for the vast legions of America’s unfortunates, they assume that these folks are basically just like themselves. If they aren’t very bright, this only means they do their mental calculations a little slower–nothing a little hard work, grit, mindfulness, and dedication can’t make up for. The fact that anyone remains poor, then, has to be the fault of either personal failure (immorality) or outside forces like racism keeping people down.
These same people often express the notion that academia or Mensa are crawling with high-IQ weirdos who can barely tie their shoes and are incapable of socializing with normal humans, to which I always respond that furries exist.
These people need to get out more if they think a guy successfully holding down a job that took 25 years of work in the same field to obtain and that requires daily interaction with peers and students is a “weirdo.” Maybe he wears more interesting t-shirts than a middle manager at BigCorp, but you should see what the Black Hebrew Israelites wear.
I strongly suspect that what we would essentially call “mental illness” among the middle and upper classes is far more common than people realize among the lower classes.
As I’ve mentioned before, there are multiple kinds of intellectual retardation. Some people suffer physical injuries (like shaken baby syndrome or encephalitis), some have genetic defects like Down’s Syndrome, and some are simply dull people born to dull parents. Intelligence is part genetic, so just as some people are gifted with lucky smart genes, some people are visited by the stupid fairy, who only leaves dumb ones. Life isn’t fair.
Different kinds of retardation manifest differently, with different levels of overall impairment in life skills. There are whole communities where the average person tests as mentally retarded, yet people in these communities go on providing for themselves, building homes, raising their children, etc. They do not do so in the same ways as we would–and there is an eternal chicken-and-egg debate about whether the environment they are raised in causes their scores, or their scores cause their environment–but nevertheless, they do.
All of us humans are descended from people who were significantly less intelligent than ourselves. Australopithecines were little smarter than chimps, after all. The smartest adult pygmy chimps (bonobos), like Kanzi, only know about 3,000 words, which is about the same as a 3 or 4 year old human. (We marvel that chimps can do things a kindergartener finds trivial, like turn on the TV.) Over the past few million years, our ancestors got a lot smarter.
How do chimps think about the world? We have no particular reason to assume that they think about it in ways that substantially resemble our own. While they can make tools and immediately use them, they cannot plan for tomorrow (dolphins probably beat them at planning.) They do not make sentences of more than a few words, much less express complex ideas.
Different humans (and groups of humans) also think about the world in very different ways from each other–which is horrifyingly obvious if you’ve spent any time talking to criminals. (The same people who think nerds are weird and bad at socializing ignore the existence of criminals, despite strategically moving to neighborhoods with fewer of them.)
Even non-criminal communities have all sorts of strange practices, including cannibalism, human sacrifice, wife burning, genital mutilation, coprophagy, etc. Anthropologists (and economists) have devoted a lot of effort to trying to understand and explain these practices as logical within their particular contexts–but a different explanation is possible: that different people sometimes think in very different ways.
For example, some people think there used to be Twa Pygmies in Ireland, before that nefarious St. Patrick got there and drove out all of the snakes. (Note: Ireland didn’t have snakes when Patrick arrived.)
(My apologies for this being a bit of a ramble, but I’m hoping for feedback from other people on what they’ve observed.)
For the past three days, I have been seized with a passion for cleaning and organizing the house that my husband describes as “a little scary.” So far I’ve found a missing hairbrush, the video camera, (it was in a lunchbox under some papers under some toys), and the floor; reorganized the bedroom, built a mini-chest of drawers out of cardboard, and returned my mother’s plates–and I’m not even pregnant.
A mere week ago, my limbs hurt whenever I moved. I wasn’t sad or depressed, but it simply felt like pushing boulders every time I needed to walk over to the kitchen.
I woke up this morning with high spirits, sore arms from carrying laundry and a question: is spring cleaning an instinct?
You don’t hear much about fall cleaning or winter cleaning. No one bothers with night cleaning or rainy day cleaning. Only Spring receives special mention for its burst of cleaning.
But the drops in estrogen and serotonin aren’t the only things that spur the desire to clean up. Before your period, your progesterone levels also drop, which combines the impulse to clean with an instinct to “nest.” We see this tendency manifest itself more dramatically in pregnant women, who in their later months of pregnancy have low progesterone levels — which often lead them to go into a frenzy of cleaning house and nesting in order to prepare for the baby.
The PMS-related drop in progesterone is a less-intense version of the same phenomenon.
Well, it’s no myth; winter causes us to be inherently less active and motivated. That’s right; your brain creates melatonin when there is less sunlight on cold dreary days, making you sleepy! Come spring, Mother Nature provides us a natural energy boost by giving us warmer weather and extra sunlight. The dreary days of snow are (hopefully) over and our natural instinct is to explore and interact with others. Although it may seem like a western tradition, cultures from all over the world have been spring cleaning for thousands of years.
Hopefully I can use this newfound energy to write more, because my posting has been deficient of late.
Window Genie (which I suspect is really a window-cleaning service) also notes that spring-cleaning is a cross-cultural phenomenon. I was just commenting on this myself, in a flurry of dish-washing. Do the Jews not clean thoroughly before Passover? Don’t they go through the house, removing all of the bits of old bread, vacuuming and sweeping and dusting to get out even the slightest bit of crumbs or stray yeast? Some even purchase a special feather and spoon kit to dust up the last few crumbs from the corners of the cupboards, then burn them. Burning seems a bit extreme, yet enjoyable–your cleaning is thoroughly done when you’ve burned the last of it.
I would be surprised if “spring cleaning” exists in places that effectively don’t have spring because their weather is warm all year long. Likely they have some other traditions, like “dry season dusting” or “annual migration.” (I find moving an especially effective way to motivate oneself to throw out excess belongings.)
It’s no secret that sales of cleaning and organizing products ramp up in spring, but the claim that our seasonal affection for washing is merely “cultural” is highly suspect–mere “culture” is an extremely ineffective way of getting me to do the laundry.
The claim that Spring Cleaning started in ancient Iran is even more nonsensical. This is simply mistaking the presence of written records in one place and not another for evidence that a tradition is older there. There is no cultural connection between modern American housewives vacuuming their carpets and ancient Iranian cleaning habits.
I do wish people wouldn’t say such idiotic things; I certainly didn’t work through dinner last night because of a love of Zoroaster. It is far more likely that I and the Persians–and millions of other people–simply find ourselves motivated by the same instincts. For we are all human, and humans, like all higher animals, make and arrange our shelters to suit our needs and convenience. The spider has her web, the snake his hole, the bee her hive. Chimps build nests and humans, even in the warmest of climates, build homes.
These homes must be kept clean, occasionally refreshed and rid of dust and disease-bearing parasites.
Like the circle of the seasons, let us end with the beginning, from The Wind in the Willows:
The Mole had been working very hard all morning, spring-cleaning his little home. First with brooms, then with dusters; then on ladders and steps and chairs, with a brush and a pail of whitewash; till he had dust in his throat and eyes and splashes of whitewash all over his black fur, and an aching back and weary arms. Spring was moving in the air above and in the earth below and around him, penetrating even his dark and lowly little house with its spirit of divine discontent and longing. It was small wonder, then, that he suddenly flung down his brush on the floor, said “Bother!” and “Oh blow!” and also “Hang spring cleaning!” and bolted out of the house without even waiting to put on his coat. Something up above was calling to him…
A “social construct”–in the context of groups of people–is just a stereotype. We’ll call it an “idealized version.” We learn this idealized version by interacting with many individual instances of a particular type of thing and learning to predict its typical behaviors and characteristics.
Suppose I asked you to draw a picture of a man and woman. Go ahead, if you want; then you can compare it to the draw-a-man test.
Out in reality, there are about 7 billion men and women; there is no way you drew someone who looks like all of them. Chances are you drew the man somewhat taller than the woman, even though in reality, there are millions of men and women who are the same height. You might have even drawn hair on the figures–long hair for the woman, short for the man–and some typical clothing, even though you know there are many men with long hair and women with short.
In other words, you drew an idealized version of the pair in order to make it clear to someone else what, exactly, you were drawing.
Our idealized pictures work because they are true on average. The average woman is shorter than the average man, so we draw the woman shorter than the man–even though we know perfectly well that short men exist.
Once an ideal exists, people (it seems) start using artificial means to try to achieve it (like wearing makeup,) which shifts the average, which in turn prompts people to take more extreme measures to meet that ideal.
This may lead to run-away beauty or masculinity trends that look completely absurd from the outside, like foot binding, adult circumcision rituals, or peacocks’ tails. Or breasts–goodness knows why we have them while not nursing.
Our idealized images work less well for people far from the average, or who don’t want to do the activities society has determined are necessary to meet the ideal.
Here’s an interesting survey of whether people (in this case, whites) consider themselves masculine or feminine, broken down by political orientation.
The same trend holds for women–conservative women are much more likely to consider themselves to be very feminine than liberal women. Of course, ideology has an effect on people’s views, but the opposite is probably also true–people who don’t feel like they meet gender ideals are more likely to think those ideals are problematic, while people who do meet them are more likely to think they are perfectly sensible.
And this sort of thinking applies to all sorts of groups–not just men and women. Conservatives probably see themselves as better encapsulating the ideal of their race, religion, nationality (not just American conservatives, but conservatives of all stripes,) while liberals are probably more likely to see themselves as further from these ideals. The chief exceptions are groups where membership is already pre-determined as liberal, like vegetarians.
This may also account for the tendency people have, especially of late, to fight over certain representations. An idealized representation of “Americans” may default to white, since whites are still the majority in this country, but our growing population of non-whites would also like to be represented. This leads to pushback against what would be otherwise uncontroversial depictions (and the people who fit the ideal are not likely to appreciate someone else trying to change it on them.)
There is strength in numbers, but is there wisdom?
I’ve heard from multiple sources the claim that parenting, paradoxically, gets easier after the fourth child. There are several simple explanations for this phenomenon: people get more skilled at parenting after lots of practice; the older kids start helping out with the younger ones, etc.
But what if the phenomenon rests on something much more basic about human psychology–our desire to imitate others?
(Perhaps you don’t, dear reader. There are always exceptions.)
As Aristotle put it, man is a political animal–by which he meant that we are inherently social and prone to building communities (polities) together, not that we are inherently prone to arguing about who should govern North Carolina, though that may be political, too. In Aristotle’s words, a man who lives entirely alone is either a beast (living like an animal) or a god (able to fulfill all of his own needs without recourse to other humans.) Normal humans depend in many ways on other humans.
Compared to our pathetic ability to learn math (just look at most people’s SAT-math scores) and inability to read without direct instruction, humans learn socially-imparted skills like the ability to speak multiple languages, play games, assert dominance over each other, which clothes are fashionable, and how to crack a socially-appropriate joke with ease.
Social learning comes so naturally to people that we only notice it in cases of extreme deficit–like autism–or when parents protest that their children are becoming horribly corrupted by their peers.
So perhaps households with four or more children have hit a threshold beyond which social learning takes over and the younger children simply seem to “absorb” knowledge from their older siblings instead of having to be explicitly taught.
Consider learning to eat, a hopefully simple task. We are born with instincts to nurse, put random things in our mouths, and swallow. Preventing babies from eating random non-food objects is a bit of a problem for new parents. But learning things like “how to get this squishy food into your mouth with a spoon without also getting it everywhere else in the room” is much more complicated–and humans take food rituals to much more complicated heights than strained peas and carrots.
First-time parents put a great deal of effort into teaching their children to eat (something that ought to be an instinct.) Those with means puree fresh veggies, chop bits of meat, show a sudden interest in organics, and sit down to spoon every single last bit into their infants’ mouths. It is as if they are convinced that kids cannot learn to eat without at least as much instruction as a student learning to wield a welding torch. (And based on my own experience, they’re probably right.)
By contrast, parents of multiple children have–by necessity–relaxed. As a popular comic once depicted (though I can’t find it now,) feeding at this point becomes throwing Cheerios at the highchair as you run by.
Yet I’ve never seen any evidence that the younger children in large families are likely to be malnourished–they seem to catch the Cheerios on the fly and do just fine.
What if imitation is a strong factor in larger families, allowing infants and young children to learn skills like “how to eat” without needing direct parental instruction, just by watching their older siblings? You might object that even infants in single parent households could learn to eat by imitating their parents (and they probably do,) but having more people around probably reinforces the behavior more strongly, and having younger children around gives an example that is much more similar to the infant. We adults are massive compared to children, after all.
If basic learning of life skills proceeds more easily in an environment with more peers (for infants or adults,) then what effects should we expect from our current trend toward extreme atomization?
To me, growing up in that trailer park meant playing until dark with neighborhood kids, building tree houses and snow forts. Listening out my bedroom window for the sound of my dad’s pickup truck leaving for work in the early morning. Riding my bike down the big hill at the top of the lot, avoiding potholes and feeling safe because there wasn’t much traffic and if I fell and skinned my knee, someone would come out on their front porch and ask if I was okay.
Some of the only happy memories I have of my childhood were from that time in my life, before my parents were thrust into insurmountable debt, before my mother was hospitalized, before I had to go live with my grandmother. Nana had a real house. She didn’t live in a trailer. But when she would scream at me or try to attack me as I squeezed by her and fled upstairs, I wished I had neighbors close by to hear her — to believe me, and to perhaps even help.
The most dysfunctional and unstable years of my life were spent in a real house, with four walls and a slanted roof — where fences went up between the houses so that no one ever had to feel responsible for what went on behind their neighbor’s front door.
This is more about atomization than learning, but still interesting. Is it good for humans to be so far apart? To live far from relatives, in houses with thick walls, as single children or single adults, working and commuting every day among strangers?
Certainly the downsides of being among relatives are well-documented. Many tribal societies have downright cruel customs directed at relatives, like sati or adult circumcision. But that doesn’t mean that the extreme opposite–total atomization–is perfect. Atomization carries other risks. Among them, staying indoors and not socializing with our neighbors may cause us to lose some of our social knowledge, our ability to learn how to exist together.
We might expect that physical atomization due to technological change (sturdier houses, more entertaining TV, comfier climate control systems,) could cause symptoms in people similar to those caused by medical deficits in social learning, like autism. A recent study on the subject found an interesting variation between the brains of normies and autists:
So great was the difference between the two groups that the researchers could identify whether a brain was autistic or neurotypical in 33 out of 34 of the participants—that’s 97% accuracy—just by looking at a certain fMRI activation pattern. “There was an area associated with the representation of self that did not activate in people with autism,” Just says. “When they thought about hugging or adoring or persuading or hating, they thought about it like somebody watching a play or reading a dictionary definition. They didn’t think of it as it applied to them.” This suggests that in autism, the representation of the self is altered, which researchers have known for many years, Just says.
This might explain the high rates of body dysmorphia in autism. It might also explain the high rates of body dysmorphia in society at large.
I remember another study, which I read ages ago, that found that people basically thought about “God” in the same parts of their brain where they thought about themselves. This explains why God tends to have the same morals as His believers. If autists have trouble imagining themselves, then they may also have trouble imagining God–and this might explain rising atheism rates.
Even our rising autism rates, though probably driven primarily by shifts in diagnostic fads, might be influenced by shrinking families and greater atomization, as kids with borderline conditions might show more severe symptoms if they are also more isolated.
On the other hand, social media is allowing people to come together and behave socially in new and ever larger groups.
For all their weaknesses, autists are probably better than normies at certain kinds of tasks, like abstract reasoning where you don’t want to think too much about yourself. I have long suspected that normies balk at philosophical dilemmas such as the trolley problem because they over-empathize with the subjects. Imagining themselves as one of the victims of the runaway trolley causes them distress, and distress causes them to attack the person causing them distress–the philosopher.
And so the citizens of Athens condemned Socrates to death.
But just as people can overcome their natural and very sensible fear of heights in order to work on skyscrapers, perhaps they can train themselves not to empathize with the subjects of trolley problems. Spending time on problems with no human subjects (such as mathematics or engineering) may also help people practice ways of approaching problems that don’t immediately resort to imagining themselves as the subject. Conversely, perhaps a bit of atomization (as seen historically in countries like Britain and France, and recently AFAIK in Japan,) helps equip people to think about difficult, non-human-related mathematical or engineering problems.
This is a short quote from E. O. Wilson’s Sociobiology that I deleted from the previous post for being a little tangential, but it is still interesting:
Guppies (Lebistes reticulatus) are well known for the stabilization of their populations in aquaria by the consumption of their excess young.
So that’s what happened to my pet fish! I always wondered why they seemed to appear and disappear at random. It wasn’t a big enough bowl to logically be losing them in.
Um. Poor guppies.
“Cannibalism is commonplace in the social insects, where it serves as a means of conserving nutrients as well as a precise mechanism for regulating colony size. The colonies of all termite species so far investigated promptly eat their own dead and injured. Cannibalism is in fact so pervasive in termites that it can be said to be a way of life in these insects. …
The eating of immature stages is common in the social Hymenoptera.
Hymenoptera is an order of insects with over 150,000 species, including ants and bees. (Termites, despite also being social, are not members of hymenoptera, and are more closely related to cockroaches.)
Among most or all hymenopterans, sex is determined by the number of chromosomes an individual possesses. Fertilized eggs get two sets of chromosomes (one from each parent’s respective gametes) and develop into diploid females, while unfertilized eggs only contain one set (from the mother) and develop into haploid males. The act of fertilization is under the voluntary control of the egg-laying female, giving her control of the sex of her offspring. This phenomenon is called haplodiploidy.
However, the actual genetic mechanisms of haplodiploid sex determination may be more complex than simple chromosome number. In many Hymenoptera, sex is actually determined by a single gene locus with many alleles. In these species, haploids are male and diploids heterozygous at the sex locus are female, but occasionally a diploid will be homozygous at the sex locus and develop as a male, instead. This is especially likely to occur in an individual whose parents were siblings or other close relatives. Diploid males are known to be produced by inbreeding in many ant, bee, and wasp species. Diploid biparental males are usually sterile but a few species that have fertile diploid males are known.
One consequence of haplodiploidy is that females on average actually have more genes in common with their sisters than they do with their own daughters. Because of this, cooperation among kindred females may be unusually advantageous, and has been hypothesized to contribute to the multiple origins of eusociality within this order. In many colonies of bees, ants, and wasps, worker females will remove eggs laid by other workers due to increased relatedness to direct siblings, a phenomenon known as worker policing.
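That relatedness arithmetic is easy to check numerically. Here is a quick Monte Carlo sketch of haplodiploid inheritance (my own illustration, not from Wilson; the allele labels and locus count are arbitrary): the haploid father passes his single allele to every daughter, while the diploid mother passes one of her two alleles at random.

```python
import random

def simulate_relatedness(n_loci=100_000, seed=0):
    """Monte Carlo check of relatedness under haplodiploidy.

    The mother is diploid (two distinct alleles per locus); the
    father is haploid (one allele). Each daughter inherits one
    random maternal allele plus the father's single allele.
    "Relatedness" here = fraction of a daughter's alleles that
    are identical by descent with the comparison individual.
    """
    rng = random.Random(seed)
    shared_with_sister = 0
    shared_with_mother = 0
    mother = ("M1", "M2")   # the mother's two alleles at a locus
    father = "P"            # the father's single allele
    for _ in range(n_loci):
        sis1 = (rng.choice(mother), father)
        sis2 = (rng.choice(mother), father)
        # count how many of sis1's two alleles appear in sis2 / in mother
        shared_with_sister += sum(a in sis2 for a in sis1)
        shared_with_mother += sum(a in mother for a in sis1)
    return (shared_with_sister / (2 * n_loci),
            shared_with_mother / (2 * n_loci))

r_sister, r_mother = simulate_relatedness()
# r_sister comes out near 0.75: the paternal allele is always shared,
# the maternal allele half the time. r_mother is 0.5 in this model.
```

The paternal half of the genome is identical in every daughter, because the haploid father has only one copy of each gene to give; that is the entire source of the extra sister–sister relatedness (3/4) over mother–daughter relatedness (1/2).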
Another consequence is that hymenopterans may be more resistant to the deleterious effects of inbreeding. As males are haploid, any recessive genes will automatically be expressed, exposing them to natural selection. Thus, the genetic load of deleterious genes is purged relatively quickly.
Back to Wilson:
In ant colonies, all injured eggs, larvae, and pupae are quickly consumed. When colonies are starved, workers begin attacking healthy brood as well. In fact, there exists a direct relation between colony hunger and the amount of brood cannibalism that is precise enough to warrant the suggestion that the brood functions normally as a last-ditch food supply to keep the queen and workers alive. In the army ants of the genus Eciton, cannibalism has apparently been further adapted to the purposes of caste determination. According to Schneirla (1971), most of the female larvae in the sexual generation (the generation destined to transform into males and queens) are consumed by workers. The protein is converted into hundreds or thousands of males and several of the very large virgin queens. It seems to follow, but is far from proved, that female larvae are determined as queens by this special protein-rich diet. Other groups of ants, bees, and wasps show equally intricate patterns of specialized cannibalism…
Nomadic male lions of the Serengeti plains frequently invade the territories of prides and drive away or kill the resident males. The cubs are also sometimes killed and eaten during territorial disputes. … Infant mortality is much higher as a result of the disturbances [in the social order of langurs.] In the case of P. entellus, [a langur species,] the young are actually murdered by the usurper…