Before the rodeo [Terry Hawkins] had graduated out of the fields to the position of fry cook. It was better than being A.D.H.D. (A Dude with a Hoe and a Ditch)–after stirring fried rice or flipping hotcakes on a stove ten feet long, he could grill hamburgers, bag them, and stuff them down his pants to sell in the dorm. Sometimes he snuck out with fried chicken under his shirt and cuts of cheese in his socks. Payment came in cigarettes, the prison’s currency. Later he would stand outside the canteen and trade a few packs for shampoo or soap or deodorant, or “zoo-zos”–snacks of candy bars or sardines. He knew which guards would allow the stealing, the selling. He made sure to send them plates of fried chicken.
While reading this I thought, “This man has, at least, something to offer his neighbors. He can sell them food, something they’re grateful for. The guy with cheese in his socks and hamburgers in his pants is probably a respected member of his community.”
What do I have to offer my neighbors? I have skills, but they’re only of interest to a corporate employer, my boss. I don’t make anything for sale. I can’t raise a barn or train a horse, and even if I could, my neighbors don’t need these services. Even if I had milk for sale from my personal cow, my neighbors would still prefer to buy their milk at the grocery store.
All of these needs that we used to fill by interacting with our neighbors are now routed through multinational corporations that build their products in immense sweatshops in foreign countries.
I don’t even have to go to the store to buy things if I don’t want to–I can order things online, even groceries.
Beyond the economic, modern prosperity has also eliminated many of the ways (and places) people used to interact. As Lewis Mumford recounts (H/T Wrath of Gnon):
To sum up the medieval dwelling house, one may say that it was characterized by lack of differentiated space and differentiated function. In the cities, however, this lack of internal differentiation was offset by a completer development of domestic functions in public institutions. Though the house might lack a private bake-oven, there was a public one at the baker’s or the cook-shop. Though it might lack a private bathroom, there was a municipal bath-house. Though it might lack facilities for isolating and nursing a diseased member, there were numerous public hospitals. … As long as the conditions were rude–when people lived in the open, pissed freely in the garden or the street, bought and sold outdoors, opened their shutters and let in full sunlight–the defects of the house were far less serious than they were under a more refined regime.
Without all of the little, daily things that naturally brought people into contact with each other and knit them into communities, we simply have far fewer reasons to talk. We might think that people could simply make up for these changes by inventing new, leisure-oriented reasons to interact with each other, but so far, they’re struggling:
Americans’ circle of confidants has shrunk dramatically in the past two decades and the number of people who say they have no one with whom to discuss important matters has more than doubled, according to a new study by sociologists at Duke University and the University of Arizona.
It compared data from 1985 and 2004 and found that the mean number of people with whom Americans can discuss matters important to them dropped by nearly one-third, from 2.94 people in 1985 to 2.08 in 2004.
Researchers also found that the number of people who said they had no one with whom to discuss such matters more than doubled, to nearly 25 percent. The survey found that both family and non-family confidants dropped, with the loss greatest in non-family connections.
I don’t know about you, but I just don’t trust most people, and most people have given me no reason to trust them.
(The bread of slavery, they say, is far sweeter than the bread of freedom.)
Children were born, safe from wolves, hunger, or cold
and you grew used to man.
And it seemed you outnumbered the stars
Perhaps your sons disappeared
But was it worse than wolves?
You could almost forget you were once wild
Could you return to the mountains, even if you wanted to?
And as they lead you away
Did I ever have a choice?
To explain: The process of domestication is fascinating. Some animals, like wolves, began associating with humans because they could pick up our scraps. Others, like cats, began living in our cities because they liked eating the vermin we attracted. (You might say the mice, too, are domesticated.) These relationships are obviously mutually beneficial (aside from the mice.)
The animals we eat, though, have a different–more existential–story.
Humans increased the number of wild goats and sheep available for them to eat by eliminating competing predators, like wolves and lions. We brought them food in the winter, built them shelters to keep them warm, and led them to the best pastures. As a result, their numbers increased.
But, of course, we eat them.
From the goat’s perspective, is it worth it?
There’s a wonderful metaphor in the Bible, enacted every Passover: matzoh.
If you’ve never had it, matzoh tastes like saltines, only worse. It’s the bread of freedom, hastily thrown on the fire and carried away.
The bread of slavery tastes delicious. The bread of freedom tastes awful.
1And they took their journey from Elim, and all the congregation of the children of Israel came unto the wilderness of Sin, which is between Elim and Sinai, on the fifteenth day of the second month after their departing out of the land of Egypt. 2And the whole congregation of the children of Israel murmured against Moses and Aaron in the wilderness: 3And the children of Israel said unto them, Would to God we had died by the hand of the LORD in the land of Egypt, when we sat by the flesh pots, and when we did eat bread to the full… Exodus 16
Even if the goats didn’t want to be domesticated, hated it and fought against it, did they have any choice? If the domesticated goats have more surviving children than wild ones, then goats will become domesticated. It’s a simple matter of numbers:
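The "simple matter of numbers" can be sketched in a few lines. This is a toy model with made-up growth rates, not measured data: it only illustrates that even a modest reproductive advantage for domesticated animals makes them the majority within a few dozen generations.

```python
# Toy model of domestication by differential reproduction.
# "Tame" goats leave slightly more surviving offspring per
# generation than "wild" goats; the growth rates below are
# illustrative assumptions, not real figures.

def tame_fraction(generations, tame_growth=1.10, wild_growth=1.00,
                  tame0=10.0, wild0=990.0):
    """Fraction of the herd that is tame after n generations."""
    tame, wild = tame0, wild0
    for _ in range(generations):
        tame *= tame_growth   # tame goats: 10% more offspring per generation
        wild *= wild_growth   # wild goats: replacement only
    return tame / (tame + wild)

# Starting at just 1% tame, the tame goats are the majority
# within about 50 generations.
print(round(tame_fraction(0), 2))   # → 0.01
print(round(tame_fraction(50), 2))  # → 0.54
```

No goat has to "choose" anything; the arithmetic of compounding growth does all the work.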
Welcome back to our discussion of recent exciting advances in our knowledge of human evolution:
Ancient hominins in the US?
Humans evolved in Europe?
In two days, first H Sap was pushed back to 260,000 years,
then to 300,000 years!
Bell beaker paper
As we’ve been discussing for the past couple of weeks, the exact dividing line between “human” and “non-human” isn’t always hard and fast. The very first Homo species, such as Homo habilis, undoubtedly had more in common with its immediate Australopithecine ancestors than with today’s modern humans, 3 million years later, but that doesn’t mean these dividing lines are meaningless. Homo sapiens and Homo neanderthalensis, while considered different species, interbred and produced fertile offspring (most non-Africans have 3-5% Neanderthal DNA as a result of these pairings;) by contrast, humans and chimps cannot produce fertile offspring, because humans and chimps have a different number of chromosomes. The genetic distance between the two groups is just too far.
The grouping of ancient individuals into Homo or not-Homo, Erectus or Habilis, Sapiens or not, is partly based on physical morphology–what they looked like, how they moved–and partly based on culture, such as the ability to make tools or control fire. While australopithecines made some stone tools (and chimps can make tools out of twigs to retrieve tasty termites from nests,) Homo habilis (“handy man”) was the first to master the art and produce large numbers of more sophisticated tools for different purposes, such as this Oldowan chopper.
But we also group species based on moral or political beliefs–scientists generally believe it would be immoral to say that different modern human groups belong to different species, and so the date when Homo ergaster transforms into Homo sapiens is dependent on the date when the most divergent human groups alive today split apart–no one wants to come up with a finding that will get trumpeted in media as “Scientists Prove Pygmies aren’t Human!” (Pygmies already have enough problems, what with their immediate neighbors actually thinking they aren’t human and using their organs for magic rituals.)
(Of course they would still be Human even if they were part of an ancient lineage.)
But if an ecologically-minded space alien arrived on earth back in 1490 and was charged with documenting terrestrial species, it might easily decide–based on morphology, culture, and physical distribution–that there were several different Homo “species” which all deserve to be preserved.
But we are not space aliens, and we have the concerns of our own day.
So when a paper was published last year on archaic admixture in Pygmies and the Pygmy/Bushmen/everyone else split, West Hunter noted the authors used a fast–but discredited–estimate of mutation rate to avoid the claim that Pygmies split off 300,000 years ago, 100,000 years before the emergence of Homo sapiens:
There are a couple of recent papers on introgression from some quite divergent archaic population into Pygmies ( this also looks to be the case with Bushmen). Among other things, one of those papers discussed the time of the split between African farmers (Bantu) and Pygmies, as determined from whole-genome analysis and the mutation rate. They preferred to use the once-fashionable rate of 2.5 x 10-8 per-site per-generation (based on nothing), instead of the new pedigree-based estimate of about 1.2 x 10-8 (based on sequencing parents and child: new stuff in the kid is mutation). The old fast rate indicates that the split between Neanderthals and modern humans is much more recent than the age of early Neanderthal-looking skeletons, while the new slow rate fits the fossil record – so what’s to like about the fast rate? Thing is, using the slow rate, the split time between Pygmies and Bantu is ~300k years ago – long before any archaeological sign of behavioral modernity (however you define it) and well before the first known fossils of AMH (although that shouldn’t bother anyone, considering the raggedness of the fossil record).
Southern Africa is consistently placed as one of the potential regions for the evolution of Homo sapiens. To examine the region’s human prehistory prior to the arrival of migrants from East and West Africa or Eurasia in the last 1,700 years, we generated and analyzed genome sequence data from seven ancient individuals from KwaZulu-Natal, South Africa. Three Stone Age hunter-gatherers date to ~2,000 years ago, and we show that they were related to current-day southern San groups such as the Karretjie People. Four Iron Age farmers (300-500 years old) have genetic signatures similar to present day Bantu-speakers. The genome sequence (13x coverage) of a juvenile boy from Ballito Bay, who lived ~2,000 years ago, demonstrates that southern African Stone Age hunter-gatherers were not impacted by recent admixture; however, we estimate that all modern-day Khoekhoe and San groups have been influenced by 9-22% genetic admixture from East African/Eurasian pastoralist groups arriving >1,000 years ago, including the Ju|’hoansi San, previously thought to have very low levels of admixture. Using traditional and new approaches, we estimate the population divergence time between the Ballito Bay boy and other groups to beyond 260,000 years ago.
260,000 years! Looks like West Hunter was correct, and we should be looking at the earlier Pygmy divergence date, too.
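The quantitative point in the West Hunter quote is easy to verify: for a fixed observed genetic divergence, the inferred split time scales inversely with the assumed mutation rate, so switching from the old fast rate to the new slow pedigree-based rate roughly doubles the date. A minimal sketch (the divergence value `d` and the generation length are hypothetical illustrations, not numbers taken from either paper):

```python
# Inferred split time from per-site divergence d and per-generation
# mutation rate mu: divergence accumulates on both branches, so
# t = d / (2 * mu) generations.

GENERATION_YEARS = 29  # an assumed human generation length

def split_time_years(divergence_per_site, mu):
    """Population split time implied by a given mutation rate, in years."""
    return divergence_per_site / (2 * mu) * GENERATION_YEARS

d = 2.4e-4              # hypothetical per-site divergence between two populations
old_fast_rate = 2.5e-8  # the once-fashionable rate
new_slow_rate = 1.2e-8  # the pedigree-based estimate

print(round(split_time_years(d, old_fast_rate)))  # → 139200
print(round(split_time_years(d, new_slow_rate)))  # → 290000
```

Same data, different assumed rate: the slow rate pushes the split back by a factor of about 2.1, which is exactly why the choice of mutation rate decides whether the Pygmy/Bantu split lands before or after the conventional emergence of Homo sapiens.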
Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure … In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens1 or evolved gradually over the last 400 thousand years2. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating)3, this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established.
Hublin–one of the study’s coauthors–notes that between 330,000 and 300,000 years ago, the Sahara was green and animals could range freely across it.
While the Moroccan fossils do look like modern H sapiens, they also still look a lot like pre-sapiens, and the matter is still up for debate. Paleoanthropologist Chris Stringer suggests that we should consider all of our ancestors after the Neanderthals split off to be Homo sapiens, which would make our species 500,000 years old. Others would undoubtedly prefer to use a more recent date, arguing that the physical and cultural differences between 500,000 year old humans and today’s people are too large to consider them one species.
According to the Atlantic:
[The Jebel Irhoud] people had very similar faces to today’s humans, albeit with slightly more prominent brows. But the backs of their heads were very different. Our skulls are rounded globes, but theirs were lower on the top and longer at the back. If you saw them face on, they could pass for a modern human. But if they turned around, you’d be looking at a skull that’s closer to extinct hominids like Homo erectus. “Today, you wouldn’t be able to find anyone with a braincase that shape,” says Gunz.
Their brains, though already as large as ours, must also have been shaped differently. It seems that the size of the human brain had already been finalized 300,000 years ago, but its structure—and perhaps its abilities—were fine-tuned over the subsequent millennia of evolution.
No matter how we split it, these are exciting days in the field!
This all occasioned some very annoying conversations along the lines of “White skin tone couldn’t possibly have evolved within the past 20,000 years because humans evolved in Europe! Don’t you know anything about science?”
Ohkay. Let’s step back a moment and take a look at what Graecopithecus is and what it isn’t.
This is Graecopithecus:
I think there is a second jawbone, but that’s basically it–and that’s not six teeth, that’s three teeth, shown from two different perspectives. There’s no skull, no shoulder blades, no pelvis, no legs.
By contrast, here are Lucy, the famous Australopithecus from Ethiopia, and a sample of the over 1,500 bones and pieces of Homo naledi recently recovered from a cave in South Africa.
Now, given what little scientists had to work with, the fact that they managed to figure out anything about Graecopithecus is quite impressive. The study, reasonably titled “Potential hominin affinities of Graecopithecus from the Late Miocene of Europe,” by Jochen Fuss, Nikolai Spassov, David R. Begun, and Madelaine Böhme, used μCT and 3D reconstructions of the jawbones and teeth to compare Graecopithecus’s teeth to those of other apes. They decided the teeth were different enough to distinguish Graecopithecus from the nearby but older Ouranopithecus, while looking more like hominin teeth:
G. freybergi uniquely shares p4 partial root fusion and a possible canine root reduction with this tribe and therefore, provides intriguing evidence of what could be the oldest known hominin.
My hat’s off to the authors, but not to all of the reporters who dressed up “teeth look kind of like hominin teeth” as “Humans evolved in Europe!”
First of all, you cannot make that kind of jump based off of two jawbones and a handful of teeth. Many of the hominin species we have recovered–such as Homo naledi and Homo floresiensis, as you know if you already read the previous post–possessed a mosaic of “ape like” and “human like” traits, i.e.:
The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.
If we only had six Homo naledi bones instead of 1,500 of them, we might be looking only at the part that looks like an Australopithecus instead of the parts that look like H. erectus or totally novel. You simply cannot make that kind of claim off a couple of jawbones. You’re far too likely to be wrong, and then not only will you end up with egg on your face, but you’ll only be giving more fuel to folks who like to proclaim that “Nebraska Man turned out to be a pig!”:
In February 1922, Harold Cook wrote to Dr. Henry Osborn to inform him of the tooth that he had had in his possession for some time. The tooth had been found years prior in the Upper Snake Creek beds of Nebraska along with other fossils typical of North America. … Osborn, along with Dr. William D. Matthew soon came to the conclusion that the tooth had belonged to an anthropoid ape. They then passed the tooth along to William K. Gregory and Dr. Milo Hellman who agreed that the tooth belonged to an anthropoid ape more closely related to humans than to other apes. Only a few months later, an article was published in Science announcing the discovery of a manlike ape in North America. An illustration of H. haroldcookii was done by artist Amédée Forestier, who modeled the drawing on the proportions of “Pithecanthropus” (now Homo erectus), the “Java ape-man,” for the Illustrated London News. …
Examinations of the specimen continued, and the original describers continued to draw comparisons between Hesperopithecus and apes. Further field work on the site in the summers of 1925 and 1926 uncovered other parts of the skeleton. These discoveries revealed that the tooth was incorrectly identified. According to these discovered pieces, the tooth belonged neither to a man nor an ape, but to a fossil of an extinct species of peccary called Prosthennops serus.
That basically sums up everything I learned about human evolution in high school.
Second, “HUMANS” DID NOT EVOLVE 7 MILLION YEARS AGO.
Scientists define “humans” as members of the genus Homo, which emerged around 3 million years ago. These are the guys with funny names like Homo habilis, Homo neanderthalensis, and the embarrassingly named Homo erectus. The genus also includes ourselves, Homo sapiens, who emerged around 200-300,000 years ago.
Homo habilis descended from an Australopithecus, perhaps Lucy herself. Australopithecines are not in the Homo genus; they are not “human,” though they are more like us than modern chimps and bonobos are. They evolved around 4 million years ago.
Regardless, humans didn’t evolve 7 million years ago. Sahelanthropus and even Lucy do not look like anyone you would call “human.” Humans have only been around for about 3 million years, and our own specific species is only about 300,000 years old. Even if Graecopithecus turns out to be the missing link–the true ancestor of both modern chimps and modern humans–that still does not change where humans evolved, because Graecopithecus narrowly missed being a human by 4 million years.
If you want to challenge the Out of Africa narrative, I think you’d do far better arguing for a multi-regional model of human evolution that includes back-migration of H. erectus into Africa and interbreeding with hominins there as spurring the emergence of H. sapiens than arguing about a 7 million year old jawbone. (I just made that up, by the way. It has no basis in anything I have read. But it at least has the right characters, in the right time frame, in a reasonable situation.)
Sorry this was a bit of a rant; I am just rather passionate about the subject. Next time we’ll examine very exciting news about Bushmen and Pygmy DNA!
There are three categories of superstars who seem to attract excessive female interest. The first is actors, who of course are selected for being abnormally attractive and put into romantic and exciting narratives that our brains subconsciously interpret as real. The second are sports stars and other athletes, whose ritualized combat and displays of strength obviously indicate their genetic “fitness” for siring and providing for children.
The third and strangest category is professional musicians, especially rock stars.
I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?
There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display musical tastes on dating profiles in order to meet musically-like-minded people.
Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as, “Anything but rap,” or “Anything but country”?
At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids and the aforementioned rap and country fans.
Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along with other slow jazz fans better than fans of classical Indian? Or is this all compounded by different ethnic groups identifying with specific musical styles?
Obviously country correlates with Amerikaner ancestry; rap with African American. I’m not sure which ancestry produces the biggest fans of Die Antwoord. Heavy Metal is popular in Finno-Scandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.
While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.
Here are two excerpts on the subject of music and dance in African societies:
Both of these h/t HBD Chick and my apologies in advance if I got the sources reversed.
One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.
If traditional African dancing involves more variety in rhythmic expression than traditional European, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people chose their own mates, it seems like they display a preference for athletic or musically talented mates (“sexy” mates;) when parents chose mates, they seem to prefer hard-working, devout, “good providers.”
Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.
It is undoubtedly true that music stirs the soul and speaks to the heart, but why?
It’s been a rough day. So I’m going to complain about something totally mundane: salads.
I was recently privy to a conversation between two older women on why it is so hard to stay thin in the South: lack of good salads. Apparently when you go to a southern restaurant, they serve a big piece of meat (often deep-fried steak) a lump of mashed potatoes and gravy, and a finger-bowl with 5 pieces of iceberg lettuce, an orange tomato, and a slathering of dressing.
Sounds good to me.
Now, if you like salads, that’s fine. You’re still welcome here. Personally, I just don’t see the point. The darn things don’t have any calories!
From an evolutionary perspective, obviously food provides two things: calories and nutrients. There may be some foods that are mostly calorie but little nutrient (e.g., honey) and some foods that are nutrient but no calorie (salt isn’t exactly a food, but it otherwise fits the bill.)
Food doesn’t seem like it should be that complicated–surely we’ve evolved to eat effectively by now. So any difficulties we have (besides just getting the food) are likely us over-thinking the matter. There’s no problem getting people to eat high-calorie foods, because they taste good. It’s also not hard to get people to eat salt–it also tastes good.
But people seem to have this ambivalent relationship with salads. What’s so important about eating a bunch of leaves with no calories and a vaguely unpleasant flavor? Can’t I just eat a nice potato? Or some corn? Or asparagus?
Don’t get me wrong. I don’t hate vegetables. Just everything that goes in a salad. Heck, I’ll even eat most salad fixins if they’re cooked. I won’t turn down fried green tomatoes, you know.
While there’s nothing wrong with enjoying a bowl of lettuce if that’s your thing, I think our society has gone down a fundamentally wrong collective path when it comes to nutrition wisdom. The idea here is that your hunger drive is this insatiable beast that will force you to consume as much food as possible, making you overweight and giving you a heart attack, and so the only way to save yourself is to trick the beast by filling your stomach with fluffy, zero-calorie plants until there isn’t any more room.
This seems to me like the direct opposite of what you should be doing. See, I assume your body isn’t an idiot, and can figure out whether you’ve just eaten something full of calories, and so should go sleep for a bit, or if you just ate some leaves and should keep looking for food.
I recently tried increasing the amount of butter I eat each day, and the result was I felt extremely full and didn’t want to eat dinner. Butter is a great way to almost arbitrarily increase the amount of calories per volume of food.
If you’re wondering about my weight, well, let’s just say that despite the butter, never going on a diet, and abhorring salads, I’m still not overweight–but this is largely genetic. (I should note though that I don’t eat many sweets at all.)
Obviously I am not a nutritionist, a dietician, nor a doctor. I’m not a good source for health advice. But it seems to me that increasing or decreasing the number of sweets you eat per day probably has a bigger impact on your overall weight than adding or subtracting a salad.
As a parent, I spend much of my day attempting to “socialize” my kids–“Don’t hit your brother! Stop jumping on the couch! For the umpteenth time, ‘yeah, right!’ is sarcasm.”
There are a lot of things that don’t come naturally to little kids. Many of them struggle to understand that these wiggly lines on paper can turn into words or that tiny, invisible things on their hands can make them sick.
“Yes, you have to brush your teeth and go to bed, no, I’m not explaining why again.”
And they definitely don’t understand why I won’t let them have ice cream for dinner.
“Don’t ride your bike down the hill and into the street like that! You could get hit by a car and DIE!”
Despite all of the effort I have devoted to transforming this wiggly bunch of feral children into respectable adults (someday, I hope,) I have never found myself concerned with the task of teaching them about gender. As a practical matter, whether the children behave like “girls” or “boys” makes little difference to the running of the household, because we have both–by contrast, whether the children put their dishes away after meals and do their homework without me having to threaten or cajole them makes a big difference.
Honestly, I can’t convince them not to pick their noses in public or that broccoli is tasty, but I’m supposed to somehow subtly convince them that they’ve got to play Minecraft because they’re boys (even while explicitly saying, “Hey, you’ve been playing that for two hours, go ride your bike,”) or that they’re supposed to be walking doormats because they’re girls (even while saying, “Next time he pushes you, push him back!”)
And yet the boys still act like boys, the girls like girls–statistically speaking.
“Ah,” I hear some of you saying, “But you are just one parent! How do you know there aren’t legions of other parents who are out there doing everything they can to ensure that their sons succeed and daughters fail in life?”
This is, if you will excuse me, a very strange objection. What parent desires failure from their children?
People have long wondered if language is an instinct. If you raised a child without speaking to it, would it begin spontaneously speaking? I hear some folks tried this experiment on an orphanage, wondering if the babies would start spontaneously speaking German or French or whatever, and all of the babies died due to neglect.
Of course they wouldn’t have started speaking even if they’d survived. We have no “speak French” instinct, but we do have an instinct to imitate the funny sounds other people make and possibly to “babble”–even deaf babies will “babble” in sign language.
Oh, I found the experiment (I think.) Looks like it’s a lot older than I thought. According to Wikipedia:
The experiments were recorded by the monk Salimbene di Adam in his Chronicles, who wrote that Frederick encouraged “foster-mothers and nurses to suckle and bathe and wash the children, but in no ways to prattle or speak with them; for he would have learnt whether they would speak the Hebrew language (which he took to have been the first), or Greek, or Latin, or Arabic, or perchance the tongue of their parents of whom they had been born. But he laboured in vain, for the children could not live without clappings of the hands, and gestures, and gladness of countenance, and blandishments.”
That said, we likely do have some instincts related to language acquisition.
If everyone in the world exhibits a particular behavior, chances are it’s innate. But I have been informed–by Harvard-educated people, no less–that humans do not have instincts. We are so smart, you see, that we don’t need instincts anymore.
This is nonsense, of course.
One amusing and well-documented human instinct is the nesting instinct, experienced by pregnant women shortly before going into labor. (As my father put it, “When she starts rearranging the furniture, get ready to head to the hospital.”) Having personally experienced this sudden, overwhelming urge to CLEAN ALL THE THINGS multiple times, I can testify that it is a real phenomenon.
Humans have other instincts–babies will not only pick up and try to eat pretty much anything they run across, to every parent’s consternation, but they will also crawl right up to puddles and attempt to drink out of them.
But we’re getting ahead of ourselves: What, exactly, is an instinct? According to Wikipedia:
Instinct or innate behavior is the inherent inclination of a living organism towards a particular complex behavior. The simplest example of an instinctive behavior is a fixed action pattern (FAP), in which a very short to medium length sequence of actions, without variation, are carried out in response to a clearly defined stimulus.
Any behavior is instinctive if it is performed without being based upon prior experience (that is, in the absence of learning), and is therefore an expression of innate biological factors. …
Instincts are inborn complex patterns of behavior that exist in most members of the species, and should be distinguished from reflexes, which are simple responses of an organism to a specific stimulus, such as the contraction of the pupil in response to bright light or the spasmodic movement of the lower leg when the knee is tapped.
The go-to example of an instinct is the gosling’s imprinting instinct. Typically, goslings imprint on their mothers, but a baby gosling doesn’t actually know what its mother is supposed to look like, and can accidentally imprint on other random objects, provided they are moving slowly around the nest around the time the gosling hatches.
Here we come to something I think may be useful for distinguishing an instinct from other behaviors: an instinct, once triggered, tends to keep going even if it has been accidentally or incorrectly triggered. Goslings look like they have an instinct to follow their mothers, but they actually have an instinct to imprint on the first large, slowly moving object near their nest when they hatch.
So if you find people strangely compelled to do something that makes no sense but which everyone else seems to think makes perfect sense, you may be dealing with an instinct. For example, women enjoy celebrity gossip because humans have an instinct to keep track of social ranks and dynamics within their own tribe; men enjoy watching other men play sports because it conveys the vicarious feeling of defeating a neighboring tribe at war.
So what about racism? Is it an instinct?
Strictly speaking–and I know I have to define racism, just a moment–I don’t see how we could have evolved such an instinct. Races exist because major human groups were geographically separated for thousands of years–prior to 1492, the average person never even met a person of another race in their entire life. So how could we evolve an instinct in response to something our ancestors never encountered?
Unfortunately, “racism” is a chimera, always changing whenever we attempt to pin it down, but the Urban Dictionary gives a reasonable definition:
An irrational bias towards members of a racial background. The bias can be positive (e.g. one race can prefer the company of its own race or even another) or it can be negative (e.g. one race can hate another). To qualify as racism, the bias must be irrational. That is, it cannot have a factual basis for preference.
Of course, instincts exist because they ensured our ancestors’ survival, so if racism is an instinct, it can’t exactly be “irrational.” We might call a gosling who follows a scientist instead of its mother “irrational,” but this is a misunderstanding of the gosling’s motivation. Since “racist” is a term of moral judgment, people are prone to defending their actions/beliefs towards others on the grounds that it can’t possibly be immoral to believe something that is actually true.
The claim that people are “racist” against members of other races implies, in converse, that they exhibit no similar behaviors toward members of their own race. But even the most perfunctory overview of history reveals people acting in extremely “racist” ways toward members of their own race. During the Anglo-Boer wars, the English committed genocide against the Dutch South Africans (Afrikaners). During WWII, Germans allied with the Japanese and slaughtered their neighbors, Poles and Jews. (Ashkenazim are genetically Caucasian and half Italian.) If Hitler were really racist, he’d have teamed up with Stalin and Einstein–his fellow whites–and dropped atomic bombs on Hiroshima. (And for their part, the Japanese would have allied with the Chinese against the Germans.)
The murder victim, a West African chimpanzee called Foudouko, had been beaten with rocks and sticks, stomped on and then cannibalised by his own community. …
“When you reverse that and have almost two males per every female — that really intensifies the competition for reproduction. That seems to be a key factor here,” says Wilson.
Jill Pruetz at Iowa State University, who has been studying this group of chimpanzees in south-eastern Senegal since 2001, agrees. She suggests that human influence may have caused this skewed gender ratio that is likely to have been behind this attack. In Senegal, female chimpanzees are poached to provide infants for the pet trade. …
Early one morning, Pruetz and her team heard loud screams and hoots from the chimps’ nearby sleep nest. At dawn, they found Foudouko dead, bleeding profusely from a bite to his right foot. He also had a large gash in his back and a ripped anus. Later he was found to have cracked ribs. Pruetz says Foudouko probably died of internal injuries or bled out from his foot wound.
Foudouko also had wounds on his fingers. These were likely to have been caused by chimps clamping them in their teeth to stretch his arms out and hold him down during the attack, says Pruetz.
After his death, the gang continued to abuse Foudouko’s body, throwing rocks and poking it with sticks, breaking its limbs, biting it and eventually eating some of the flesh.
“It was striking. The female that cannibalised the body the most, she’s the mother of the top two high-ranking males. Her sons were the only ones that really didn’t attack the body aggressively,” Pruetz says …
Historically, the vast majority of wars and genocides were waged by one group of people against their neighbors–people they were likely to be closely related to in the grand scheme of things–not against distant peoples they’d never met. If you’re a chimp, the chimp most likely to steal your banana is the one standing right in front of you, not some strange chimp you’ve never met before who lives in another forest.
Indeed, in Jane Goodall’s account of the Gombe Chimpanzee War, the combatants were not members of two unrelated communities that had recently encountered each other, but members of a single community that had split in two. Chimps who had formerly lived peacefully together, groomed each other, shared bananas, etc., now bashed each other’s brains out and cannibalized their young. Poor Jane was traumatized.
I think there is an instinct to form in-groups and out-groups. People often have multiple defined in-groups (“I am a progressive, a Christian, a baker, and a Swede”), but one of these identities generally trumps the others in importance. Ethnicity and gender are major groups most people seem to have, but I don’t see a lot of evidence suggesting that the grouping of “race” is uniquely special, globally, in people’s ideas of in- and out-.
For example, as I am writing today, people are concerned that Donald Trump is enacting racist policies toward Muslims, even though “Muslim” is not a race and most of the countries targeted by Trump’s travel/immigration ban are filled with fellow Caucasians, not Sub-Saharan Africans or Asians.
Race is a largely American obsession, because our nation (like the other North and South American nations) has always had whites, blacks, and Asians (Native Americans). But many countries don’t have this arrangement. Certainly Ireland didn’t have an historical black community, nor Japan a white one. Irish identity was formed in contrast to English identity; Japanese in contrast to Chinese and Korean.
Only in the context where different races live in close proximity to each other does it seem that people develop strong racial identities; otherwise people don’t think much about race.
Napoleon Chagnon, a white man, has spent years living among the Yanomamo, one of the world’s most murderous tribes, folks who go and slaughter their neighbors and neighbors’ children all the time, and they still haven’t murdered him.
Why do people insist on claiming that Trump’s “Muslim ban” is racist when Muslims aren’t a race? Because Islam is an identity group that appears to function similarly to race, even though Muslims come in white, black, and Asian.
If you’ve read any of the comments on my old post about Turkic DNA, Turkey: Not very Turkic, you’ll have noted that Turks are quite passionate about their Turkic identity, even though “Turkic” clearly doesn’t correspond to any particular ethnic group. (It’s even more mixed up than Jewish, and that’s a pretty mixed up one after thousands of years of inter-breeding with non-Jews.)
Group identities are fluid. When threatened, groups merge. When resources are abundant and times are good, groups split.
What about evidence that infants identify–stare longer at–faces of people of different races than their parents? This may be true, but all it really tells us is that babies are attuned to novelty. It certainly doesn’t tell us that babies are racist just because they find people interesting who look different from the people they’re used to.
What happens when people encounter others of a different race for the first time?
We have many accounts of “first contacts” between different races during the Age of Exploration. For example, when escaped English convict William Buckley wandered into an uncontacted Aborigine tribe, they assumed he was a ghost, adopted him, taught him to survive, and protected him for 30 years. By contrast, the last guy who landed on North Sentinel Island and tried to chat with the natives there got a spear to the chest and a shallow grave for his efforts. (But I am not certain the North Sentinelese haven’t encountered outsiders at some point.)
But what about the lunchroom seating habits of the wild American teenager?
If people have an instinct to form in-groups and out-groups, then races (or religions?) may represent the furthest bounds of this, at least until we encounter aliens. All else held equal, perhaps we are most inclined to like the people most like ourselves, and least inclined to like the people least like ourselves–racism would thus be the strongest manifestation of this broader instinct. But what about people who have a great dislike for one race, but seem just fine with another, e.g., a white person who likes Asians but not blacks, or a black person who likes Asians but not whites? And can we say–per our definition above–that these preferences are irrational, or are they born of some lived experience of positive or negative interactions?
Again, we are only likely to have strong opinions about members of other races if we are in direct conflict or competition with them. Most of the time, people are in competition with their neighbors, not people on the other side of the world. I certainly don’t sit here thinking negative thoughts about Pygmies or Aborigines, even though we are very genetically distant from each other, and I doubt they spend their free time thinking negatively about me.
Just because flamingos prefer to flock with other flamingos doesn’t mean they dislike horses; for the most part, I think people are largely indifferent to folks outside their own lives.
The autist’s greatest strength–and weakness–is his deficiency in the neural mechanisms of mimicry. Without the necessary feedback loops, he fails to subconsciously adopt his peers’ words, actions, and beliefs, leaving him free to develop his own–caring little about how strange they seem to everyone else.
At his most unfortunate, the infant autist lacks even the instincts necessary to imitate the mouth-shapes and mouth-sounds of his parents, leaving him unable to develop speech. Some of these autists understand speech perfectly well, but simply cannot produce it.
At his most fortunate, the autist, immune to other people’s preconceived notions, revolutionizes some field of science or math–or both:
Here is buried Isaac Newton, Knight, who by a strength of mind almost divine, and mathematical principles peculiarly his own, explored the course and figures of the planets, the paths of comets, the tides of the sea, the dissimilarities in rays of light, and, what no other scholar has previously imagined, the properties of the colours thus produced. Diligent, sagacious and faithful, in his expositions of nature, antiquity and the holy Scriptures, he vindicated by his philosophy the majesty of God mighty and good, and expressed the simplicity of the Gospel in his manners. Mortals rejoice that there has existed such and so great an ornament of the human race! He was born on 25 December 1642, and died on 20 March 1726/7.—Translation from G.L. Smyth, The Monuments and Genii of St. Paul’s Cathedral, and of Westminster Abbey (1826), ii, 703–4.