This is Balto, the famous Siberian Husky sled dog who led his team on the final leg of the 1925 serum run to Nome, Alaska. The windchill of the whiteout blizzard when Balto set out was −70 °F. The team traveled all night, with almost no visibility, over the 600-foot Topkok Mountain, and reached Nome at 5:30 AM.
Balto is not the only dog who deserves credit–Togo took a longer and even more dangerous stretch of the run.
Now, don’t get me wrong. He’s a beautiful dog. But he’s a very different dog. I think he’s trying to turn into a German Shepherd-wolf hybrid. Balto practically looks like a corgi next to him.
Siberian huskies were bred by people who depended on them for their lives, and had to endure some of nature’s very harshest weather. We moderns, by contrast, like to keep our dogs inside our warm, comfortable houses to play with our kids or guard our stuff. Have modern huskies been bred for looks rather than sled-pulling?
On the other hand, winning times for the Iditarod have dropped from 20 days to just 8 since the race began in the 1970s, so clearly there are some very fast huskies out there.
The Negritos are a fascinating group of short-statured, dark-skinned, frizzy-haired peoples from southeast Asia–chiefly the Andaman Islands, Malaysia, Philippines, and Thailand. (Spelling note: “Negritoes” is also an acceptable plural, and some sources use the Spanish Negrillos.)
Because of their appearance, they have long been associated with African peoples, especially the Pygmies. “Pygmy” is formally defined as any group whose adult men average 4′11″ or less, and is almost always used specifically to refer to African Pygmies; the term “pygmoid” is sometimes used for groups whose men average 5′1″ or below, including the Negritos. (Some of the Bushmen tribes, Bolivians, Amazonians, the remote Taron, and a variety of others may also be pygmoid, by this definition.)
However, genetic testing has long indicated that they, along with other Melanesians and Australian Aborigines, are more closely related to other East Asian peoples than to any African group. In other words, they’re part of the greater Asian race, albeit a distant branch of it.
But how distant? And are the various Negrito groups closely related to each other, or do there just happen to be a variety of short groups of people in the area, perhaps due to convergent evolution triggered by insular dwarfism?
The study’s authors found that the Negrito groups they studied “are basal to other East and Southeast Asians” (basal: forming the bottom layer or base; in this case, it means they split off first) “and that they diverged from West Eurasians at least 38,000 years ago.” (West Eurasians: Caucasians, consisting of Europeans, Middle Easterners, North Africans, and people from India.) “We also found relatively high traces of Denisovan admixture in the Philippine Negritos, but not in the Malaysian and Andamanese groups.” (Denisovans are a group of extinct humans similar to Neanderthals, but we’ve yet to find many of their bones. Just as Neanderthal DNA shows up in non-Sub-Saharan-Africans, so Denisovan DNA shows up in Melanesians.)
Figure 1 (A) shows PC analysis of Andamanese, Malaysian, and Philippine Negritos, revealing three distinct clusters:
In the upper right-hand corner, the Aeta, Agta, Batak, and Mamanwa are Philippine Negritos. The Manobo are non-Negrito Filipinos.
In the lower right-hand corner, the Jehai, Kintak, and Batek are Malaysian Negritos.
And in the upper left, we have the extremely isolated Andamanese Onge and Jarawa Negritos.
(Phil-NN and Mly-NN I believe are Filipino and Malaysian Non-Negritos.)
You can find the same chart, but flipped upside down, with Papuan and Melanesian DNA in the supplemental materials. Of the three Negrito groups, the Papuans and Melanesians cluster closest to the Philippine Negritos, along the same axis as the Malaysians.
By excluding the Andamanese (and Kintak) Negritos, Figure 1 (B) allows a closer look at the structure of the Philippine Negritos.
The Agta, Aeta, and Batak form a horizontal “comet-like pattern,” which likely indicates admixture with non-Negrito Philippine groups like the Manobo. The Mamanwa, who hail from a different part of the Philippines, also show this comet-like pattern, but along a different axis–likely because they intermixed with the different Filipinos who lived in their area. As you can see, there’s a fair amount of overlap–several of the Manobo individuals clustered with the Mamanwa Negritos, and the Batak cluster near several non-Negrito groups (see supplemental chart S4 B)–suggesting high amounts of mixing between these groups.
ADMIXTURE analysis reveals a similar picture. The non-Negrito Filipino groups show up primarily as orange. The Aeta, Agta, and Batak form a clear genetic cluster with each other and a cline with the orange Filipinos, with the Aeta the least admixed and the Batak the most.
The white area on the chart isn’t a data error, but the unique signature of the geographically separated Mamanwa, who are highly mixed with the Manobo–and the Manobo, in turn, are mixed with them.
But this alone doesn’t tell us how ancient these populations are, nor whether they’re descended from one ancestral population. For this, the authors constructed several phylogenetic trees, based on all of the data at hand and assuming from 0 to 5 admixture events. The one on the left assumes 5 events, but for clarity only shows three of them. The Denisovan DNA is fascinating and well-documented elsewhere in Melanesian populations; that Malaysian and Philippine Negritos mixed with their neighbors is also known, supporting the choice of this tree as the most likely to be accurate.
Regardless of which you pick, all of the trees show very similar results, with the biggest difference being whether the Melanesians/Papuans split before or after the Andamanese/Malaysian Negritos.
In case you are unfamiliar with these trees, I’ll run down a quick explanation: This is a human family tree, with each split showing where one group of humans split off from the others and became an isolated group with its own unique genetic patterns. The orange and red lines mark places where formerly isolated groups met and interbred, producing children that are a mix of both. The first split in the tree, going back hundreds of thousands of years, is between all Homo sapiens (our species) and the Denisovans, a sister species related to the Neanderthals.
All humans outside of sub-Saharan Africans have some Neanderthal DNA because their ancestors met and interbred with Neanderthals on their way Out of Africa. Melanesians, Papuans, and some Negritos also have some Denisovan DNA, because their ancestors met and made children with members of this obscure human species, but Denisovan DNA is quite rare outside these groups.
Here is a map of Denisovan DNA levels the authors found, with 4% of Papuan DNA hailing from Denisovan ancestors, and the Aeta nearly as high. By contrast, the Andamanese Negritos appear to have zero Denisovan DNA. Either the Andamanese split off before the ancestors of the Philippine Negritos and Papuans met the Denisovans, or all Denisovan DNA has been purged from their bloodlines, perhaps because it just wasn’t helpful for surviving on their islands.
Back to the Tree: The second node is where the Biaka, a group of Pygmies from the Congo Rainforest in central Africa, split off from the rest of humanity. Pygmy lineages are among the most ancient on earth, potentially going back over 200,000 years, well before any Homo sapiens had left Africa.
The next group that splits off from the rest of humanity are the Yoruba, a single ethnic group chosen to stand in for the entirety of the Bantus. Bantus are the group that you most likely think of when you think of black Africans, because over the past three millennia they have expanded greatly and conquered most of sub-Saharan Africa.
Next we have the Out of Africa event and the split between Caucasians (here represented by the French) and the greater Asian clade, which includes Australian Aborigines, Melanesians, Polynesians, Chinese, Japanese, Siberians, Inuit, and Native Americans.
The first groups to split off from the greater Asian clade (aka race) were the Andamanese and Malaysian Negritos, followed by the Papuans/Melanesians. Australian Aborigines are closely related to Papuans, as Australia and Papua New Guinea were connected in a single continent (called Sahul) back during the last Ice Age. Most of Indonesia and parts of the Philippines were also connected into a single landmass, called Sunda. Sensibly, people reached Sunda before Sahul. (Perhaps at that time the Andaman Islands, to the northwest of Sumatra, were also connected, or at least closer to the mainland.)
Irrespective of the exact order in which Melanesians and individual Negrito groups split off, they all split well before all of the other Asian groups in the area.
This is supported by legends told by the Filipinos themselves:
Legends, such as those involving the Ten Bornean Datus and the Binirayan Festival, tell tales about how, at the beginning of the 12th century when Indonesia and Philippines were under the rule of Indianized native kingdoms, the ancestors of the Bisaya escaped from Borneo from the persecution of Rajah Makatunaw. Led by Datu Puti and Datu Sumakwel and sailing with boats called balangays, they landed near a river called Suaragan, on the southwest coast of Panay, (the place then known as Aninipay), and bartered the land from an Ati [Negrito] headman named Polpolan and his son Marikudo for the price of a necklace and one golden salakot. The hills were left to the Atis while the plains and rivers to the Malays. This meeting is commemorated through the Ati-atihan festival.
The study’s authors estimate that the Negritos split from Europeans (Caucasians) around 30,000–38,000 years ago, and that the Malaysian and Philippine Negritos split around 13,000–15,000 years ago. (This all seems a bit tentative, IMO, especially since we have physical evidence of people in the area going back much further than that, and the authors themselves admit in the discussion that their time estimate may be too short.)
The authors also note:
Both our NJ (fig. 3A) and UPGMA (supplementary fig. S10) trees show that after divergence from Europeans, the ancestral Asians subsequently split into Papuans, Negritos and East Asians, implying a one-wave colonization of Asia. … This is in contrast to the study based on whole genome sequences that suggested Australian Aboriginal/Papuan first split from European/East Asians 60 kya, and later Europeans and East Asians diverged 40 kya (Malaspinas et al. 2016). This implies a two-wave migration into Asia…
The matter is still up for debate and further study.
In conclusion: All of the Negrito groups are likely descended from a common ancestor, (rather than having evolved from separate groups that happened to develop similar body types due to exposure to similar environments,) and were among the very first inhabitants of their regions. Despite their short stature, they are more closely related to other Asian groups (like the Chinese) than to African Pygmies. Significant mixing with their neighbors, however, is quickly obscuring their ancient lineages.
I wonder if all ancient human groups were originally short, with height a recently evolved trait in some groups?
In closing, I’d like to thank Jinam et al for their hard work in writing this article and making it available to the public, their sponsors, and the unique Negrito peoples themselves for surviving so long.
Society itself is a thermodynamic system for entropy dissipation. Energy goes in–in the form of food and, recently, fuels like oil–and children and buildings come out.
Government is simply the entire power structure of a region–from the President to your dad, from bandits to your boss. But when people say, “government,” they typically mean the official one written down in laws that lives in white buildings in Washington, DC.
When the “government” makes laws that try to change the natural flow of energy or information through society, society responds by routing around the law, just as water flows around a boulder that falls in a stream.
The ban on trade with Britain and France in the early 1800s, for example, did not actually stop people from trading with Britain and France–trade just became re-routed through smuggling operations. It took a great deal of energy–in the form of navies–to suppress piracy and smuggling in the Gulf and Caribbean–chiefly by executing pirates and imprisoning smugglers.
When the government decided, in Griggs v. Duke Power, that companies couldn’t use IQ tests in hiring anymore (because IQ tests have a “disparate impact” on minorities, since black people tend to score worse, on average, than whites), companies didn’t start hiring more black folks. They just started using college degrees as a proxy for intelligence, contributing to the soul-crushing debt and degree inflation young people know and love today.
Similarly, when the government tried to stop companies from asking about applicants’ criminal histories–again, because the results were disproportionately bad for minorities–companies didn’t start hiring more blacks. Since not hiring criminals is important to companies, HR departments turned to the next best metric: race. These laws ironically led to fewer blacks being hired, not more.
Where the government has tried to protect the poor by passing tenant’s rights laws, we actually see the opposite: poorer tenants are harmed. By making it harder to evict tenants, the government makes landlords reluctant to take on high-risk (ie, poor) tenants.
The passage of various anti-discrimination and subsidized housing laws (as well as the repeal of various discriminatory laws throughout the mid-20th century) led to the growth of urban ghettos, which in turn triggered the crime wave of the 70s, 80s, and 90s.
Crime and urban decay have made inner cities–some of the most valuable real estate in the country–nigh unlivable, resulting in the “flight” of millions of residents and the collective loss of millions of dollars due to plummeting home values.
Work-arounds are not cheap. They are less efficient–and thus more expensive–than the previous, banned system.
Smuggled goods cost more than legally traded goods due to the personal risks smugglers must take. If companies can’t tell who is and isn’t a criminal, the cost of avoiding criminals becomes turning down good employees just because they happen to be black. If companies can’t directly test intelligence, the cost becomes a massive increase in the amount of money being spent on accreditation and devaluation of the signaling power of a degree.
We have dug up literally billions of dollars’ worth of concentrated sunlight in the form of fossil fuels to rebuild our nation’s infrastructure around the criminal blights in the centers of our cities, condemning workers to hour-long commutes and to paying inflated prices for homes in neighborhoods with “good schools.”
Note: this is not an argument against laws. Some laws increase efficiency. Some laws make life better.
This is a reminder that everything is subject to thermodynamics. Nothing is free.
(The bread of slavery, they say, is far sweeter than the bread of freedom.)
Children were born, safe from wolves, hunger, or cold
and you grew used to man.
And it seemed you outnumbered the stars
Perhaps your sons disappeared
But was it worse than wolves?
You could almost forget you were once wild
Could you return to the mountains, even if you wanted to?
And as they lead you away
Did I ever have a choice?
To explain: The process of domestication is fascinating. Some animals, like wolves, began associating with humans because they could pick up our scraps. Others, like cats, began living in our cities because they liked eating the vermin we attracted. (You might say the mice, too, are domesticated.) These relationships are obviously mutually beneficial (aside from the mice.)
The animals we eat, though, have a different–more existential–story.
Humans increased the number of wild goats and sheep available for them to eat by eliminating competing predators, like wolves and lions. We brought them food in the winter, built them shelters to keep them warm, and led them to the best pastures. As a result, their numbers increased.
But, of course, we eat them.
From the goat’s perspective, is it worth it?
There’s a wonderful metaphor in the Bible, enacted every Passover: matzoh.
If you’ve never had it, matzoh tastes like saltines, only worse. It’s the bread of freedom, hastily thrown on the fire and carried away.
The bread of slavery tastes delicious. The bread of freedom tastes awful.
1And they took their journey from Elim, and all the congregation of the children of Israel came unto the wilderness of Sin, which is between Elim and Sinai, on the fifteenth day of the second month after their departing out of the land of Egypt. 2And the whole congregation of the children of Israel murmured against Moses and Aaron in the wilderness: 3And the children of Israel said unto them, Would to God we had died by the hand of the LORD in the land of Egypt, when we sat by the flesh pots, and when we did eat bread to the full… Exodus 16
Even if the goats didn’t want to be domesticated, hated it and fought against it, did they have any choice? If the domesticated goats have more surviving children than wild ones, then goats will become domesticated. It’s a simple matter of numbers:
A species may live in relative equilibrium with its environment, hardly changing from generation to generation, for millions of years. Turtles, for example, have barely changed since the Cretaceous, when dinosaurs still roamed the Earth.
But if the environment changes–critically, if selective pressures change–then the species will change, too. This was most famously demonstrated with English peppered moths, which changed color from white-and-black speckled to pure black when pollution darkened the trunks of the trees they lived on. To survive, these moths need to avoid being eaten by birds, so any moth that stands out against the tree trunks tends to get turned into an avian snack. Against light-colored trees, dark-colored moths stood out and were eaten. Against dark-colored trees, light-colored moths stood out.
This change did not require millions of years. Dark-colored moths were virtually unknown in 1810, but by 1895, 98% of the moths were black.
The time it takes for evolution to occur depends on two things: (A) the frequency of the trait in the population, and (B) how strongly you are selecting for (or against) it.
Let’s break this down a little bit. Within a species, there exists a great deal of genetic variation. Some of this variation happens because two parents with different genes get together and produce offspring with a combination of their genes. Some of this variation happens because of random errors–mutations–that occur during copying of the genetic code. Much of the “natural variation” we see today started as some kind of error that proved to be useful, or at least not harmful. For example, all humans originally had dark skin similar to modern Africans’, but random mutations in some of the folks who no longer lived in Africa gave them lighter skin, eventually producing “white” and “Asian” skin tones.
(These random mutations also happen in Africa, but there they are harmful and so don’t stick around.)
Natural selection can only act on the traits that are actually present in the population. If we tried to select for “ability to shoot x-ray lasers from our eyes,” we wouldn’t get very far, because no one actually has that mutation. By contrast, albinism is rare, but it definitely exists, and if for some reason we wanted to select for it, we certainly could. (The incidence of albinism among the Hopi Indians is high enough–1 in 200 Hopis vs. 1 in 20,000 Europeans generally and 1 in 30,000 Southern Europeans–for scientists to discuss whether the Hopi have been actively selecting for albinism. This still isn’t a lot of albinism, but since the American Southwest is not a good environment for pale skin, it’s something.)
You will have a much easier time selecting for traits that crop up more frequently in your population than traits that crop up rarely (or never).
Second, we have intensity–and variety–of selective pressure. What % of your population is getting removed by natural selection each year? If 50% of your moths get eaten by birds because they’re too light, you’ll get a much faster change than if only 10% of moths get eaten.
Selection doesn’t have to involve getting eaten, though. Perhaps some of your moths are moth Lotharios, seducing all of the moth ladies with their fuzzy antennae. Over time, the moth population will develop fuzzier antennae as these handsome males out-reproduce their less hirsute cousins.
No matter what kind of selection you have, nor what part of your curve it’s working on, all that ultimately matters is how many offspring each individual has. If white moths have more children than black moths, then you end up with more white moths. If black moths have more babies, then you get more black moths.
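The interplay of frequency and selection strength can be sketched with a toy simulation. All of the numbers here (starting frequency, survival rates) are illustrative assumptions, not the historical moth data:

```python
# Toy model of selection on moth color: each generation, light moths on
# darkened trees survive at a lower rate than dark moths, so the dark
# morph's share of the population grows. Numbers are made up.

def generations_to_fixation(p_dark=0.01, surv_dark=0.9, surv_light=0.5, target=0.98):
    """Generations until the dark-moth frequency reaches `target`."""
    gens = 0
    while p_dark < target:
        # Weight each morph by its survival rate, then renormalize.
        mean_survival = p_dark * surv_dark + (1 - p_dark) * surv_light
        p_dark = p_dark * surv_dark / mean_survival
        gens += 1
    return gens

print(generations_to_fixation())  # dark morph passes 98% in 15 generations
```

With selection this strong, a trait present in only 1% of the population comes to dominate it within a couple of decades of moth generations–the same order of magnitude as the 1810-to-1895 shift described above.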
So what happens when you completely remove selective pressures from a population?
Back in 1968, ethologist John B. Calhoun set up an experiment popularly called “Mouse Utopia.” Four pairs of mice were given a large, comfortable habitat with no predators and plenty of food and water.
Predictably, the mouse population increased rapidly–once the mice were established in their new homes, their population doubled every 55 days. But after 211 days of explosive growth, reproduction began–mysteriously–to slow. For the next 245 days, the mouse population doubled only once every 145 days.
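As a sanity check on those figures, here is the exponential growth they imply. (The starting population of 8 is my assumption from the four founding pairs; the doubling times are the ones quoted above.)

```python
# Population after `days` given a fixed doubling time.
def pop(start, days, doubling_days):
    return start * 2 ** (days / doubling_days)

# Phase 1: doubling every 55 days for the first 211 days.
after_phase1 = pop(8, 211, 55)
print(round(after_phase1))   # ~114 mice

# Phase 2: doubling every 145 days for the next 245 days.
after_phase2 = pop(after_phase1, 245, 145)
print(round(after_phase2))   # ~369 mice
```

Had the first growth rate continued instead, the colony would have been in the thousands by day 456–the slowdown set in long before the habitat was anywhere near full.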
The birth rate continued to decline. As births and death reached parity, the mouse population stopped growing. Finally the last breeding female died, and the whole colony went extinct.
As I’ve mentioned before, Israel is (AFAIK) the only developed country in the world with a TFR above replacement.
It has long been known that overcrowding leads to population stress and reduced reproduction, but overcrowding can only explain why the mouse population began to shrink–not why it died out. Surely by the time there were only a few breeding pairs left, things had become comfortable enough for the remaining mice to resume reproducing. Why did the population not stabilize at some comfortable level?
Professor Bruce Charlton suggests an alternative explanation: the removal of selective pressures on the mouse population resulted in increasing mutational load, until the entire population became too mutated to reproduce.
Unfortunately, randomly changing part of your genetic code is more likely to give you no skin than skintanium armor.
But it’s only the worst genetic problems that never see the light of day. Plenty of mutations merely reduce fitness without actually killing you. Down Syndrome, famously, is caused by an extra copy of chromosome 21.
While a few traits–such as sex or eye color–can be simply modeled as influenced by only one or two genes, many traits–such as height or IQ–appear to be influenced by hundreds or thousands of genes:
Differences in human height are 60–80% heritable, according to several twin studies, and height has been considered polygenic since the Mendelian-biometrician debate a hundred years ago. A genome-wide association (GWA) study of more than 180,000 individuals has identified hundreds of genetic variants in at least 180 loci associated with adult human height. The number of individuals has since been expanded to 253,288 individuals and the number of genetic variants identified is 697 in 423 genetic loci.
Obviously most of these genes each play only a small role in determining overall height (and this is of course holding environmental factors constant). There are a few extreme conditions–gigantism and dwarfism–that are caused by single mutations, but the vast majority of height variation is caused by which particular mix of those 700 or so variants you happen to have.
The general figure for the heritability of IQ, according to an authoritative American Psychological Association report, is 0.45 for children, and rises to around 0.75 for late teens and adults. In simpler terms, IQ goes from being weakly correlated with genetics, for children, to being strongly correlated with genetics for late teens and adults. … Recent studies suggest that family and parenting characteristics are not significant contributors to variation in IQ scores; however, poor prenatal environment, malnutrition and disease can have deleterious effects.…
Despite intelligence having substantial heritability (0.54) and a confirmed polygenic nature, initial genetic studies were mostly underpowered. Here we report a meta-analysis for intelligence of 78,308 individuals. We identify 336 associated SNPs (METAL P < 5 × 10−8) in 18 genomic loci, of which 15 are new. Around half of the SNPs are located inside a gene, implicating 22 genes, of which 11 are new findings. Gene-based analyses identified an additional 30 genes (MAGMA P < 2.73 × 10−6), of which all but one had not been implicated previously. We show that the identified genes are predominantly expressed in brain tissue, and pathway analysis indicates the involvement of genes regulating cell development (MAGMA competitive P = 3.5 × 10−6). Despite the well-known difference in twin-based heritability for intelligence in childhood (0.45) and adulthood (0.80), we show substantial genetic correlation (rg = 0.89, LD score regression P = 5.4 × 10−29). These findings provide new insight into the genetic architecture of intelligence.
The greater the number of genes influencing a trait, the harder they are to identify without extremely large studies, because any small group of people might not even have the same set of relevant genes.
High IQ correlates positively with a number of life outcomes, like health and longevity, while low IQ correlates with negative outcomes like disease, mental illness, and early death. Obviously this is in part because dumb people are more likely to make dumb choices which lead to death or disease, but IQ also correlates with choice-free matters like height and your ability to quickly press a button. Our brains are not some mysterious entities floating in a void, but physical parts of our bodies, and anything that affects our overall health and physical functioning is likely to also have an effect on our brains.
The study focused, for the first time, on rare, functional SNPs – rare because previous research had only considered common SNPs and functional because these are SNPs that are likely to cause differences in the creation of proteins.
The researchers did not find any individual protein-altering SNPs that met strict criteria for differences between the high-intelligence group and the control group. However, for SNPs that showed some difference between the groups, the rare allele was less frequently observed in the high intelligence group. This observation is consistent with research indicating that rare functional alleles are more often detrimental than beneficial to intelligence.
Greg Cochran has some interesting Thoughts on Genetic Load. (Currently, the most interesting candidate genes for potentially increasing IQ also have terrible side effects, like autism, Tay Sachs and Torsion Dystonia. The idea is that–perhaps–if you have only a few genes related to the condition, you get an IQ boost, but if you have too many, you get screwed.) Of course, even conventional high-IQ has a cost: increased maternal mortality (larger heads).
the difference between the fitness of an average genotype in a population and the fitness of some reference genotype, which may be either the best present in a population, or may be the theoretically optimal genotype. … Deleterious mutation load is the main contributing factor to genetic load overall. Most mutations are deleterious, and occur at a high rate.
There’s math, if you want it.
Normally, genetic mutations are removed from the population at a rate determined by how bad they are. Really bad mutations kill you instantly, and so are never born. Slightly less bad mutations might survive, but never reproduce. Mutations that are only a little bit deleterious might have no obvious effect, but result in having slightly fewer children than your neighbors. Over many generations, this mutation will eventually disappear.
(Some mutations are more complicated–sickle cell, for example, is protective against malaria if you have only one copy of the mutation, but gives you sickle cell anemia if you have two.)
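Some of that math, in miniature: under a simple haploid model (my sketch, with made-up mutation rate and selection coefficient), a deleterious allele that is constantly re-created by mutation settles at the classic mutation–selection balance frequency of roughly μ/s:

```python
# Mutation–selection balance sketch (haploid model, illustrative numbers):
# a deleterious allele arises by mutation at rate mu per generation, and
# carriers leave (1 - s) times as many offspring as non-carriers.
# Classic prediction: equilibrium frequency q ≈ mu / s.

def equilibrium_freq(mu=1e-5, s=0.01, generations=10000):
    q = 0.0
    for _ in range(generations):
        # Selection removes carriers in proportion to s...
        q = q * (1 - s) / (1 - s * q)
        # ...while mutation keeps creating new copies.
        q = q + mu * (1 - q)
    return q

q = equilibrium_freq()
print(round(q, 4))  # ≈ mu / s = 0.001
```

The worse the mutation (larger s), the lower its equilibrium frequency and the faster any excess copies are purged–which is exactly why "really bad" mutations stay vanishingly rare while mildly deleterious ones can linger for many generations.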
Throughout history, infant mortality was our single biggest killer. For example, here is some data from Jakubany, a town in the Carpathian Mountains:
We can see that, prior to the 1900s, the town’s infant mortality rate stayed consistently above 20%, and often peaked near 80%.
When I first ran a calculation of the infant mortality rate, I could not believe certain of the intermediate results. I recompiled all of the data and recalculated … with the same astounding result – 50.4% of the children born in Jakubany between the years 1772 and 1890 would die before reaching ten years of age! …one out of every two! Further, over the same 118-year period, of the 13306 children who were born, 2958 died (~22 %) before reaching the age of one.
Historical infant mortality rates can be difficult to calculate in part because they were so high that people didn’t always bother to record infant deaths. And since infants are small and their bones delicate, their burials are not as easy to find as adults’. Nevertheless, Wikipedia estimates that Paleolithic man had an average life expectancy of 33 years:
Based on the data from recent hunter-gatherer populations, it is estimated that at 15, life expectancy was an additional 39 years (total 54), with a 0.60 probability of reaching 15.
In other words, a 40% chance of dying in childhood. (Not exactly the same as infant mortality, but close.)
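A back-of-envelope decomposition of those numbers (a two-group simplification of my own, not something from the source) shows just how young the non-survivors must have died:

```python
# Split life expectancy at birth into survivors and non-survivors:
#   e0 = p * (15 + e15) + (1 - p) * d
# where p is the chance of reaching 15, e15 is remaining life expectancy
# at 15, and d is the mean age at death of those who die before 15.
# Solving for d with the figures quoted above:

p, e15, e0 = 0.60, 39, 33
d = (e0 - p * (15 + e15)) / (1 - p)
print(round(d, 1))  # 1.5
```

In other words, the average child who died before 15 died at around age one and a half–consistent with infant mortality, rather than deaths of older children, being the main killer.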
Wikipedia gives similarly dismal stats for life expectancy in the Neolithic (20-33), Bronze and Iron ages (26), Classical Greece (28 or 25), Classical Rome (20-30), Pre-Columbian Southwest US (25-30), Medieval Islamic Caliphate (35), Late Medieval English Peerage (30), early modern England (33-40), and the whole world in 1900 (31).
Over at ThoughtCo: Surviving Infancy in the Middle Ages, the author reports estimates for between 30 and 50% infant mortality rates. I recall a study on Anasazi nutrition which I sadly can’t locate right now, which found 100% malnutrition rates among adults (based on enamel hypoplasias,) and 50% infant mortality.
As Priceonomics notes, the main driver of increasing global life expectancy–48 years in 1950 and 71.5 years in 2014 (according to Wikipedia)–has been a massive decrease in infant mortality. The average life expectancy of an American newborn back in 1900 was only 47 and a half years, whereas a 60 year old could expect to live to be 75. In 1998, the average infant could expect to live to about 75, and the average 60 year old could expect to live to about 80.
Michael A. Woodley suggests that what was going on [in the Mouse experiment] was much more likely to be mutation accumulation; with deleterious (but non-fatal) genes incrementally accumulating with each generation and generating a wide range of increasingly maladaptive behavioural pathologies; this process rapidly overwhelming and destroying the population before any beneficial mutations could emerge to ‘save’ the colony from extinction. …
The reason why mouse utopia might produce so rapid and extreme a mutation accumulation is that wild mice naturally suffer very high mortality rates from predation. …
Thus mutation selection balance is in operation among wild mice, with very high mortality rates continually weeding-out the high rate of spontaneously-occurring new mutations (especially among males) – with typically only a small and relatively mutation-free proportion of the (large numbers of) offspring surviving to reproduce; and a minority of the most active and healthy (mutation free) males siring the bulk of each generation.
However, in Mouse Utopia, there is no predation and all the other causes of mortality (e.g. starvation, violence from other mice) are reduced to a minimum – so the frequent mutations just accumulate, generation upon generation – randomly producing all sorts of pathological (maladaptive) behaviours.
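Woodley’s mechanism–mutation-selection balance holding load down in the wild, and load accumulating unchecked once mortality is removed–can be sketched as a toy simulation. All the parameters here (mutation rate per generation, selection strength, population size) are made up for illustration, not calibrated to the actual experiment:

```python
import math
import random

def poisson(lam):
    """Sample from a Poisson distribution (Knuth's algorithm, stdlib only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def simulate(generations=100, pop_size=500, mu=0.2, s=0.1, selection=True):
    """Toy mutation-accumulation model; returns mean load per individual.

    Each individual carries a count of deleterious mutations ("load").
    A child inherits its parent's load plus new mutations; under
    selection, the chance of surviving to reproduce is (1 - s) ** load.
    """
    pop = [0] * pop_size  # start everyone mutation-free
    for _ in range(generations):
        next_gen = []
        while len(next_gen) < pop_size:
            child = random.choice(pop) + poisson(mu)
            # "Predation" weeds out loaded individuals; utopia skips this.
            if not selection or random.random() < (1 - s) ** child:
                next_gen.append(child)
        pop = next_gen
    return sum(pop) / pop_size

random.seed(0)
print("mean load, utopia:", simulate(selection=False))  # grows ~ mu per generation
print("mean load, wild:  ", simulate(selection=True))   # settles near mu / s
```

With these made-up numbers, the no-selection population’s mean load climbs steadily (roughly mu new mutations per generation, with no weeding-out), while the selected population settles near the classic mutation-selection balance of about mu/s.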
Today, almost everyone in the developed world has plenty of food, a comfortable home, and doesn’t have to worry about dying of bubonic plague. We live in humantopia, where the biggest factor influencing how many kids you have is how many you want to have.
Back in 1930, infant mortality rates were highest among the children of unskilled manual laborers, and lowest among the children of professionals (IIRC, this is British data). Today, infant mortality is almost non-existent, but voluntary childlessness has inverted this phenomenon:
Yes, the percentage of childless women appears to have declined since 1994, but the overall pattern of who is having children still holds. Further, while only 8% of women with postgraduate degrees have 4 or more children, 26% of those who never graduated from high school have 4+ kids. Meanwhile, the age of first-time moms has continued to climb.
Take a moment to consider the high-infant mortality situation: an average couple has a dozen children. Four of them, by random good luck, inherit a good combination of the couple’s genes and turn out healthy and smart. Four, by random bad luck, get a less lucky combination of genes and turn out not particularly healthy or smart. And four, by very bad luck, get some unpleasant mutations that render them quite unhealthy and rather dull.
Infant mortality claims half their children, taking the least healthy. They are left with four bright children and two moderately intelligent children. The four bright children succeed at life, marry well, and end up with several healthy, surviving children of their own, while the moderately intelligent do okay and end up with a couple of children.
On average, society’s overall health and IQ should hold steady or even increase over time, depending on how strong the selective pressures actually are.
Or consider a consanguineous couple with a high risk of genetic birth defects: perhaps a full 80% of their children die, but 20% turn out healthy and survive.
Today, by contrast, your average couple has two children. One of them is lucky, healthy, and smart. The other is unlucky, unhealthy, and dumb. Both survive. The lucky kid goes to college, majors in underwater intersectionist basket-weaving, and has one kid at age 40. That kid has Down Syndrome and never reproduces. The unlucky kid can’t keep a job, has chronic health problems, and has three children by three different partners.
Your consanguineous couple migrates from war-torn Somalia to Minnesota. They still have 12 kids, but three of them are autistic with IQs below the official retardation threshold. “We never had this back in Somalia,” they cry. “We don’t even have a word for it.”
People normally think of dysgenics as merely “the dumb outbreed the smart,” but genetic load applies to everyone–men and women, smart and dull, black and white, young and especially old–because random copying errors creep into everyone’s DNA as it is replicated.
I could offer a list of signs of increasing genetic load, but there’s no way to avoid cherry-picking trends I already know are happening, like falling sperm counts or rising (diagnosed) autism rates, so I’ll skip that. You may substitute your own list of “obvious signs society is falling apart at the genes” if you so desire.
Nevertheless, the transition from 30% (or greater) infant mortality to almost 0% is amazing, both on a technical level and because it heralds an unprecedented era in human evolution. The selective pressures on today’s people are massively different from those our ancestors faced, simply because our ancestors’ biggest filter was infant mortality. Unless infant mortality acted completely at random–taking the genetically loaded and unloaded alike–or on factors completely irrelevant to load, the elimination of infant mortality must continuously increase the genetic load in the human population. Over time, if that load is not selected out–say, through more people being too unhealthy to reproduce–then we will end up with an increasing population of physically sick, maladjusted, mentally ill, and low-IQ people.
If all of the above is correct, then I see only 4 ways out:
Do nothing: Genetic load increases until the population is non-functional and collapses, resulting in a return of Malthusian conditions, invasion by stronger neighbors, or extinction.
Sterilization or other weeding out of high-load people, coupled with higher fertility by low-load people
Abortion of high-load fetuses
Genetic engineering to repair deleterious mutations
#1 sounds unpleasant, and #2 would result in masses of unhappy people. We don’t have the technology for #4, yet. I don’t think the technology is quite there for #3, either, but it’s much closer–we can certainly test for many of the deleterious mutations that we do know of.
Welcome back to our discussion of recent exciting advances in our knowledge of human evolution:
Ancient hominins in the US?
Humans evolved in Europe?
In two days, first H Sap was pushed back to 260,000 years,
then to 300,000 years!
Bell beaker paper
As we’ve been discussing for the past couple of weeks, the exact dividing line between “human” and “non-human” isn’t always hard and fast. The very first Homo species, such as Homo habilis, undoubtedly had more in common with its immediate Australopithecine ancestors than with today’s modern humans, 3 million years later, but that doesn’t mean these dividing lines are meaningless. Homo sapiens and Homo neanderthalensis, while considered different species, interbred and produced fertile offspring (most non-Africans have 3-5% Neanderthal DNA as a result of these pairings); by contrast, humans and chimps cannot produce fertile offspring, because humans and chimps have a different number of chromosomes. The genetic distance between the two groups is just too far.
The grouping of ancient individuals into Homo or not-Homo, Erectus or Habilis, Sapiens or not, is partly based on physical morphology–what they looked like, how they moved–and partly based on culture, such as the ability to make tools or control fire. While australopithecines made some stone tools (and chimps can make tools out of twigs to retrieve tasty termites from nests,) Homo habilis (“handy man”) was the first to master the art and produce large numbers of more sophisticated tools for different purposes, such as this Oldowan chopper.
But we also group species based on moral or political beliefs–scientists generally believe it would be immoral to say that different modern human groups belong to different species, and so the date when Homo ergaster transforms into Homo sapiens is dependent on the date when the most divergent human groups alive today split apart–no one wants to come up with a finding that will get trumpeted in media as “Scientists Prove Pygmies aren’t Human!” (Pygmies already have enough problems, what with their immediate neighbors actually thinking they aren’t human and using their organs for magic rituals.)
(Of course, they would still be human even if they were part of an ancient lineage.)
But if an ecologically-minded space alien arrived on earth back in 1490 and was charged with documenting terrestrial species, it might easily decide–based on morphology, culture, and physical distribution–that there were several different Homo “species” which all deserve to be preserved.
But we are not space aliens, and we have the concerns of our own day.
So when a paper was published last year on archaic admixture in Pygmies and the Pygmy/Bushmen/everyone else split, West Hunter noted the authors used a fast–but discredited–estimate of mutation rate to avoid the claim that Pygmies split off 300,000 years ago, 100,000 years before the emergence of Homo sapiens:
There are a couple of recent papers on introgression from some quite divergent archaic population into Pygmies ( this also looks to be the case with Bushmen). Among other things, one of those papers discussed the time of the split between African farmers (Bantu) and Pygmies, as determined from whole-genome analysis and the mutation rate. They preferred to use the once-fashionable rate of 2.5 × 10⁻⁸ per-site per-generation (based on nothing), instead of the new pedigree-based estimate of about 1.2 × 10⁻⁸ (based on sequencing parents and child: new stuff in the kid is mutation). The old fast rate indicates that the split between Neanderthals and modern humans is much more recent than the age of early Neanderthal-looking skeletons, while the new slow rate fits the fossil record – so what’s to like about the fast rate? Thing is, using the slow rate, the split time between Pygmies and Bantu is ~300k years ago – long before any archaeological sign of behavioral modernity (however you define it) and well before the first known fossils of AMH (although that shouldn’t bother anyone, considering the raggedness of the fossil record).
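The arithmetic behind the mutation-rate dispute in the quote above is worth making explicit: a divergence-based split date scales inversely with the assumed mutation rate, so switching from the fast rate to the slow pedigree rate roughly doubles every estimate. A back-of-the-envelope sketch (the per-site divergence figure and the 29-year generation time are illustrative values chosen so the slow rate reproduces the ~300k-year estimate; they are not numbers from the papers):

```python
def split_time_years(divergence_per_site, mu_per_gen, generation_years=29.0):
    """Split time from pairwise divergence: T = d / (2 * mu) generations.

    divergence_per_site: fraction of sites that differ between the two
    lineages, accumulated since the split (both lineages mutate, hence
    the factor of 2).
    """
    generations = divergence_per_site / (2.0 * mu_per_gen)
    return generations * generation_years

# Hypothetical divergence chosen so the slow rate gives ~300k years;
# illustrative only, not a value taken from the actual papers.
d = 2.48e-4
print("slow (pedigree) rate:", round(split_time_years(d, 1.2e-8)))  # ~300,000 years
print("fast (old) rate:     ", round(split_time_years(d, 2.5e-8)))  # ~144,000 years
```

Since the mutation rate sits in the denominator, the same genetic data looks (2.5/1.2) ≈ 2× younger under the fast rate–which is exactly why the choice of rate decides whether the Pygmy/Bantu split lands before or after the oldest known Homo sapiens fossils.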
Southern Africa is consistently placed as one of the potential regions for the evolution of Homo sapiens. To examine the region’s human prehistory prior to the arrival of migrants from East and West Africa or Eurasia in the last 1,700 years, we generated and analyzed genome sequence data from seven ancient individuals from KwaZulu-Natal, South Africa. Three Stone Age hunter-gatherers date to ~2,000 years ago, and we show that they were related to current-day southern San groups such as the Karretjie People. Four Iron Age farmers (300-500 years old) have genetic signatures similar to present day Bantu-speakers. The genome sequence (13x coverage) of a juvenile boy from Ballito Bay, who lived ~2,000 years ago, demonstrates that southern African Stone Age hunter-gatherers were not impacted by recent admixture; however, we estimate that all modern-day Khoekhoe and San groups have been influenced by 9-22% genetic admixture from East African/Eurasian pastoralist groups arriving >1,000 years ago, including the Ju|’hoansi San, previously thought to have very low levels of admixture. Using traditional and new approaches, we estimate the population divergence time between the Ballito Bay boy and other groups to beyond 260,000 years ago.
260,000 years! Looks like West Hunter was correct, and we should be looking at the earlier Pygmy divergence date, too.
Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure … In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens or evolved gradually over the last 400 thousand years. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating), this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established.
Hublin–one of the study’s coauthors–notes that between 330,000 and 300,000 years ago, the Sahara was green and animals could range freely across it.
While the Moroccan fossils do look like modern H sapiens, they also still look a lot like pre-sapiens, and the matter is still up for debate. Paleoanthropologist Chris Stringer suggests that we should consider all of our ancestors after the Neanderthals split off to be Homo sapiens, which would make our species 500,000 years old. Others would undoubtedly prefer to use a more recent date, arguing that the physical and cultural differences between 500,000 year old humans and today’s people are too large to consider them one species.
According to the Atlantic:
[The Jebel Irhoud] people had very similar faces to today’s humans, albeit with slightly more prominent brows. But the backs of their heads were very different. Our skulls are rounded globes, but theirs were lower on the top and longer at the back. If you saw them face on, they could pass for a modern human. But if they turned around, you’d be looking at a skull that’s closer to extinct hominids like Homo erectus. “Today, you wouldn’t be able to find anyone with a braincase that shape,” says Gunz.
Their brains, though already as large as ours, must also have been shaped differently. It seems that the size of the human brain had already been finalized 300,000 years ago, but its structure—and perhaps its abilities—were fine-tuned over the subsequent millennia of evolution.
No matter how we split it, these are exciting days in the field!
This all occasioned some very annoying conversations along the lines of “White skin tone couldn’t possibly have evolved within the past 20,000 years because humans evolved in Europe! Don’t you know anything about science?”
Ohkay. Let’s step back a moment and take a look at what Graecopithecus is and what it isn’t.
This is Graecopithecus:
I think there is a second jawbone, but that’s basically it–and that’s not six teeth, that’s three teeth, shown from two different perspectives. There’s no skull, no shoulder blades, no pelvis, no legs.
By contrast, here are Lucy, the famous Australopithecus from Ethiopia, and a sample of the over 1,500 bones and pieces of Homo naledi recently recovered from a cave in South Africa.
Now, given what little scientists had to work with, the fact that they managed to figure out anything about Graecopithecus is quite impressive. The study, reasonably titled “Potential hominin affinities of Graecopithecus from the Late Miocene of Europe,” by Jochen Fuss, Nikolai Spassov, David R. Begun, and Madelaine Böhme, used μCT and 3D reconstructions of the jawbones and teeth to compare Graecopithecus’s teeth to those of other apes. They decided the teeth were different enough to distinguish Graecopithecus from the nearby but older Ouranopithecus, while looking more like hominin teeth:
G. freybergi uniquely shares p4 partial root fusion and a possible canine root reduction with this tribe and therefore, provides intriguing evidence of what could be the oldest known hominin.
My hat’s off to the authors, but not to all of the reporters who dressed up “teeth look kind of like hominin teeth” as “Humans evolved in Europe!”
First of all, you cannot make that kind of jump based off of two jawbones and a handful of teeth. Many of the hominin species we have recovered–such as Homo naledi and Homo floresiensis, as you know if you already read the previous post–possessed a mosaic of “ape-like” and “human-like” traits, i.e.:
The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.
If we only had six Homo naledi bones instead of 1,500 of them, we might be looking only at the part that looks like an Australopithecus instead of the parts that look like H. erectus or totally novel. You simply cannot make that kind of claim off a couple of jawbones. You’re far too likely to be wrong, and then not only will you end up with egg on your face, but you’ll only be giving more fuel to folks who like to proclaim that “Nebraska Man turned out to be a pig!”:
In February 1922, Harold Cook wrote to Dr. Henry Osborn to inform him of the tooth that he had had in his possession for some time. The tooth had been found years prior in the Upper Snake Creek beds of Nebraska along with other fossils typical of North America. … Osborn, along with Dr. William D. Matthew soon came to the conclusion that the tooth had belonged to an anthropoid ape. They then passed the tooth along to William K. Gregory and Dr. Milo Hellman who agreed that the tooth belonged to an anthropoid ape more closely related to humans than to other apes. Only a few months later, an article was published in Science announcing the discovery of a manlike ape in North America. An illustration of H. haroldcookii was done by artist Amédée Forestier, who modeled the drawing on the proportions of “Pithecanthropus” (now Homo erectus), the “Java ape-man,” for the Illustrated London News. …
Examinations of the specimen continued, and the original describers continued to draw comparisons between Hesperopithecus and apes. Further field work on the site in the summers of 1925 and 1926 uncovered other parts of the skeleton. These discoveries revealed that the tooth was incorrectly identified. According to these discovered pieces, the tooth belonged neither to a man nor an ape, but to a fossil of an extinct species of peccary called Prosthennops serus.
That basically sums up everything I learned about human evolution in high school.
Second, “HUMANS” DID NOT EVOLVE 7 MILLION YEARS AGO.
Scientists define “humans” as members of the genus Homo, which emerged around 3 million years ago. These are the guys with funny names like Homo habilis, Homo neanderthalensis, and the embarrassingly named Homo erectus. The genus also includes ourselves, Homo sapiens, who emerged around 200-300,000 years ago.
Homo habilis descended from an Australopithecus, perhaps Lucy herself. Australopithecines are not in the Homo genus; they are not “human,” though they are more like us than modern chimps and bonobos are. They evolved around 4 million years ago.
Regardless, humans didn’t evolve 7 million years ago. Sahelanthropus and even Lucy do not look like anyone you would call “human.” Humans have only been around for about 3 million years, and our own specific species is only about 300,000 years old. Even if Graecopithecus turns out to be the missing link–the true ancestor of both modern chimps and modern humans–that still does not change where humans evolved, because Graecopithecus narrowly missed being a human by 4 million years.
If you want to challenge the Out of Africa narrative, I think you’d do far better arguing for a multi-regional model of human evolution that includes back-migration of H. erectus into Africa and interbreeding with hominins there as spurring the emergence of H. sapiens than arguing about a 7 million year old jawbone. (I just made that up, by the way. It has no basis in anything I have read. But it at least has the right characters, in the right time frame, in a reasonable situation.)
Sorry this was a bit of a rant; I am just rather passionate about the subject. Next time we’ll examine very exciting news about Bushmen and Pygmy DNA!
Continuing with our series on recent exciting discoveries in human genetics/paleo anthropology:
Ancient hominins in the US?
Humans evolved in Europe?
In two days, first H Sap was pushed back to 260,000 years,
then to 300,000 years!
Bell beaker paper
One of the most interesting things about our human family tree (the Homo genus and our near primate relatives, chimps, gorillas, orangs, gibbons, etc.) is that for most of our existence, “we” weren’t the only humans in town. We probably coexisted, mated with, killed, were killed by, and at times perhaps completely ignored 7 other human species–Homo erectus, floresiensis, Neanderthals, Denisovans, heidelbergensis, rhodesiensis, and now naledi.
That said, these “species” are a bit controversial. Some scientists like to declare practically every jawbone and skull fragment they find a new species (“splitters”,) and some claim that lots of different bones actually just represent natural variation within a species (“lumpers.”)
Take the canine family: dogs and wolves can interbreed, but I doubt great danes and chihuahuas can. For practical purposes, though, the behavior of great danes and chihuahuas is similar enough to each other–and different enough from wolves’–that we class them as one species and wolves as another. Additionally, when we take a look at the complete variety of dogs in existence, it is obvious that there is actually a genetic gradient in size between the largest and smallest breeds, with few sharp breaks (maybe the basenji.) If we had a complete fossil record, and could reliably reconstruct ancient hominin behaviors and cultural patterns, then we could say with far more confidence whether we are looking at something like dogs vs. wolves or great danes vs. chihuahuas. For now, though, paleoanthropology and genetics remain exciting fields with constant new discoveries!
Homo naledi and Homo floresiensis may ultimately be small branches on the human tree, but each provides us with a little more insight into the whole.
Naledi’s story is particularly entertaining. Back in 2013, some spelunkers crawled through a tiny opening in a South African cave and found a chamber full of bones–hominin bones.
Anthropologists often have to content themselves with a handful of bones, sometimes just a fragment of a cranium or part of a jaw. (The recent claim that humans evolved in Europe is based entirely on a jaw fragment plus a few teeth.) But in the Rising Star Cave system, they found an incredible 1,500+ bones or bone fragments, the remains of at least 15 people, and they haven’t even finished excavating.
According to Wikipedia:
The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.
Adult males are estimated to have stood around 150 cm (5 ft) tall and weighed around 45 kg (100 lb), while females would likely have been a little shorter and weighed a little less. An analysis of H. naledi‘s skeleton suggests it stood upright and was bipedal. Its hip mechanics and the flared shape of its pelvis are similar to australopithecines, but its legs, feet, and ankles are more similar to those of the genus Homo.
I note that the modern humans in South Africa are also kind of short–According to Time, the Bushmen average about 5 feet tall, (that’s probably supposed to be Bushmen men, not the group average,) and the men of nearby Pygmy peoples of central Africa average 4’11” or less.
The hands of H. naledi appear to have been better suited for object manipulation than those of australopithecines. Some of the bones resemble modern human bones, but other bones are more primitive than Australopithecus, an early ancestor of humans. The thumb, wrist, and palm bones are modern-like while the fingers are curved, more australopithecine, and useful for climbing. The shoulders are configured largely like those of australopithecines. The vertebrae are most similar to Pleistocene members of the genus Homo, whereas the ribcage is wide distally, as in A. afarensis. The arm has an Australopithecus-similar shoulder and fingers and a Homo-similar wrist and palm. The structure of the upper body seems to have been more primitive than that of other members of the genus Homo, even apelike. In evolutionary biology, such a mixture of features is known as an anatomical mosaic.
Four skulls were discovered in the Dinaledi chamber, thought to be two females and two males, with a cranial volume of 560 cm3 (34 cu in) for the males and 465 cm3 (28.4 cu in) for females, about 40% to 45% the volume of modern human skulls; average Homo erectus skulls are 900 cm3 (55 cu in). A fifth, male skull found in the Lesedi chamber has a larger estimated cranial volume of 610 cm3 (37 cu in). The H. naledi skulls are closer in cranial volume to australopithecine skulls. Nonetheless, the cranial structure is described as more similar to those found in the genus Homo than to australopithecines, particularly in its slender features, the presence of temporal and occipital bossing, and the fact that the skulls do not narrow behind the eye-sockets. The brains of the species were markedly smaller than those of modern Homo sapiens, measuring between 450 and 610 cm3 (27–37 cu in). The teeth and mandible musculature are much smaller than those of most australopithecines, which suggests a diet that did not require heavy mastication. The teeth are small, similar to modern humans, but the third molar is larger than the other molars, similar to australopithecines. The teeth have both primitive and derived dental development.
The overall anatomical structure of the species has prompted the investigating scientists to classify the species within the genus Homo, rather than within the genus Australopithecus. The H. naledi skeletons indicate that the origins of the genus Homo were complex and may be polyphyletic (hybrid), and that the species may have evolved separately in different parts of Africa.
Because caves don’t have regular sediment layers like riverbeds or floodplains, scientists initially had trouble dating the bones. Because of their relative “primitiveness,” that is, their similarity to our older, more ape-like ancestors, they initially thought Homo naledi must have lived a long time ago–around 2 million years ago. But when they finally got the bones dated, they found they were much younger–only around 335,000 and 236,000 years old, which means H naledi and Homo sapiens–whose age was also recently adjusted–actually lived at the same time, though not necessarily in the same place.
On the techniques used for dating the bones:
Francis Thackeray, of the University of the Witwatersrand, suggested that H. naledi lived 2 ± 0.5 million years ago, based on the skulls’ similarities to H. rudolfensis, H. erectus, and H. habilis, species that existed around 1.5, 2.5, and 1.8 million years ago, respectively. Early estimates derived from statistical analysis of cranial traits yielded a range of 2 million years to 912,000 years before present.
H naledi is unlikely to be a major branch on the human family tree–much too recent to be one of our ancestors–but it still offers important information on the development of “human” traits and how human and ape-like traits can exist side-by-side in the same individual (a theme we will return to later.) (Perhaps, just as we modern Homo sapiens contain traits derived from ancestors who mated with Neanderthals, Denisovans, and others, H naledi owes some of its traits to hybridization between two very different hominins.) It’s also important because it is one more data point in favor of the recent existence of a great many different human varieties, not just a single group.
The Flores hominin, (aka the Hobbit,) tells a similar tale, but much further afield from humanity’s evolutionary cradle.
The island of Flores is part of the Indonesian archipelago, a surprisingly rich source of early hominin fossils. Homo erectus, the famous Java Man, arrived in the area around 1.7 million years ago, but to date no erectus remains have been discovered on the actual island of Flores. During the last Glacial Maximum, ocean levels were lower and most of Indonesia was connected in a single continent, called Sundaland. During one of these glacial periods, H erectus could have easily walked from China to Java, but Flores remained an island, cut off from the mainland by several miles of open ocean.
The diminutive Hobbits show up later, around 50,000 to 100,000 years ago, though stone tools recovered alongside their remains have been dated from 50,000 to 190,000 years ago. Homo erectus is generally believed to have lived between 2 million and 140,000 years ago, and Homo sapiens arrived in Indonesia around 50,000 years ago. This places Floresiensis neatly between the two–it could have interacted with either species–perhaps descended from erectus and wiped out, in turn, by sapiens. (Or perhaps floresiensis represents an altogether novel line of hominins who left Africa on a completely separate trek from erectus.)
Unlike H naledi, whose diminutive stature is still within the current human range (especially of humans in the area,) floresiensis is exceptionally small for a hominin. According to Wikipedia:
The first set of remains to have been found, LB1, was chosen as the type specimen for the proposed species. LB1 is a fairly complete skeleton, including a nearly complete cranium (skull), determined to be that of a 30-year-old female. LB1 has been nicknamed the Little Lady of Flores or “Flo”.
LB1’s height has been estimated at about 1.06 m (3 ft 6 in). The height of a second skeleton, LB8, has been estimated at 1.09 m (3 ft 7 in) based on measurements of its tibia. These estimates are outside the range of normal modern human height and considerably shorter than the average adult height of even the smallest modern humans, such as the Mbenga and Mbuti (< 1.5 m (4 ft 11 in)), Twa, and Semang (1.37 m (4 ft 6 in) for adult women) of the Malay Peninsula, or the Andamanese (1.37 m (4 ft 6 in) for adult women).
By body mass, differences between modern pygmies and Homo floresiensis are even greater. LB1’s body mass has been estimated at 25 kg (55 lb). This is smaller than that of not only modern H. sapiens, but also H. erectus, which Brown and colleagues have suggested is the immediate ancestor of H. floresiensis. LB1 and LB8 are also somewhat smaller than the australopithecines from three million years ago, not previously thought to have expanded beyond Africa. Thus, LB1 and LB8 may be the shortest and smallest members of the extended human family discovered thus far.
Aside from smaller body size, the specimens seem otherwise to resemble H. erectus, a species known to have been living in Southeast Asia at times coincident with earlier finds purported to be of H. floresiensis.
There’s a lot of debate about whether floresiensis is a real species–perhaps affected by insular dwarfism–or just a hominin that had some severe problems. Interestingly, we have a find from about 700,000 years ago on Flores of another hominin, which we think was also a Hobbit, but is even smaller than Flo and her relatives.
Floresiensis, like Naledi, didn’t contribute to modern humans. Rather, it is interesting because it shows the breadth of our genus. We tend to assume that, ever since we split off from the rest of the great apes, some 7 or 8 million years ago, our path has been ever upward, more complex and successful. But these Hobbits, most likely descendants of one of the most successful human species, (Homo erectus, who mastered fire, was the first to leave Africa, spread across Asia and Indonesia, and lasted for over a million and a half years, far longer than our puny 300,000 years,) went in the opposite direction from their ancestors. They became much smaller than even the smallest living human groups. Their brains shrank:
In addition to a small body size, H. floresiensis had a remarkably small brain size. The brain of the holotype LB1 is estimated to have had a volume of 380 cm3 (23 cu in), placing it at the range of chimpanzees or the extinct australopithecines. LB1’s brain size is half that of its presumed immediate ancestor, H. erectus (980 cm3 (60 cu in)). The brain-to-body mass ratio of LB1 lies between that of H. erectus and the great apes.
Nevertheless, it still made tools, probably controlled fire, and hunted cooperatively.
Whatever it was, it was like us–and very much not like us.
From the evolutionist point of view, the point of marriage is the production of children.
Let’s quickly analogize to food. Humans have a tremendous variety of customs, habits, traditions, and taboos surrounding foods. Foods enjoyed in one culture, like pork, crickets, and dog, are regarded as disgusting, immoral, or forbidden in another. Cheese is, at heart, rotten vomit–rennet, the enzyme used to coagulate milk into cheese, is traditionally extracted from a calf’s stomach lining–and yet the average American eats it eagerly.
Food can remind you of your childhood, the best day of your life, the worst day of your life. It can comfort the sick and the mourning, and it accompanies our biggest celebrations of life.
We eat comfort food, holiday food, even sacrificial food. We have decadent luxuries and everyday staples. Some people, like vegans and ascetics, avoid large classes of food generally eaten by their own society for moral reasons.
People enjoy soda because it has water and calories, but some of us purposefully trick our taste buds by drinking Diet Coke, which delivers the sensation of drinking calories without the calories themselves. We enjoy the taste of calories even when we don’t need any more.
But the evolutionary purpose of eating is to get enough calories and nutrients to survive. If tomorrow we all stopped needing to eat–say, we were all hooked into a Matrix-style click-farm in which all nutrients were delivered automatically via IV–all of the symbolic and emotional content attached to food would wither away.
The extended helplessness of human infants is unique in the animal kingdom. Even elephants, who gestate for an incredible two years and become mature at 18, can stand and begin walking around shortly after birth. Baby elephants are not raised solely by their mothers, as baby rats are, but by an entire herd of related female elephants.
Elephants are remarkable animals, clever, communicative, and caring, who mourn their dead and create art:
But from the evolutionist point of view, the point of elephants’ family systems is still the production of elephant children.
Love is a wonderful, sweet, many-splendored thing, but the purpose of marriage, in all its myriad forms–polygamy, monogamy, polyandry, serial monogamy–is still the production of children.
In the Southwestern United States, the Apache tribe practices a form of this, in which the uncle is responsible for teaching children social values and proper behavior, while inheritance and ancestry are reckoned through the mother’s family alone. (Modern-day influences have somewhat, but not completely, erased this tradition.)
Despite the long public argument over the validity of gay marriage, very few gay people actually want to get married. Gallup reports that after the Obergefell v. Hodges ruling, the percentage of gay people who were married jumped quickly from 7.9% to 9.5%, but then leveled off, rising to only 9.6% by June 2016.
Between 1990 and 2010, the percentage of 50-year-olds in Japan who had never married roughly quadrupled for men, to 20.1%, and doubled for women, to 10.6%. Japan’s Welfare Ministry predicts these numbers will rise to 29% of men and 19.2% of women by 2035. The government’s population institute estimated in 2014 that Japanese women in their early 20s had a one-in-four chance of never marrying, and a two-in-five chance of remaining childless.
Recent media coverage has sensationalized surveys from the Japan Family Planning Association and the Cabinet Office that show a declining interest in dating and sexual relationships among young people, especially among men. However, changes in sexuality and fertility are more likely an outcome of the decline in family formation than its cause. Since the usual purpose of dating in Japan is marriage, the reluctance to marry often translates to a reluctance to engage in more casual relationships.
In other words, marriage is functionally about providing a supportive way of raising children. In a society where birth control does not exist, children born out of wedlock tend not to survive, and people can easily get jobs to support their families, people get married and have children. In a society where people do not want children, cannot afford them, are purposefully delaying childbearing as long as possible, or have found ways to provide for them without getting married, people simply see no need for marriage.
“Marriage” ceases to mean what it once did, becoming something reserved for old-fashioned romantics and the few lucky enough to afford it.
Mass acceptance of gay marriage did change how people think of marriage, but it’s downstream of what the massive, society-wide decrease in child-bearing and increase in illegitimacy have done to our ideas about marriage.
There are three categories of superstars who seem to attract excessive female interest. The first is actors, who are of course selected for being abnormally attractive and are placed in romantic, exciting narratives that our brains subconsciously interpret as real. The second is sports stars and other athletes, whose ritualized combat and displays of strength obviously signal their genetic “fitness” for siring and providing for children.
The third and strangest category is professional musicians, especially rock stars.
I understand why people want to pass athletic abilities on to their children, but what is the evolutionary importance of musical talent? Does music tap into some deep, fundamental instinct like a bird’s attraction to the courtship song of its mate? And if so, why?
There’s no denying the importance of music to American courtship rituals–not only do people visit bars, clubs, and concerts where music is being played in order to meet potential partners, but they also display musical tastes on dating profiles in order to meet musically like-minded people.
Of all the traits to look for in a mate, why rate musical taste so highly? And why do some people describe their taste as “Anything but rap” or “Anything but country”?
At least when I was a teen, musical taste was an important part of one’s “identity.” There were goths and punks, indie scene kids and the aforementioned rap and country fans.
Is there actually any correlation between musical taste and personality? Do people who like slow jazz get along with other slow jazz fans better than with fans of classical Indian? Or is this all confounded by different ethnic groups identifying with specific musical styles?
Obviously country correlates with Amerikaner ancestry, and rap with African American ancestry. I’m not sure which group makes up Die Antwoord’s biggest fans. Heavy Metal is popular in Finno-Scandia. Rock ‘n Roll got its start in the African American community as “Race Music” and became popular with white audiences after Elvis Presley took up the guitar.
While Europe has a long and lovely musical heritage, it’s indisputable that African Americans have contributed tremendously to American musical innovation.
Here are two excerpts on the subject of music and dance in African societies:
Both of these are h/t HBD Chick; my apologies in advance if I got the sources reversed.
One of the major HBD theories holds that the three races vary–on average–in the distribution of certain traits, such as age of first tooth eruption or intensity of an infant’s response to a tissue placed over its face. Sub-Saharan Africans and Asians are considered two extremes in this distribution, with whites somewhere in between.
If traditional African dancing involves more variety in rhythmic expression than traditional European, does traditional Asian dance involve less? I really know very little about traditional Asian music or dance of any kind, but I would not be surprised to see some kind of continuum affected by whether a society traditionally practiced arranged marriages. Where people choose their own mates, they seem to display a preference for athletic or musically talented (“sexy”) mates; where parents choose, they seem to prefer hard-working, devout “good providers.”
Even in traditional European and American society, where parents played more of a role in courtship than they do today, music still played a major part. Young women, if their families could afford it, learned to play the piano or other instruments in order to be “accomplished” and thus more attractive to higher-status men; young men and women often met and courted at musical events or dances organized by the adults.
It is undoubtedly true that music stirs the soul and speaks to the heart, but why?