Having the Rh- bloodtype makes reproduction difficult, because Rh- mothers paired with Rh+ fathers end up with a lot of miscarriages.*
The simplified version: Rh+ people have a specific antigen in their blood. Rh- people don’t have this antigen.
If a little bit of Rh+ blood gets into an Rh- person’s bloodstream, their immune system notices this new antigen it has never seen before, and the immune response kicks into gear.
If a little bit of Rh- blood gets into an Rh+ person’s bloodstream, their immune system notices nothing because there’s nothing to notice.
During pregnancy, it is fairly normal for a small amount of the fetus’s blood to cross the placenta and get into the mother’s bloodstream. One of the effects of this is that years later, you can find little bits of their children’s DNA still hanging around in women’s bodies.
If the mother and father are both Rh- or both Rh+, there’s no problem, and the mother’s body takes no note of the fetus’s blood. Same for an Rh+ mother with an Rh- father. But when an Rh- mother and Rh+ father mate, the result is bloodtype incompatibility: the mother begins making antibodies that attack her own child’s blood.
The first fetus generally comes out fine, but a second Rh+ fetus is likely to miscarry. As a result, Female Rh- with Male Rh+ pairings tend not to have a lot of children. This seems really disadvantageous, so I’ve been trying to work out whether the Rh- bloodtype ought to disappear over time.
Starting with a few simplifying assumptions and doing some quick back of the envelope calculations:
We’re in an optimal environment where everyone has 10 children unless Rh incompatibility gets in the way.
Blood type is inherited via a simple Mendelian model. People who are ++, +-, and -+ all have Rh+ blood. People with -- are Rh-.
We start with a population that is 25% ++, +-, -+, and --, respectively.
50++, 40+-, 21-+, 30--, and some quantity of “It’s complicated.”
For the F--/M+- pairings, any -- children will live and most -+ children will die. Since we’re assuming 10 children, we’re going to calculate the odds for ten kids. Dead kids in bold; live kids plain.
Obvious pattern is obvious: F--/M+- pairings lose 25% of their second kids, 37.5% of their third kids, 43.75% of their fourth kids, 46.9% of their fifth kids, etc., on up to nearly 50% of their 10th kids.
Which I believe works out to an average of 5 --, 1 +-.
The outcomes for F--/M-+ pairings are the same, of course: 5 --, 1 +-.
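Under these assumptions the pattern has a closed form: the nth child miscarries with probability 0.5 × (1 − 0.5^(n−1)), since it must itself be Rh+ and at least one earlier pregnancy must have been Rh+. Here is a quick Python sketch of the model (my formalization of the assumptions above, not anything empirical):

```python
# F--/M+- pairing: each conception is 50% +- (Rh+) and 50% -- (Rh-).
# A fetus miscarries iff it is Rh+ AND some earlier pregnancy was Rh+
# (i.e., the mother has already been sensitized).

def death_prob(n):
    """Probability that the nth pregnancy (1-indexed) miscarries."""
    p_plus = 0.5                        # chance this fetus is Rh+
    p_sensitized = 1 - 0.5 ** (n - 1)   # chance an earlier fetus was Rh+
    return p_plus * p_sensitized

for n in range(1, 11):
    print(f"kid {n}: {death_prob(n):.1%} chance of miscarriage")

# Expected survivors out of 10 pregnancies:
surviving = sum(1 - death_prob(n) for n in range(1, 11))
minus_minus = 10 * 0.5                                  # -- always survives
plus_minus = sum(0.5 * 0.5 ** (n - 1) for n in range(1, 11))
print(f"{surviving:.2f} total: {minus_minus:.1f} --, {plus_minus:.2f} +-")
```

The totals come out to roughly 6 surviving children per couple: about 5 -- and 1 +-, matching the average above.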
So this gives us a total of:
50++, 41+-, 22-+, 40--, or 33% ++, 27% +-, 14% -+, 26% -- (or, 54% of the alleles are + and 46% are -).
(This assumes, of course, that people cannot increase their number of pregnancies.)
Running the numbers through again (I will spare you my arithmetic), we get:
35% ++, 32% +-, 11.8% -+, 21.4% -- (or, 57% of alleles are + and 43% are -).
I’m going to be lazy and say that if this keeps up, it looks like the --s should become fewer and fewer over time.
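For what it’s worth, the whole model can be iterated deterministically. This sketch assumes random mating and lumps +- and -+ into one class, so its exact counts won’t match my hand arithmetic, but the direction is the same:

```python
# Deterministic expected-value iteration of the toy model, assuming:
#  - genotypes ++, +-, -- (lumping +- and -+ together for simplicity)
#  - random mating, 10 conceptions per couple
#  - a fetus miscarries iff it is Rh+, the mother is --, and some
#    earlier conception was also Rh+ (the mother is sensitized)
from itertools import product

GENOS = ["++", "+-", "--"]

def offspring_dist(mom, dad):
    """Mendelian genotype probabilities for a single conception."""
    dist = {g: 0.0 for g in GENOS}
    for a, b in product(mom, dad):           # one allele from each parent
        dist["".join(sorted((a, b)))] += 0.25
    return dist

def expected_children(mom, dad, n_preg=10):
    """Expected surviving children, by genotype, for one couple."""
    dist = offspring_dist(mom, dad)
    p_plus = dist["++"] + dist["+-"]         # chance a fetus is Rh+
    if mom != "--" or p_plus == 0:
        return {g: n_preg * p for g, p in dist.items()}
    out = {g: 0.0 for g in GENOS}
    for n in range(1, n_preg + 1):
        p_sens = 1 - (1 - p_plus) ** (n - 1)  # mother already sensitized
        for g, p in dist.items():
            out[g] += p * (1.0 if g == "--" else 1 - p_sens)
    return out

# Start from roughly the post-first-generation frequencies computed above.
freqs = {"++": 0.33, "+-": 0.41, "--": 0.26}
history = []
for gen in range(8):
    new = {g: 0.0 for g in GENOS}
    for mom, dad in product(GENOS, GENOS):
        for g, n in expected_children(mom, dad).items():
            new[g] += freqs[mom] * freqs[dad] * n
    total = sum(new.values())
    freqs = {g: n / total for g, n in new.items()}
    history.append(freqs["--"] + 0.5 * freqs["+-"])   # - allele frequency
    print(f"gen {gen + 1}: - allele at {history[-1]:.1%}")
```

One thing this sketch makes visible: every miscarried fetus is a +- heterozygote, so each death removes exactly one + and one - allele. From a perfectly 50/50 allele pool, nothing would change; it is only because the - allele is already the rarer one that the equal-numbers loss erodes its share faster, which is why the starting frequencies here are taken from the post-first-generation figures above rather than from 50/50.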
But I’ve made a lot of simplifying assumptions to get here that might be affecting my outcome. For example, if people only have one kid, there’s no effect at all, because only second children on down get hit by the antibodies. Also, people can have additional pregnancies to make up for miscarriages. 20 pregnancies is obviously pushing the limits of what humans can actually get done, but let’s run with it.
So in the first generation, F--/M+- => 9 --, 1 +-; F--/M-+ => 9 --, 1 -+ (that is, the extra pregnancies result in 8 extra -- children). The F--/M++ pairing still results in only one -+ child.
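That stopping rule is easy to sanity-check by simulation. This is a sketch under my reading of the rule (couples keep conceiving until they have 10 live children or hit 20 pregnancies, whichever comes first), and it lands in the neighborhood of the 9 --, 1 +- figure:

```python
# Stochastic check of the "try again after a miscarriage" rule for an
# F--/M+- couple: conceive until 10 live children or 20 pregnancies.
import random

def simulate_couple(max_kids=10, max_preg=20, p_plus=0.5):
    """Each conception is 50% +- (Rh+) and 50% -- (Rh-).
    Rh+ fetuses miscarry once the mother has been sensitized."""
    kids = {"--": 0, "+-": 0}
    sensitized = False
    pregnancies = 0
    while sum(kids.values()) < max_kids and pregnancies < max_preg:
        pregnancies += 1
        if random.random() < p_plus:     # Rh+ fetus
            if not sensitized:
                kids["+-"] += 1          # the first Rh+ child survives
            sensitized = True            # mother now makes antibodies
        else:
            kids["--"] += 1              # Rh- fetuses always survive
    return kids

random.seed(0)
trials = [simulate_couple() for _ in range(100_000)]
avg_mm = sum(t["--"] for t in trials) / len(trials)
avg_pm = sum(t["+-"] for t in trials) / len(trials)
print(f"average surviving children: {avg_mm:.1f} --, {avg_pm:.2f} +-")
```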
This gives us 50++, 41+-, 22-+, 48-- children, or 31% ++, 25% +-, 13.7% -+, 30% -- (or 51% + vs 49% - alleles).
At this point, the effect is tiny. However, as I noted before, having 20 pregnancies is a bit of a stretch for most people; I suspect the effect would still be generally felt under normal conditions. For example, I know an older couple who suffered Rh incompatibility; they wanted 4 children, but after many miscarriages, only had 3.
Which leads to the question of why Rh-s exist at all, which we’ll discuss tomorrow.
*Lest I worry anyone, take heart: modern medicine has a method to prevent the miscarriage of Rh+ fetuses of Rh- mothers. Unfortunately, it requires an injection of human blood serum, which I obviously find icky.
Today we come to a flaw in my methods: I usually write my posts a few weeks before they actually go up. Normally, this is not an issue–genetics tends not to change very much from week to week. And to keep them evenly paced, I just write each Cathedral Round Up on the day the previous one goes up. Since articles from the Yale Law Bulletin or Princeton Magazine are not normally of interest to outsiders, the delay between publication and commentary hasn’t been a big issue.
But this month, all the stuff going on in the echelons of higher education has made it into the actual news! Do you know how weird it is to suddenly have relatives complaining about student protests at Yale or U Missouri? Obscure campus news–that’s my schtick, not theirs.
Next month, I’m going to try out a new methodology for keeping the Cathedral Round Up both on-schedule and topical. For today, though, here’s what was going on before all this stuff broke into the media:
This month, I decided to focus on Yale, Princeton, and Penn (though Stanford managed to sneak back in, because Stanford.)
Yale is in the process of cannibalizing itself. Princeton is halfway there, but some students are still holding out due to Princeton’s stronger culture of elitism. Poor Penn is never going to get taken seriously as an Ivy so long as it continues insisting on publishing mostly reasonable articles about itself, instead of concentrating on world-breaking levels of crazy.
The Yale Alumni Magazine has a transcript of Dean Holloway’s Freshman Address, “Yale’s Narrative, and Yours” (gosh, that comma bugs me. Commas are for lists of three or more things, or for separating two different actors in a sentence, e.g., “She went to the store, and I vacuumed the house.” This title should not have a comma), which I am going to quote quite a bit from because it is just so awful:
Class of 2019, I am thrilled to see you and look forward to getting to know you well in the years ahead. … But who, exactly, are you? You hail from across this country and from around the world. Many of you are the children of parents who are already Yale alumni. More of you will be the first in your families to graduate from college at all. Most of you went to public school. Nearly half of you are receiving financial aid. …
I’d like you to turn to the images that are in your program. … The images you see are something of a triptych—three different paintings of British merchant Elihu Yale that when brought together tell a fascinating story. For those who don’t already know, Elihu Yale rose to power and accumulated wealth through his leadership in the East India Company. In 1718, Yale received a request to finance a new building for the Collegiate School of Connecticut, a small enterprise founded in 1701 for the training of Congregationalist ministers. Yale sent hundreds of books, a portrait of King George I, and bales of goods that were later sold to finance the building. In short order, the Collegiate School was renamed in his honor. …
In all of the paintings Elihu Yale is wearing and surrounded by sumptuous fabrics. … In the two paintings on side one we see ships in the distance—a reference to the fact that Elihu Yale built his career on trade that navigated the ports in the British empire. In the second and third paintings we see an unidentified attendant. Much like the wearing of exquisite clothes suggested, placing a servant in a portrait was an articulation of standing and wealth. But when we look more carefully at these two paintings we notice that in addition to the fine clothes the servant and page are wearing they also happen to have metal collars and clasps around their necks. What we are seeing in each painting, then, isn’t a servant or a page, but a slave.
We are fairly certain that Elihu Yale did not own any slaves himself, but there’s no doubting the fact that he participated in the slave trade, profiting from the sale of humans just as he profited from the sale of so many actual objects that were part of the East India trade empire. As such, Elihu Yale’s wealth was linked to a global economy that was deeply, practically inextricably, interwoven with the sale of human beings to other human beings. In fact, when we look at the paintings it is safe to assume that Elihu Yale was a willing participant in that economy. Since he could have selected anything to represent him in these paintings we can conclude that he chose to be depicted with enslaved people because he believed this narrative would best signify his wealth, power, and worldliness.
This is a difficult story to hear, especially on an occasion of welcoming and celebration. But I share it with you because just as proper histories are unafraid of their darker corners you should be unafraid to ask difficult questions of this university. Indeed, we expect you to do so.
… The first of your three images hangs in the Corporation Room of Woodbridge Hall—the nerve center of the university. That this specific portrait hangs there, however, is fairly recent history. Until 2007, the second painting of Elihu Yale you see in the program insert is what you would have found in the Corporation Room. That year, recognizing that this representation was terribly jarring whether it was understood in its historical context or not, the university removed the painting. …
So, Class of 2019: here you are, in a place that has been waiting a long time for you to arrive, a place where you emphatically belong. Whatever your race, religion, wealth, sport, political philosophy, taste in music; whatever your sexuality, your passport’s origin, or the number of stamps in your passport, this place is yours, ready for you to make your contribution to it. …
You have come here at a unique moment, when this university engages with questions of its own identity, at a time when national conversations about race have shined a light on social constructions and assumptions that for many (but not for all), have lain dormant for decades, if not centuries. …
I have to interrupt here. Who the fuck thinks that our ideas about race have been lying dormant for centuries? WHAT DOES THAT EVEN MEAN? Were there no Civil Rights marches in the 1950s? Did no one in the 60s and 70s ever mention race? Did we never celebrate Martin Luther King Day in school? Are there no streets named in his honor? People talk constantly about race, but for some strange reason keep claiming that we have not been talking about race.
These big questions will form part of the education that awaits you, even more than problem sets, term papers, or exams. But so will the conversation that begins today, as you write your own story and build your own Yale.
This is hard but joyous work, and you embark on it with many others. Joining you are your peers and your professors, the friends you are about to make, and the students who have preceded you. I join you, too.
Welcome to this work. Welcome to this place. Welcome to Yale.
TL;DR: White history is shit and white people should feel bad. Welcome to Yale!
The President of Yale, Peter Salovey, also gave a Freshman Address, “Launching a Difficult Conversation.” Let’s see if it is any better:
Good morning and welcome, Class of 2019, family members, and colleagues sharing the stage with me. …
Well, as the events in South Carolina shook the nation, many members of our own community could not avoid considering a matter that ties us here in New Haven to similar questions of history, naming, symbols, and narratives. …
About one in twelve of you has been assigned to Calhoun College, named, when the college system was instituted in the 1930s, for John C. Calhoun—a graduate of the Yale College Class of 1804 who achieved extremely high prominence in the early nineteenth century as a notable political theorist, a vice president to two different US presidents, a secretary of war and of state, and a congressman and senator representing South Carolina. …
Calhoun mounted the most powerful and influential defense of his day for slavery. …
Are we perhaps better off retaining before us the name and the evocative, sometimes brooding presence of Yale graduate John C. Calhoun? He may serve to remind us not only of Yale’s complicated and occasionally painful associations with the past, but to enforce in us a sense of our own moral fallibility as we ourselves face questions about the future.
So it was not surprising that within a short time of the massacre and subsequent debate in South Carolina, we found that the issues of honoring, naming, and remembering that have occasionally surfaced regarding Calhoun College returned to confront us again. … And inevitably we found ourselves wondering, and not for the first time, how best to address the undeniable challenges associated with the fact that Calhoun’s name graces a residential community in Yale College, an institution where, above all, we prize both the spirit and reality of full inclusion. …
As entering Yale students of the Class of 2019, what are your obligations to wrest from this place an education that encourages you to question tradition even while honoring it, to chart your own history even while learning from the past, to enter fully into difficult conversations even while respecting contradictory ideas and opinions? I know in the next four years, you will make progress on figuring all this out. Let’s get started together. Let’s get started today.
Yale has, apparently, no heroes worth honoring or inspiring its students to emulate, only villains. The grand duty of Yale students is to decide whether their past heroes should be cast out and forgotten, or remembered solely as a warning about evil.
Take Yale’s bond from a Dutch water authority: it was originally issued in 1648, it is inscribed on goatskin, and recently, it added €136.20—about $153—to Yale’s coffers. … the bond was acquired as part of “a collection that traces the history of capital market development and financial innovation.”
A bestselling memoirist, the poet for Barack Obama’s first inauguration, and Yale’s first endowed professor of poetry, Elizabeth Alexander ’84 is one of Yale’s highest-profile professors. But not for long: Alexander is leaving the Yale faculty for Columbia next fall.
Her departure, along with that of anthropologist Vanessa Agard-Jones ’00, also for Columbia, was reported in the Yale Daily News as a sign of “systemic problems” in Yale’s efforts to make its faculty more diverse. (Alexander and Agard-Jones are both African American.)
Really? You could have fooled me:
Elizabeth Alexander, definitely a black person.
Rachel Dolezal, definitely a white person.
At least Agard-Jones is actually a black person, though calling her an anthropologist is a bit misleading. She actually describes herself as “an Assistant Professor of Women’s, Gender + Sexuality Studies at Yale University. … As a political anthropologist, I specialize in the study of gender and sexuality in the African diaspora.” Anyway, the article goes on:
Columbia has invested $63 million in its faculty diversity initiative to finance “recruitment, support, and related programs” since 2012.
“We have not made nearly enough progress on diversifying the faculty, and my colleagues in the higher administration know that I have long believed we need to have powerful commitments from on high, both in continued, stated vision and also with extensive resource allocation,” Alexander told the News. “Yale lags behind its peers where we should be leaders, and [faculty diversity] goals, in my opinion, should be a priority, as they are elsewhere, including Columbia.”
Ultimately, it all comes down to money. Qualified black professors are few and far between, and so capable of commanding much higher salaries than they would if they were white.
The world does not need more scientists, engineers, or people who build complicated systems for the delivery of electricity or removal of waste. The world needs more vaguely black-looking poets and gender studies professors. Those are the folks who will bring us the next set of civilization-building innovations!
Here at Yale, your worth as a person is not determined by what you do, by what you accomplish, or by the content of your character, but by the color of your skin. And maybe your sexual proclivities and gender.
I have a proposal: Let’s rename the whole shebang. Get rid of “Yale”. Let’s rename them “Rosa Parks University” and “Cesar Chavez College” and be done with it. It’s not like anyone actually cares about Elihu or Calhoun, except as representatives of a hated history.
Penn had an interesting article on helping ex-cons start companies by teaching them how to fill out paperwork, but that kind of practical approach to the world will never get Penn the kind of attention it needs to be a world-class university.
Meanwhile, over at Princeton, one of the nation’s most prestigious and selective colleges, a student noticed that in order to have a functional social club that pursues a particular interest (in this case, literature), some people have to be excluded. The student therefore decided not to join a social club, because excluding people is bad.
A group of students is in the process of creating a new student organization that aims to raise awareness and educate the community on the subject of campus sexual assault. …
Because no one has ever done that before. Seriously, I bet no one on the entire Stanford campus has ever thought to raise awareness of sexual assault before.
The idea for the student group grew out of a Sophomore College course this summer called “One in Five: The Law, Policy and Politics of Sexual Assault” with law professor Michele Dauber. The group will be called One in Five after the class.
The three-week experience was “completely immersive,” according to Dauber.
“Immersive”? What, did they rape the students in the course?
In The Problem with Philanthropy, a Princetonian critiques Effective Altruism on the grounds that capitalism is evil:
Perhaps more troubling than Whitman or Rockefeller are the cases of individuals like Matt Wage ’12. Wage took Peter Singer’s ethics class and decided to work on Wall Street after graduation in order to make large amounts of money that he could then donate to life-saving causes. In his book, Singer argues that Wage exemplifies the model of effective altruism, a concept that enshrines individual charity as the most effective force for good while ignoring entirely the power of collective action against structural injustice.
Wage joined a toxic system of finance dominated by rent seekers that helps maintain an environmentally unsustainable global economy. This economy is already taking lives and bringing suffering [PDF] for some of the world’s most vulnerable populations. While Wage can take credit for the lives that he has supposedly saved with his Wall Street earnings, he can also conveniently ignore his complicity in a system of finance inextricable from climate injustice as well as other forms of oppression like private prisons, sweatshops, the domestic and global exchange of weapons and practices like insider trading, cronyism and corruption.
If you look at the PDF about “taking lives and bringing suffering,” you’ll note that Wage is being blamed for global warming.
While I actually dislike Wall Street and economic theories based on the idea of endless growth, which are bad for long-term resource maintenance necessary for people to have nice lives, this is not a critique of Effective Altruism. Coherent critiques of EA exist, but “EA => Global Warming!” is not one of them.
… it is time for our University to reevaluate its blind veneration to its deeply racist demigod. … This response assumes that Wilson’s racist actions were minuscule despite the fact that he actively worked to destroy, hinder and thwart the communities of black and brown peoples in America. … I told the administrator that Wilson is arguably the most racist U.S. and Princeton president, and the administrator agreed that Wilson was indeed racist.
I think the Cherokee might disagree with that assessment.
As with religion and nationalism, I don’t really have the emotional impulses necessary to really get into the idea of a holiday dedicated to eating turkey. Maybe this is just my personal failing, or a side effect of not being a farmer, but either way, here I am, grumbling under my breath about how I’d rather be getting stuff done than eating.
Nevertheless, I observe that other people seem to like holidays. They spend large amounts of money on them, decorate their houses, voluntarily travel to see relatives, and otherwise “get into the holiday mood.” While some of this seems to boil down to simple materialism, there does seem to be something more: people really do like their celebrations. I may not be able to hear the music, but I can still tell that people are dancing.
And if so many people are dancing, and they seem healthy and happy and well-adjusted, then perhaps dancing is a good thing.
The point of Thanksgiving, a made-up holiday (though it does have its roots in real harvest celebrations), is to celebrate the connection between family and nation. This is obvious enough, since Thanksgiving unifies “eating dinner with my family” with “founding myth of the United States.” We tell the story of the Pilgrims, not because they are everyone’s ancestors, but because they represent the symbolic founding of the nation. (My Jamestown ancestors actually got here first, but I guess Virginia was not in Lincoln’s good graces when he decided to make a holiday.)
In the founding mythos, the Pilgrims are brave, freedom-loving people who overcome tremendous odds to found a new nation, with the help of their new friends, the Indians.
Is the founding mythos true?
It doesn’t matter. Being “literally true” is not the point of a myth. The Iliad did not become one of the most popular books of all time because it provides a 100% accurate account of the Trojan War, but because it describes heroism, bravery, and, conversely, cowardice. (“Hektor” has always been high on my names list.) Likewise, the vast majority of Christians do not take the Bible 100% literally (even the ones who claim they do). Arguing about which day God created Eve misses the point of the creation story; arguing about whether the Exodus happened exactly as told misses the point the story held for a people in exile.
The story of Thanksgiving instructs us to work hard, protect liberty, and be friends with the Indians. It reminds us both of the Pilgrims’ utopian goal of founding the perfect Christian community, a shining city upon a hill, and of the value of religious tolerance. (Of course, the Puritans would probably not have been keen on religious tolerance or freedom of religion, given that they exiled Anne Hutchinson for talking too much about God.)
Most of us today probably aren’t descended from the Pilgrims, but the ritual creates a symbolic connection between them and us, for we are the heirs of the civilization they began. Likewise, each family is connected to the nation as a whole; without America, we wouldn’t be here, eating this turkey together.
Unless you don’t like turkey. In which case, have some pie.
The only reason why we started celebrating “Columbus Day” was to make the Irish and Italians feel like Catholics can be real Americans, too, not just Protestants.
“Columbus Day” isn’t really about celebrating Columbus. Not as a person. Nobody says, “Read this biography of a great man from infancy to dotage and try to be more like him!” Columbus day is about celebrating what Columbus did–find a New World and launch the Age of Exploration and discovery.
Do I care about Columbus Day? No. Don’t be silly. I don’t think I’ve ever met anyone who actually celebrates Columbus Day, but maybe the Italians are really into it. If so, I don’t begrudge them a holiday. However, I do care about Columbus’s accomplishments.
“But Columbus was an idiot who only found the New World by accident!” I hear someone protest.
Yeah, well, I don’t see you discovering any continents lately. Where does that put you on the intellect ladder? Also, Penicillin was discovered by accident, so I guess it doesn’t count, either.
Here, I’ll take all of the penicillin, and you can go play with rodents. We’ll see which of us survives the longest.
“But Columbus was an asshole,” someone protests. “He conquered and enslaved people!”
Guys, it was the 1400s. Pretty much EVERYBODY in the 1400s thought it was okay to conquer and enslave people. If you start applying modern standards to people from the 1400s, you’ll discover that none of them meet your standards.
You want to celebrate “Indigenous Culture Day” instead of Columbus Day? Do you know what kind of assholes indigenous cultures were full of?
Let’s hear it for the Aztecs, one of those peaceful wonderful indigenous cultures Columbus’s Spanish employers went and conquered as a result of his voyages.
They liked to rip people’s beating hearts out of their bodies as human sacrifices to their gods.
The Spaniards’ pigs, however, they just killed and threw in a well. WTF do you do with one of those things? They didn’t know. Humans, however, they knew what to do with: eat them.
Wikipedia records many documented cases of Aztec cannibalism:
Hernán Cortés wrote in one of his letters that his soldiers had captured an indigenous man who had a roasted baby ready for breakfast.
Francisco López de Gómara (c. 1511 – c. 1566) reported that, during the siege of Tenochtitlan, the Spaniards asked the Aztecs to surrender since they had no food. The Aztecs angrily challenged the Spaniards to attack so they could be taken as prisoners, sacrificed and served with “molli” sauce.
The Historia general… contains an illustration of an Aztec being cooked by an unknown tribe. This was reported as one of the dangers that Aztec traders faced. …Bernal Díaz’s The Conquest of New Spain (written by 1568, published 1632) contains several accounts of cannibalism among the people the conquistadors encountered during their warring expedition to Tenochtitlan.
About the city of Cholula, Díaz wrote of his shock at seeing young men in cages ready to be sacrificed and eaten.[1]
In the same work Diaz mentions that the Cholulan and Aztec warriors were so confident of victory against the conquistadors in an upcoming battle the following day, that “…they wished to kill us and eat our flesh, and had already prepared the pots with salt and peppers and tomatoes”[2]
About the Quetzalcoatl temple of Tenochtitlan Díaz wrote that inside there were large pots, where human flesh of sacrificed Natives was boiled and cooked to feed the priests.[3]
About the Mesoamerican towns in general, Díaz wrote that some of the indigenous people he saw were:

“eating human meat, just like we take cows from the butcher’s shops, and they have in all towns thick wooden jail-houses, like cages, and in them they put many Indian men, women and boys to fatten, and being fattened they sacrificed and ate them.”[4]
Díaz’s testimony is corroborated by other Spanish historians who wrote about the conquest. In History of Tlaxcala (written by 1585), Diego Muñoz Camargo (c. 1529 – 1599) states that:
“Thus there were public butcher’s shops of human flesh, as if it were of cow or sheep.”[5]
Is that what you want to fucking celebrate? THIS IS WHAT YOU THINK WAS BETTER THAN COLUMBUS?
No, hunter-gatherers were not peaceful paragons of gender equality. Stop fucking saying that. It is a lie. There is no evidence to back it up. Primitive, pre-modern societies had absolutely atrocious crime rates. There are real live fucking cannibals living right now in the Congo rainforest. They eat the Pygmies (and each other.)
And this is supposed to be my fault? “White privilege” is the magic sauce that explains why some cultures produce penicillin and others produce cannibals.
Of course, the Aztecs are only one group. The Pueblo peoples also practiced cannibalism. Cannibalism was practiced among various coastal tribes stretching from Texas to Louisiana.
When Captain John Smith of Jamestown fame inquired about the fate of the lost Roanoke Colony, Chief Powhatan–you know, Pocahontas’s dad, the guy who’d tried to kill John Smith–confessed to having massacred them all. Historians aren’t sure if this is actually true–Powhatan might have just confused them with some other guys he’d massacred–but the fact remains that Powhatan and his people went around massacring their neighbors regularly enough that, “Oh yeah, we killed them all,” was seen as a reasonable explanation by everyone involved.
It wasn’t too many years later that the Powhatan tried to do the same thing to Jamestown, killing about a quarter of the people there.
Celebrating Columbus was never about Columbus, and denigrating Columbus isn’t about Columbus, either. Celebrating Columbus is about celebrating American history and the contributions of Catholic-Americans to that history; denigrating Columbus is about denigrating American history and European contributions to it.
Who should be America’s moral superior and successor? Whose successes should we celebrate instead of Columbus’s? Should the people of Mexico overthrow the culture of their evil oppressors and go back to holding human sacrifices in the middle of Mexico City?
Funny, I don’t see a lot of people trying to go live in Mexico, much less return to the actual lives of their indigenous ancestors. Most people seem to like having things like penicillin, cell phones, cars, air conditioning and sewers, and dislike things like cannibalism and constant tribal warfare. The process by which civilization was made was not pretty, but civilization is good and we should celebrate it.
We should not attack people’s cultural heroes just to denigrate their nation.
Oh, and happy Thanksgiving, since the backlog means that this post isn’t going up for a month.
People often make the mistake of over-generalizing other people. We speak of “Indians,” “Native Americans,” or better yet, “Indigenous Peoples,” as though one couldn’t tell the difference between a Maori and an Eskimo; as though only two undifferentiated blocks of humanity existed, everywhere on the globe: noble first people who moved into the area thousands upon thousands of years ago, sat down, and never moved again, and evil invaders who showed up yesterday.
In reality, Group A has conquered and replaced Group B and been conquered and replaced in turn by Group C since time immemorial. Sometimes the conquered group gets incorporated into the new group, and years down the line we can still find their DNA in their descendants. At other times, all that’s left is an abrupt transition in the archeological record between one set of artifacts and skull types and another.
Even “Indigenous” peoples have been migrating, conquering and slaughtering each other since time immemorial. The only difference between them and Europeans is that the Europeans did it more recently and while white.
When we take a good look at the Indians’ DNA, we find evidence of multiple invasion waves, some of them genocidal. The Surui, Pima, and Chipewyan are clearly distinct, as are the Eskimo and Aleuts:
DNA of the Eskimos and related peoples
DNA of the Aleuts and related peoples
(Please note that Haak’s chart and the chart I have at the top of the blog use different colors to represent the same things; genetic admixture of course does not have any inherent color, so the choice of colors is entirely up to the person making the graph.)
The Karitiana are one of those mixed horticulturalist/hunter-gatherer tribes from deep in the Amazon Rainforest who have extremely little contact with the outside world. They are suspected of having Denisovan DNA, and thus of being descended from an ancient wave of Melanesians who either got to the Americas first or else very mysteriously made it to the rainforest without leaving significant genetic traces elsewhere. I’m going with “they got here first,” because that explanation makes more sense.
The Pima People of southern Arizona had extensive trade and irrigation networks, and are believed to be descended from the Hohokam people, who lived in the same area and also built and maintained irrigation networks and cities, and are probably generally related to the Puebloan Peoples, who also built cities in the South West. An observer wrote about the Puebloans:
When these regions were first discovered it appears that the inhabitants lived in comfortable houses and cultivated the soil, as they have continued to do up to the present time. Indeed, they are now considered the best horticulturists in the country, furnishing most of the fruits and a large portion of the vegetable supplies that are to be found in the markets. They were until very lately the only people in New Mexico who cultivated the grape. They also maintain at the present time considerable herds of cattle, horses, etc. They are, in short, a remarkably sober and industrious race, conspicuous for morality and honesty, and very little given to quarrelling or dissipation … Josiah Gregg, Commerce of the Prairies: or, The journal of a Santa Fé trader, 1831–1839
Linguistically, the Pima speak an Uto-Aztecan language, connecting them with the Shoshone to the north, Hopi to the east, and the Aztecs to the south (and even further south, since the family is also spoken in Ecuador):
Map of Uto-Aztecan language distribution
The Aztecs, as you probably already know, had a large empire with cities, roads, trade, taxes, etc.
In other words, the Pima were far more technologically advanced than the Karitiana, which suggests that the arrow of conquering here goes from Pima-related people to Karitiana-related people, rather than the other way around.
Now, obviously, the Pima did not travel down to Bolivia, kill a bunch of Karitiana people living in Bolivia, rape their women, and then head back to Arizona. More likely, the ancestors of the Karitiana once lived throughout much of South and Central America, and perhaps even further afield. The ancestors of the Pima then invaded, killing a bunch of the locals and incorporating a few of their women into their tribes. The Karitiana managed to survive in the rainforest due to the rainforest being very difficult to conquer, and the Pima failed to mix with other groups due to being the only guys interested in living in the middle of the Arizona desert.
Those guys in the southern branch of the family (the Na-Dene, which also includes the Chipewyans) are the Navajos and Apache. These languages are speculated to be linked to Siberian languages like Yeniseian.
The Algonquin people (of whom the Ojibwe are part,) come from the North East US and Canada:
Map of Algonquian Language Family distribution
There also exist a couple of languages on the California coast which appear to be related to the Algonquin Family, possibly a case of Survival on the Fringes as a new wave of invaders migrated from the Bering Strait.
The Algonquins appear to have been semi-nomadic semi-horticulturalists. They grew corn and squash and beans, and also moved around hunting game and gathering wild plants as necessary.
Where we see red admixture in Haak’s graph, that means Siberian people. Where we see dark blue + orange + teal, that’s typical European. Most likely this means that the Algonquins in Haak’s data have some recent European ancestors due to a lot of inter-marriage happening over the past few hundred years in their part of the world. (The Chipewyans live in a much more isolated part of the continent.) However, some of that DNA might also have come with them when they migrated to North America years and years ago, due to their ancient Siberian ancestors having merged with an off-shoot of the same groups that modern Europeans are descended from. This is a likely explanation for the Aleuts and Tlingit peoples, whose dark blue and teal patches definitely look similar to those of other Siberian peoples. (Although, interestingly, they lack the red. Maybe the red was a later addition, or just didn’t make it over there in as large quantities.)
The Eskimo I have spoken of before; they appear to have wiped out everyone else in their immediate area. They live around the coastal rim of Alaska and northern Canada.
The Aleuts likely represent some kind of merger between the Eskimo and other Siberian peoples.
My summary interpretation:
Wave One: The Green People. Traces of their DNA appear to be in the Ojibwe, Eskimos, and Chileans, so they may have covered most of North and South America at one time.
Wave Two: The Pink People. They wiped out the vast majority of the Green people throughout North America, but as migration thinned their numbers, they ended up intermarrying instead of killing some of the Greens down in Central and South America.
The Green People only survived in any significant numbers deep in the rainforest, where the Pink People couldn’t reach. These Greens became the Karitiana.
Wave Three: The Brown people. These guys wiped out all of the Pink people in northwest Canada and Alaska, but as migration to the east thinned their numbers, they had to inter-marry with the local Pinks. This mixed group became the Algonquins, while the unmixed Browns became the Chipewyans.
Few Browns managed to push their way south, either because they just haven’t had enough time, or because they aren’t suited to the hotter climate. Either way, most of the Pink People went unconquered to the south, allowing the Pima and their neighbors to flourish.
Wave Four: The Eskimo, who wiped out most of the other people in their area.
Patterns–narratives–are how we understand the world.
Look away from the screen. What do you see? A collection of lines and colors? Or objects?
You can make sense of the light entering your eyes because your brain organizes them into patterns. You recognize that a colon and a parentheses are a face :) You recognize that orange and black stripes mean a tiger is nearby. Sounds coalesce into words and marks scratched in wet clay into epics.
If you spot a tiger every time you go to the watering hole, you notice a pattern–and if you’re lucky, find a new watering hole. If you can’t recognize patterns, chances are good you’ll be eaten by a tiger.
Brains love patterns so much, you can trigger a state of bliss just by repeating patterns to yourself. Former schizophrenics have related to me just how nice schizophrenia can feel, which I admit seems kind of counter-intuitive, but then, I found a pattern in some data today and was so happy as a result that I can see how that might be so.
Suppose I read you numbers at random from some dataset–say, daily rainfall in Helsinki for the past 2000 years. Each number would tell you something about that particular day, but the dataset as a whole would tell you nothing. Random data is just noise. Even if I read the numbers in order, you’d probably hear little more than noise, though if you paid attention, you might start to hear a pattern after a year or two of rainfalls.
But if I made a graph, Helsinki’s rainy Octobers and Novembers–and dry Aprils–would suddenly stand out. We could make graphs of rainfall over years, months, or centuries. We could look for all kinds of patterns–and interesting outliers.
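The month-grouping trick is easy to demonstrate. Here is a minimal sketch using fake, randomly generated rainfall (not real Helsinki data; the seasonal constants are invented for illustration): read one day at a time, the values look like noise, but averaging by month makes the wet autumn and dry spring pop out.

```python
import random
import statistics

random.seed(0)

# Invented seasonal pattern: wet Octobers/Novembers, a dry April.
# These are made-up numbers, not real Helsinki rainfall.
def daily_rainfall(month):
    if month in (10, 11):   # wet autumn
        base = 4.0
    elif month == 4:        # dry spring
        base = 0.5
    else:
        base = 2.0
    return max(0.0, random.gauss(base, 1.0))

# Two years of daily readings, 30 days per month for simplicity.
data = [(month, daily_rainfall(month))
        for _ in range(2)
        for month in range(1, 13)
        for _ in range(30)]

# One value at a time, this is noise; averaged by month, a pattern appears.
monthly_mean = {m: statistics.mean(r for mo, r in data if mo == m)
                for m in range(1, 13)}

wettest = max(monthly_mean, key=monthly_mean.get)
driest = min(monthly_mean, key=monthly_mean.get)
print(wettest, driest)
```

Swap in a real dataset and the same aggregation works; the point is just that aggregation is what turns noise into a visible pattern.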
Once we see patterns, we find meaning.
History is the study of change. An accounting of history without patterns soon devolves into random noise. Names, dates. Names, dates. The narratives give it meaning.
I first really discovered this while trying to research the French Revolution via Wikipedia. Wikipedia tries its darndest not to impart any particular bias to its historical articles, resulting in a lot of names and dates and places, without much that ties it all together. This actually makes them hard to read; after a while my eyes glaze over and my brain starts refusing to process anymore. By contrast, pick up any book on the French Revolution, and you’ll probably discover the author’s central thesis “The peasants made them do it!” or “Crop failures drove them to revolt!” or “System breakdown!” The author takes care to marshal evidence in favor of his thesis, drawing out the patterns for you.
It took only one small book on the French Revolution for it to suddenly make sense. There was a stark difference between my brain’s willingness to follow this author’s train of thought (“The peasants made them do it!”) and my brain’s willingness to follow the Wikipedia’s NPOV articles, even though I did not necessarily agree with the author’s thesis.
To be clear, the Wikipedia is not bad for avoiding POV; many, many theses are completely wrong. You could not even begin to write an article on the French Revolution if you wanted to make an accurate presentation of all the theses people have had on the subject, or even just the major ones. The best thing for the Wikipedia is to try to present factual information, and leave it up to the readers to find their own patterns.
(The “badly written” Wikipedia articles have bias and POV-issues and actually make sense, even if I often disagree with the author’s thesis.)
Much of what I do here on this blog is look for patterns in the data. “Here’s something interesting,” I say. “Can I find any patterns? Anything that might fit this data?” It is all very speculative. I know it is speculative. I hope that you know that I know that I am speculating, and not proclaiming to know the One True Truth.
Ultimately, I wager that a lot of my theories will turn out to be wrong. The real world does not care about patterns nearly so much as our little brains do, and we are prone to seeking out patterns in data even when there really aren’t any. Sometimes shit just happens and it doesn’t really mean anything bigger than the shit that is happening right now. Maybe there is no master plan. But we can’t live without meaning. We must have our patterns to make sense of the world, so our patterns we will have.
While searching for a children’s book about that incident with Teddy Roosevelt and the bear (which you really would think someone would write a kid’s book about,) I decided to rank the importance of historical figures by number of children’s books (not YA) about them in the library database.
The round numbers are estimates, due to searches generally returning a number of irrelevant or duplicate titles that just have an author or title with a similar name to what you’re looking for. With the rarest subjects, I was able to count how many relevant books there were (I decided to exclude, for example, a fictional series with characters named Nick and Tesla, but you might have included them,) but for the guys with multiple hundreds of books, I just subtracted about a quarter of their score. This did not change the rankings, but it does remove some granularity.
The most important guys in the room:
Jesus: 250
Einstein: 150
Columbus: 150
George Washington: 100
Lincoln: 100
Moderately Important:
MLK: 50
Jefferson: 40
Edison: 40
Sacajawea: 30
John Brown (raid on Harper’s Ferry): 30
Rosa Parks: 30
Harriet Tubman: 30
Sojourner Truth: 30
Amelia Earhart: 25
Darwin: 20
Gandhi: 15
Washington Carver (peanuts): 15
Frida Kahlo: 15
Marie Curie: 15
Nelson Mandela: 12
Unimportant:
Isaac Newton: 10
Malcolm X: 9
Botticelli: 6
Teddy Roosevelt: 5
Beyonce: 5
Malala Yousafzai: 5
Mary Terrell (female civil rights activist): 3
Jonas Salk (Polio Vaccine): 3
John Snow (helped eliminate Cholera, but who cares about that?): 1
Tesla: 1
Niels Bohr (father of quantum physics): 0.1 (part of a series.)
Thoughts: This is a winner-take-all economy. The cultural leaders are clearly enshrined on top. Does the library really need 100 books about George Washington? Probably not. Could it use a few more books about Teddy Roosevelt or Niels Bohr? Probably.
The cultural leaders appear to be hanging on to their positions despite modern liberalism; John Lennon is not out-selling Jesus (at least among kids.) Columbus’s numbers were a surprise to me, given that a lot of people really hate him, but his popularity is probably due to the fact that Columbus Day is still celebrated in elementary schools and school kids have to write reports about Columbus. (I wouldn’t be surprised to see Columbus’s numbers shrink quite a bit over the next few decades.)
In the Moderately Important category, we have most of our diversity and civil rights inclusions. MLK might not have risen to the levels of George Washington and Abraham Lincoln (yet), but he’s beaten out Jefferson for third-most-famous American status.
This section most exemplifies how fame is created by cultural elites (aka the Cathedral). Jesus’s popularity isn’t going anywhere anytime soon, but the fact that you know Rosa Parks’s name and not that of thousands of other people who made similar stands against segregation is due simply to a committee deciding that Rosa Parks was more likeable than they were, and so they were going to publicize her case. If someone decided to make an obscure Serbian scientist who used to work for Thomas Edison famous, he might suddenly jump from John Snow-level obscurity to Amelia Earhart fame, though the acquisition of children’s books for the library would obviously lag by a few years. And if someone decides that maybe Teddy Roosevelt isn’t so important anymore, maybe we should talk about some other guys, then Roosevelt can drop pretty quickly from #4 American to the bottom of the list.
At the bottom, we have people who are even less important than Frida Kahlo and Amelia Earhart, like Jonas Salk and John Snow. I know I harp on this a lot, but I consider it a fucking tragedy that the guys who saved the lives of millions of people are less famous than some woman who crashed a plane into the Pacific Ocean.
But humans are not mere action-reaction systems; they have qualia, an inner experience of being.
One of my themes here is the idea that various psychological traits, like anxiety, guilt, depression, or disgust, might not be just random things we feel, but exist for evolutionary reasons. Each of these emotions, when experienced moderately, may have beneficial effects. Guilt (and its cousin, shame,) helps us maintain our social relationships with other people, aiding in the maintenance of large societies. Disgust protects us from disease and helps direct sexual interest at one’s spouse, rather than random people. Anxiety helps people pay attention to crucial, important details, and mild depression may help people concentrate, stay out of trouble, or–very speculatively–have helped our ancestors hibernate during the winter.
In excess, each of these traits is damaging, but a shortage of each trait may also be harmful.
I have commented before on the remarkable statistic that 25% of women are on anti-depressants, and if we exclude women over 60 (and below 20,) the number of women with an “anxiety disorder” jumps to over 30%.
The idea that a full quarter of us are actually mentally ill is simply staggering. I see three potential causes for the statistic:
Doctors prescribe anti-depressants willy-nilly to everyone who asks, whether they’re actually depressed or not;
Something about modern life is making people especially depressed and anxious;
Mental illnesses are side effects of common, beneficial conditions (similar to how sickle cell anemia is a side effect of protection from malaria.)
As you probably already know, sickle cell anemia is a genetic mutation that protects carriers from malaria. Imagine a population where 100% of people are sickle cell carriers–that is, they have one mutated gene, and one regular gene. The next generation in this population will be roughly 25% people who have two regular genes (and so die of malaria,) 50% of people who have one sickle cell and one regular gene (and so are protected,) and 25% of people will have two sickle cell genes and so die of sickle cell anemia. (I’m sure this is a very simplified scenario.)
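The 25/50/25 split falls straight out of the Mendelian cross. A few lines of Python (a sketch of the simplified scenario above, with allele letters chosen arbitrarily) enumerate the four equally likely gamete combinations for two carrier parents:

```python
from collections import Counter
from itertools import product

# 'A' = normal allele, 'S' = sickle allele. Each carrier (AS) parent
# passes on one allele at random, so the four combinations below are
# equally likely for each child. Sorting makes 'SA' and 'AS' the same.
children = Counter(''.join(sorted(pair)) for pair in product('AS', 'AS'))

for genotype, count in sorted(children.items()):
    print(genotype, count / 4)
# AA (no protection, dies of malaria): 1/4
# AS (protected carrier):              1/2
# SS (sickle cell anemia):             1/4
```

The same enumeration applies to the Rh cross earlier: any simple one-locus, two-allele pairing produces its offspring ratios this way.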
So I consider it technically possible for 25% of people to suffer a pathological genetic condition, but unlikely–malaria is a particularly ruthless killer compared to being too cheerful.
Skipping to the point, I think there’s a little of all three going on. Each of us probably has some kind of personality “set point” that is basically determined by some combination of genetics, environmental assaults, and childhood experiences. People deviate from their set points due to random stuff that happens in their lives, (job promotions, visits from friends, car accidents, etc.,) but the way they respond to adversity and the mood they tend to return to afterwards is largely determined by their “set point.” This is all a fancy way of saying that people have personalities.
The influence of random chance on these genetic/environmental factors suggests that there should be variation in people’s emotional set points–we should see that some people are more prone to anxiety, some less prone, and some of average anxiousness.
Please note that this is a statistical should, in the same sense that, “If people are exposed to asbestos, some of them should get cancer,” not a moral should, as in, “If someone gives you a gift, you should send a thank-you note.”
Natural variation in a trait does not automatically imply pathology, but being more anxious or depressive or guilt-ridden than others can be highly unpleasant. I see nothing wrong, a priori, with people doing things that make their lives more pleasant and manageable (and don’t hurt others); this is, after all, why I enjoy a cup of coffee every morning. If you are a better, happier, more productive person with medication (or without it,) then carry on; this post is not intended as a critique of anyone’s personal mental health management, nor a suggestion for how to take care of your mental health.
Our medical/psychological health system, however, operates on the assumption that medications are for pathologies only. There is no form to fill out that says, “Patient would like anti-anxiety drugs in order to live a fuller, more productive life.”
That said, all of these emotions are obviously responses to actual stuff that happens in real life, and if 25% of women are coming down with depression or anxiety disorders, I think we should critically examine whether anxiety and depression are really the disease we need to be treating, or the body’s responses to some external threat.
In a mixed group, women become quieter, less assertive, and more compliant. This deference is shown only to men and not to other women in the group. A related phenomenon is the sex gap in self-esteem: women tend to feel less self-esteem in all social settings. The gap begins at puberty and is greatest in the 15-18 age range (Hopcroft, 2009).
If more women enter the workforce–either because they think they ought to or because circumstances force them to–and the workforce triggers depression, then as the percent of women formally employed goes up, we should see a parallel rise in mental illness rates among women. Just as Adderall and Ritalin help little boys conform to the requirements of modern classrooms, Prozac and Lithium help women cope with the stress of employment.
As we discussed yesterday, fever is not a disease, but part of your body’s system for re-asserting homeostasis by killing disease microbes and making it more difficult for them to reproduce. Extreme fevers are an over-reaction and can kill you, but a normal fever below 104 degrees or so is merely unpleasant and should be allowed to do its work of making you better. Treating a normal fever (trying to lower it) interferes with the body’s ability to fight the disease and results in longer sicknesses.
Likewise, these sorts of emotions, while definitely unpleasant, may serve some real purpose.
We humans are social beings (and political animals.) We do not exist on our own; historically, loneliness was not merely unpleasant, but a death sentence. Humans everywhere live in communities and depend on each other for survival. Without refrigeration or modern storage methods, saving food was difficult. (Unless you were an Eskimo.) If you managed to kill a deer while on your own, chances are you couldn’t eat it all before it began to rot, and then your chances of killing another deer before you started getting seriously hungry were low. But if you share your deer with your tribesmates, none of the deer goes to waste, and if they share their deer with yours, you are far less likely to go hungry.
If you end up alienated from the rest of your tribe, there’s a good chance you’ll die. It doesn’t matter if they were wrong and you were right; it doesn’t matter if they were jerks and you were the nicest person ever. If you can’t depend on them for food (and mates!) you’re dead. This is when your emotions kick in.
People complain a lot that emotions are irrational. Yes, they are. They’re probably supposed to be. There is nothing “logical” or “rational” about feeling bad because someone is mad at you over something they did wrong! And yet it happens. Not because it is logical, but because being part of the tribe is more important than who did what to whom. Your emotions exist to keep you alive, not to prove rightness or wrongness.
This is, of course, an oversimplification. Men and women have been subject to different evolutionary pressures, for example. But this is close enough for the purposes of the current conversation.
If modern people are coming down with mental illnesses at astonishing rates, then maybe there is something about modern life that is making people ill. If so, treating the symptoms may make life more bearable for people while they are subject to the disease, but still does not fundamentally address whatever it is that is making them sick in the first place.
It is my own opinion that modern life is pathological, not (in most cases,) people’s reactions to it. Modern life is pathological because it is new and therefore you aren’t adapted to it. Your ancestors have probably only lived in cities of millions of people for a few generations at most (chances are good that at least one of your great-grandparents was a farmer, if not all of them.) Naturescapes are calming and peaceful; cities noisy, crowded, and full of pollution. There is some reason why schizophrenics are found in cities and not on farms. This doesn’t mean that we should just throw out cities, but it does mean we should be thoughtful about them and their effects.
People seem to do best, emotionally, when they have the support of their kin, some degree of ethnic or national pride, economic and physical security, attend religious services, and avoid crowded cities. (Here I am, an atheist, recommending church for people.) The knowledge you are at peace with your tribe and your tribe has your back seems almost entirely absent from most people’s modern lives; instead, people are increasingly pushed into environments where they have no tribe and most people they encounter in daily life have no connection to them. Indeed, tribalism and city living don’t seem to get along very well.
To return to healthy lives, we may need to re-think the details of modernity.
Politics
Philosophically and politically, I am a great believer in moderation and virtue as the ethical, conscious application of homeostatic systems to the self and to organizations that exist for the sake of humans. Please understand that this is not moderation in the conventional sense of “sometimes I like the Republicans and sometimes I like the Democrats,” but the self-moderation necessary for bodily homeostasis reflected at the social/organizational/national level.
For example, I have posted a bit on the dangers of mass immigration, but this is not a call to close the borders and allow no one in. Rather, I suspect that there is an optimal amount–and kind–of immigration that benefits a community (and this optimal quantity will depend on various features of the community itself, like size and resources.) Thus, each community should aim for its optimal level. But since virtually no one–certainly no one in a position of influence–advocates for zero immigration, I don’t devote much time to writing against it; it is only mass immigration that is getting pushed on us, and thus mass immigration that I respond to.
Similarly, there is probably an optimal level of communal genetic diversity. Too low, and inbreeding results. Too high, and fetuses miscarry due to incompatible genes. (Rh- mothers have difficulty carrying Rh+ fetuses, for example, because their immune systems identify the fetus’s blood as foreign and therefore attack it, killing the fetus.) As in agriculture, monocultures are at great risk of getting wiped out by disease; genetic heterogeneity helps ensure that some members of a population can survive a plague. Homogeneity helps people get along with their neighbors, but too much may lead to everyone thinking through problems in similar ways. New ideas and novel ways of attacking problems often come from people who are outliers in some way, including genetics.
There is a lot of talk ’round these parts that basically blames all the crimes of modern civilization on females. Obviously I have a certain bias against such arguments–I of course prefer to believe that women are superbly competent at all things, though I do not wish to stake the functioning of civilization on that assumption. If women are good at math, they will do math; if they are good at leading, they will lead. A society that tries to force women into professions they are not inclined to is out of kilter; likewise, so is a society where women are forced out of fields they are good at. Ultimately, I care about my doctor’s competence, not their gender.
In a properly balanced society, male and female personalities complement each other, contributing to the group’s long-term survival.
Women are not accidents of nature; they are as they are because their personalities succeeded where women with different personalities did not. Women have a strong urge to be compassionate and nurturing toward others, maintain social relations, and care for those in need of help. These instincts have, for thousands of years, helped keep their families alive.
When the masculine element becomes too strong, society becomes too aggressive. Crime goes up; unwinnable wars are waged; people are left to die. When the feminine element becomes too strong, society becomes too passive; invasions go unresisted; welfare spending becomes unsustainable. Society can’t solve this problem by continuing to give both sides everything they want, (this is likely to be economically disastrous,) but must actually find a way to direct them and curb their excesses.
I remember an article on the now-defunct neuropolitics (now that I think of it, the Wayback Machine probably has it somewhere,) on an experiment where groups with varying numbers of “liberals” and “conservatives” had to work together to accomplish tasks. The “conservatives” tended to solve their problems by creating hierarchies that organized their labor, with the leader/s giving everyone specific tasks. The “liberals” solved their problems by incorporating new members until they had enough people to solve specific tasks. The groups that performed best, overall, were those that had a mix of ideologies, allowing them to both make hierarchical structures to organize their labor and incorporate new members when needed. I don’t remember much else of the article, nor did I read the original study, so I don’t know what exactly the tasks were, or how reliable this study really was, but the basic idea of it is appealing: organize when necessary; form alliances when necessary. A good leader recognizes the skills of different people in their group and uses their authority to direct the best use of these skills.
Our current society greatly lacks in this kind of coherent, organizing direction. Most communities have very little in the way of leadership–moral, spiritual, philosophical, or material–and our society seems constantly intent on attacking and tearing down any kind of hierarchies, even those based on pure skill and competence. Likewise, much of what passes for “leadership” is people demanding that you do what they say, not demonstrating any kind of competence. But when we do find competent leaders, we would do well to let them lead.
Disease is the enemy of civilization. Wherever civilization arises, so does disease; many of our greatest triumphs have been the defeat of disease.
Homeostasis is the idea that certain systems are designed to self-correct when things go wrong–for example, when you get hot, you sweat; when you get cold, you shiver. Both actions represent your body’s natural, automatic process for keeping your body temperature within a proper range.
All living things are homeostatic systems, otherwise they could not control the effects of entropy and would fall apart. (When this happens, we call it death):
Non-living things, like robots and corporations, can also be homeostatic–by hiring new employees when old ones leave, or correcting themselves when they start to fall:
Like organisms, organizations that are not homeostatic will tend to fall apart.
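The sweat/shiver loop can be caricatured as a tiny negative-feedback simulation. All the constants below are invented for illustration; the structure is the general one: the environment pulls a variable away from its set point, and the feedback pushes it back.

```python
SET_POINT = 37.0   # target body temperature, degrees C
DRIFT = 0.1        # how strongly the environment pulls temperature away
CORRECTION = 0.9   # how strongly the feedback pushes it back

def step(temp, ambient):
    temp += DRIFT * (ambient - temp)         # heat loss to a cold day
    temp += CORRECTION * (SET_POINT - temp)  # shivering: the feedback response
    return temp

temp = SET_POINT
for _ in range(100):
    temp = step(temp, ambient=10.0)

# The feedback holds temperature near the set point despite the cold;
# delete the correction line and temp falls all the way to ambient.
print(round(temp, 2))
```

The failure modes discussed below map loosely onto the parameters: inadequate homeostasis is a CORRECTION too weak to hold the line, and over-aggressive homeostasis is a response so strong it overshoots and does its own damage.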
For this post, we will consider four important forms of homeostasis:
Normal homeostasis: the normal feedback loops that keep the body (or organization) in its normal state under normal conditions.
Defensive homeostasis: feedback loops that are activated to defend the body against severe harm, such as disease, and reassert normal homeostasis.
Inadequate homeostasis: a body that cannot maintain or reassert normal homeostasis.
Over-aggressive homeostasis: an excessive defensive response that harms the self.
Normal Homeostasis
Normal homeostasis creates (and depends on) moderate, temperate behavior. Mundanely, when you have not eaten in a while, you grow hungry and so eat; when you have had enough, you feel satiated and so cease. When you have not slept in a long while, you grow tired and head to bed; when you have slept enough, you wake.
Obesity and starvation are both symptoms of normal homeostasis not operating as it should. They can be caused by environmental disorder (e.g., crop failures,) or internal disorders (pituitary tumors can cause weight gain,) or even just the individual’s psyche (stress renders some people unable to eat, while others cope with chocolate.)
If your body is forced out of its normal homeostatic rhythms, things begin to degenerate. After too long without sleep, (perhaps due to too many final exams, an all-night TV binge, or too many 5-hour energy drinks,) your body loses its ability to thermo-regulate; the hungry, cold, and malnourished lose their ability to fend off disease and succumb to pneumonia. Even something as obviously beneficial as hygiene can go too far–too much washing deprives the skin of its natural, protective layer of oils and beneficial microbes, leaving it open to invasion and colonization by other, less friendly microbes, like skin-eating fungi. Most of this seems obvious, but it took people a rather long time to figure out things like, “eating a 100% corn diet is bad for you.”
A body that is not in tune quickly degrades and becomes easy prey to sickness and disease; thus moderation is upheld as a great virtue and excesses as vice. A body that is properly in tune–balanced in diet, temperate in consumption, given enough exercise and rest, and nourished socially and morally–is a body that is strong, healthy, and able to deal with most of life’s vicissitudes.
(Gut bacteria are an interesting case of normal homeostasis in action. Antibiotics, while obviously beneficial in many cases, also kill much of the body’s natural gut bacteria, leading to a variety of unpleasant side effects [mostly diarrhea,] showing that too little gut bacteria is problematic. But the idea that our gut bacteria are entirely harmless is probably an over-simplification; while being effectively “along for the ride” means that their interests align roughly with ours, that is no guarantee that they will always be well-behaved. Too much gut bacteria may also be a problem. One theory I have read on why people need to sleep–and why we feel cruddy when we haven’t slept–is that our gut bacteria tend to be active during the day, which produces waste, and the buildup of bacterial waste in your bloodstream makes you feel bad. While you sleep, your body temperature drops, slowing down the bacteria and giving you a chance to clean out your systems.)
The homeostasis theory of disease–the idea that an unbalanced body loses its ability to fend off diseases and so becomes ill–should not be seen as competing with the Germ Theory of Disease, but as complementing it. Intellectually, HTD has been around for a long time, informing the Greek medical treatises on the “four humours,” traditional Chinese medical ideas about the effects of “hot” and “cold” foods, the general principle of Yin and Yang, many primitive notions of magic, and modern notions about probiotics. HTD has led to some obviously (in retrospect) bad ideas, like bleeding patients or dosing them with substances that weren’t exactly non-toxic. But it has also led to plenty of decent ones: that you should eat a “balanced” diet, enjoy life’s pleasures in moderation, and give cholera sufferers lots of water.
Defensive Homeostasis
Defensive homeostasis is an extreme version of normal homeostasis. Your body is always defending itself against pathogens and injuries, but some assaults are more noticeable than others.
One of the most miserable sicknesses I have endured happened after eating raw vegetables while on vacation; I had washed them, but obviously not enough. Not only did my stomach hurt, but every part of me; even my skin hurt. My body, reasoning that something was deeply wrong, did its mighty best to eliminate any ingested toxins by every route available, profuse sweat and tears included.
Luckily, it was all over by morning, and I was left with a deep gratitude toward my body for the steps it had taken–however extreme–to make me well again.
It is important to distinguish between the effects of the sickness itself and the effects of the homeostatic system attempting to cure it–a distinction people muddle all the time. In my case, the sickness made me feel ill by flooding my body with pathogens and their resultant toxins. The vomiting felt awful, but the vomiting was not the sickness; vomiting was my body’s attempt to rid itself of the pathogens. Taking steps to prevent the vomiting, say, by taking an anti-nausea medication, would have let the pathogens remain inside of me, doing more harm.
(Of course, it is crucial to make sure that a vomiting person does not become dehydrated.)
To use a more general example, fevers are your body’s way of killing viruses and slowing their reproduction–just as we kill microbes by cooking our food. Fevers feel unpleasant, but they are not diseases. Using medication to lower mild fevers may actually increase mortality [PDF] by interfering with the body’s ability to kill the disease. Quoting from the PDF:
“…children with chickenpox who are treated with acetaminophen have been shown to have a longer time to total crusting of lesions than do placebo-treated control subjects [15]. In addition, adults with rhinovirus infections exhibit a longer duration of viral shedding and increased nasal signs and symptoms when treated with antipyretic medications [16].”
Additionally, artificially depressing how sick you feel increases the likelihood of getting out of bed and moving around, which in turn increases the likelihood of spreading your sickness to other people.
Fevers of 105 degrees F or above are excessive, can genuinely harm you, and should be treated. But a fever of 102 should be allowed to do its work.
Likewise, in the case of cholera, the most effective treatment is to keep the sufferer hydrated (or re-hydrate them) until their body can wipe out the disease. (Cholera basically makes you lose all of your bodily fluids and die of dehydration.) It is easy to underestimate just how much water the sufferer has lost; according to Wikipedia, “Ten percent of a person’s body weight in fluid may need to be given in the first two to four hours.[12]” Keep in mind the need to replenish potassium levels while you re-hydrate; if you don’t have any special re-hydration drinks, you can just boil 1 liter of water and add 1/2 teaspoon of salt, 6 teaspoons of sugar, and 1 mashed banana; in a pinch, probably any clean beverage is better than nothing. Untreated, 50-90% of cholera victims die; with rehydration, the death rate amazingly drops below 1%:
“In untreated cases the death rate is high, averaging 50%, and as high as 90% in epidemics, but with effective treatment the death rate is less than 1%. The intravenous and oral replacement of body fluids and essential electrolytes and the restoration of kidney function are more important in therapy than the administration of antibacterial drugs.”
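The rehydration arithmetic above is simple enough to do in your head, but worth making explicit. A back-of-the-envelope sketch using the 10%-of-body-weight figure and the improvised recipe quoted above (obviously not medical advice):

```python
# Back-of-the-envelope rehydration arithmetic, using the figures quoted
# above: ~10% of body weight in fluid in the first two to four hours.
# A sketch, not medical advice.

def initial_rehydration_liters(body_weight_kg):
    """Fluid lost is roughly 10% of body weight; 1 kg of lost
    fluid corresponds to about 1 liter of water."""
    return body_weight_kg / 10

def ors_recipe(liters):
    """Scale the improvised oral rehydration recipe from the text:
    per liter of boiled water, 1/2 tsp salt, 6 tsp sugar, 1 mashed banana."""
    return {"salt_tsp": 0.5 * liters,
            "sugar_tsp": 6 * liters,
            "bananas": liters}

# A 70 kg adult may need on the order of 7 liters in the first few hours:
liters = initial_rehydration_liters(70)   # 7.0
recipe = ors_recipe(liters)               # 3.5 tsp salt, 42 tsp sugar, 7 bananas
```

Seven liters is a startling amount of fluid–which is exactly the point of the Wikipedia quote: it is easy to underestimate just how much the sufferer has lost.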
This is super important, so I’m going to repeat it: Don’t confuse the effects of sickness and the effects of the homeostatic system attempting to cure itself. This goes for organizations and societies, too.
Unfortunately, much of our economic theory is not based on the idea that societies–or the Earth–trend toward homeostasis, but on the assumption of infinite growth. The economic proponents of open borders, for example, basically seem to think that there are no theoretical limits to the number of people who can move to Europe and the US and take up a Western lifestyle.
Pension plans (and Social Security) were also designed with infinite growth in mind. Now that TFRs have dropped below replacement across the developed world, many countries are faced with the horrifying prospect that old people may not be able to depend for their retirement on the incomes of children they didn’t create. I suppose the solution to such a problem is either to let only people with 3+ children have pensions, or to design a pension system that doesn’t require a never-ending process of population expansion, because the planet cannot hold infinite numbers of people.
Declining TFR is not a disease, it is a symptom, most likely of countries where ordinary people struggle to afford children. The fertility rate will pick back up once the population has shrunk enough that there are enough resources per person–including space–to make having children an attractive option.
But to those obsessively focused on their unsustainable pensions, low TFR is a disease, and it has to be fixed by bringing in more people, preferably people who will have lots of children.
“Japan must import more people!” the NY Times constantly screams. “They don’t have enough to fill the pensions!”
Just as treating a fever inhibits your body’s ability to fight the real disease, so importing people to combat a low TFR inhibits your country’s ability to return to a proper ratio of resource to people, making the problem much, much worse.
Mass immigration => bigger labor market => lower wages => lower TFR => underfunded pensions => demands for more immigrants.
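The claimed loop can be caricatured in a few lines of code. Every coefficient below is invented purely to show that the structure described is a positive feedback loop, not to model any real economy:

```python
# Toy caricature of the feedback loop sketched above. Every coefficient
# is invented; this demonstrates only that the described structure is a
# positive feedback loop, not that the coefficients reflect reality.

def one_cycle(shortfall):
    """One pass around the loop, in arbitrary made-up units."""
    immigrants = 1.5 * shortfall   # "import more people!"
    wage_drop = 0.4 * immigrants   # bigger labor market => lower wages
    tfr_drop = 0.5 * wage_drop     # lower wages => lower TFR
    return shortfall + tfr_drop    # fewer future workers => bigger shortfall

shortfall = 1.0
trajectory = [shortfall]
for _ in range(5):
    shortfall = one_cycle(shortfall)
    trajectory.append(shortfall)
```

With these made-up numbers, each cycle multiplies the shortfall by 1.3, so it compounds instead of settling–the signature of positive feedback, and the opposite of the self-correcting loops described earlier.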
Inadequate Homeostasis
Inadequate and over-active homeostatic systems are pathological conditions that render an organism unable to respond appropriately to changing conditions and reassert normal homeostasis. For example, people with a certain mutation in the ITPR2 gene cannot sweat, increasing their chances of dangerously overheating. People with AIDS, of course, have deficient immune systems, because the virus specifically attacks immune cells.
In biological systems, the inability to maintain or reassert homeostasis is most likely a result of damage due to mutation or infection. In a non-organism, it is more likely that the organization or entity was simply created with inadequate homeostatic systems.
A mundane example is a city that has expanded and so can no longer handle the amount of traffic, trash, and rainwater run-off it produces. The original systems, such as sewers, roads, and trash collection, could handle the city’s normal variations back when they were designed, but no longer. Traffic jams, flooding, and giant piles of trash ensue.
At this point, a city has two choices: increase systemic complexity (i.e., upgrade the infrastructure,) or decrease the amount of waste it produces by people dying or moving away.
Rome had obviously been in decline since the late 100s AD, probably due to the Antonine Plague–most plagues are, of course, homeostasis violently reasserting itself as a result of human societies becoming too big for their hygiene systems. In the 400s, the Roman empire collapsed, leading to sieges, famines, and violent barbarian invasions, and an end to the tax revenues and supply networks that had formerly supported the city.
By 752, Rome had dropped from 1.65 million people to 40,000 people, but the city reached its true nadir in 1347, when plague reduced the population to 17,000, which is even lower than the estimates for 800 BC. Rome would not return to its previous high until 1850, though if I know anything about near-vertical lines on graphs, it’s that they don’t go up forever. When the collapse begins again, I wonder if the city will return to its 1000s population, or stabilize at some new level.
I’ve spoken before of La Griffe du Lion‘s Smart Fraction Theory, which posits that a country’s GDP correlates with the percent of its population with (verbal) IQs over 120. These are the people who can plan and maintain complex systems. This suggests that, unless IQs increase over time, countries may have a natural complexity limit they can’t pass, (though many countries may not be operating at their complexity limits.)
A different kind of inadequate homeostasis is mission creep, when organizations start seeing it as their job to do more and more things outside their original mandate, as when the Sierra Club starts championing SJW causes; in these cases, the organization lacks proper feedback mechanisms to keep itself on-task. Eventually, like MTV, the organization loses sight entirely of its original purpose (though to be fair, MTV still exists, so its strategy hasn’t been unsuccessful.)
Over-Active Homeostasis
Allergies and auto-immune disorders are classic examples of over-active homeostatic systems. Allergies happen when the body responds to normal stimuli like pollen or food as though they were pathogens; auto-immune disorders involve the immune system accidentally attacking the body’s own cells instead of pathogens.
Millions of years of evolution have equipped our bodies with self-correcting systems to keep us functioning, so that human pathologies are relatively easy to identify. Organizations, however, have endured far fewer years of evolutionary pressure, so their homeostatic systems are much cruder and more likely to fail. We can understand biological pathologies fairly well, but often fail to identify organizational pathologies entirely; even when we do have some sense* that things are definitely wrong, it’s hard to say exactly what, much less identify a coherent plan to fix it and then convince other people to actually do it.
*or perhaps in your case, dear reader, a definite sense
For organizations to continue working, they need adequate homeostatic systems to keep them on track and prevent both under- and over-reactions. The US Constitution, for example, establishes a system of “checks and balances” and “separate powers” assigned to the executive, legislative, and judicial branches, not to mention the federal, state, and individual levels (via voting and citizen juries.) For all its flaws, this system has managed to basically keep going for over 200 years, making it one of the oldest systems of continuous governance in the world (most of the world’s governments were established following the breakup of colonial empires and the Soviet Union), but this system probably needs revision over time to keep it functioning. (We can further discuss a variety of ways to keep systems functional elsewhere, but Slate Star Codex’s post on Why don’t Whales get Cancer? [basically, the theory is that whales are so big that their cancers get cancer and kill themselves before they kill the whale] seems relevant.)
All human civilization depends on homeostatic systems to keep everyone in them alive. We may think of civilization as order, but it is not perfect order. Perfect order is a crystal; perfect order is absolute zero. It is not alive; it does not change, move, or adapt. Life is a braid in spacetime; civilization is homeostatic.
Simply put, European cities prior to the installation of underground sewers and water purification plants were disgusting, filth-ridden cesspools where the average citizen stood an astronomical chance of being felled by fecal-borne diseases. How the cities got to be so revolting is beyond me–it may just be a side effect of living in any kind of city before the invention of effective sewers. Nevertheless, European city dwellers drank their own feces until everyone started catching cholera. (Not to mention E. coli, smallpox, syphilis, typhus, tuberculosis, measles, dysentery, Bubonic Plague, gonorrhea, leprosy, malaria, etc.)
The average superstitious “primitive” knows that dead bodies contain mystical evil contamination properties, and that touching rotting carcasses can infect you with magical death particles that will then kill you (or if you are a witch, your intended victims,) but Europeans were too smart for such nonsense; Ignaz Semmelweis, the guy who insisted that doctors were killing mothers by infecting them with corpse particles by not washing their hands between autopsying dead bodies and delivering babies, was hauled off to an insane asylum and beaten by the guards; he died of his injuries two weeks later.
The women, of course, had figured out that some hospitals murdered their patients and some hospitals did not; the women begged not to be sent to the patient-murdering hospitals, but such opinions were, again, mere superstitions that the educated classes knew to ignore.
It is amazing what man finds himself suddenly unable to comprehend so long as his incomprehension is necessary for making money, whether it be the amount of food necessary to keep a child from starving or the fact that one should not wallow in feces.
Forgive me my vitriol, but there are few things I hate worse than disease, and those who willfully spread death and suffering should be dragged into the desert and shot.
Cleanliness is next to Godliness.
Anyway, back to our story. The much-beleaguered “Dark Ages” of Medieval Europe were actually a time of relatively few diseases, largely because the population was too low for much major disease transmission, but as the trade routes expanded and cities grew, epidemic after epidemic swept the continent. The Black Death came in 1346, carrying off 75 to 200 million people, or 30-60% of the population. According to Wikipedia, “Before 1350, there were about 170,000 settlements in Germany, and this was reduced by nearly 40,000 by 1450.” The Black Plague would not disappear from Europe until the 1700s, though it returned again around 1900–infecting San Francisco at the same time–in the little-known “Third Plague” outbreak that killed approximately 15 million people, (most of them in India and China,) and officially ended in 1959.
(BTW, rodents throughout much of the world, including America, still harbor plague-bearing fleas which do actually still give people the plague, so be cautious about contact with wild rodents or their carcasses, and if you think you have been infected, get to a hospital immediately because modern medicine can generally cure it.)
Toward the end of the 1700s, smallpox killed about 400,000 Europeans per year, wiping out 20-60% of those it infected.
Cholera spreads via the contamination of drinking water with cholera-laden diarrhea. Prevention is simple: don’t shit in the drinking water. If you can’t convince people not to shit in the water supply, then boil, chlorinate, sterilize, filter, or do whatever it takes to get your water clean.
In 1832, cholera struck the UK, killing 53,000 people; France lost 100,000. In 1854, epidemiologist John Snow risked his life to track the cholera outbreak in Soho, London. His work resulted in one of history’s most important maps:
Each black line represents a death from cholera.
The medical profession of Snow’s day believed that cholera was spread through bad air–miasmas–and that Snow was a madman for being anywhere near air breathed out by cholera sufferers. Snow’s map not only showed that the outbreak was concentrated around one water source, (the PUMP in the center of the map,) but also showed one building on Broad street that had been mysteriously spared the contagion, suffering zero deaths: the brewery.
The brewery’s workers did not drink unadulterated water from the pump; they were drinking beer–breakfast, lunch, and dinner. Drinking nothing but beer might sound like a bad strategy, especially if you need to drive anywhere, but beer has a definite advantage over water: the brewing process–boiling the wort, plus the alcohol produced by fermentation–kills or suppresses pathogens.
It wasn’t until 1866 that the establishment finally started admitting the unpleasant truth that people were catching cholera because they were drinking poop water, but since then, John Snow’s work has saved the lives of millions of people.
Good luck finding anyone who remembers Snow’s name today–much less Semmelweis’s–but virtually every school child in America knows about Amelia Earhart, a woman whose claim to fame is that she failed to cross the Pacific Ocean in a plane. (Sorry, I was looking at children’s biographies today, and Amelia Earhart remains one of my pet peeves in the category of “Why would I try to inspire girls via failure?”)
But that is all beside the point, which is simply that Europeans who drank lots of beer lived, while Europeans who drank water died. This is the sort of thing that can exert a pretty strong selective pressure on people to drink lots of beer.
Meanwhile, Back in America…
While Americans were not immune to European diseases, lower population density made it harder for epidemics to spread. The same plague that killed some 15 million people in China and India killed a mere couple hundred in San Francisco, and appears to have never killed significant numbers in other states.
Low population density meant, among other things, far less excrement in the water. American water was probably far less contaminated than European water, and so Americans had undergone much less selective pressure to drink nothing but beer.
Many American religious groups took a dim view of alcohol. The Puritans did not ban alcohol, but believed it should be drunk in moderation and looked down on drunkenness. The Methodists, another Protestant group that broke away from the Anglican Church in the late 1700s and spread swiftly in America, were against alcohol from their start. Methodist ministers were to drink chiefly water, and by the mid-1880s, they were using “unfermented wine” for their sacraments. The Presbyterians began spreading the anti-alcohol message during the Second Great Awakening, and by 1879, Catherine Booth, co-founder of the Salvation Army, claimed that in America, “almost every [Protestant] Christian minister has become an abstainer.” (source) Even today, many Southern Baptists, Mormons, and Seventh Day Adventists abstain entirely from alcohol, the Mormons apparently going so far as to use water instead of wine in their sacraments.
Temperance movements also existed in Europe and other European colonies, but never reached the same heights as they did in the US. Simply put, where the water was bad, poor people could not afford to drink non-fermented beverages. Where the water was pure, people could cast drinking it–rather than alcohol–as a necessary piece of salvation.
As American cities filled with poor, desperate foreigners fleeing the famine and filth of Europe, their penchant for violent outbursts following over-indulgence in alcohol was not lost on their new neighbors, and so Prohibition’s coalition began to form: women, who were most often on the receiving end of drunken violence; the Ku Klux Klan, which had it out for foreigners generally and Papists especially; and the Protestant ministers, who were opposed to both alcoholism and Papism.
The Germans were never considered as problematic as the Irish, being more likely to be employed and less likely to be engaged in drunken crime, but they held themselves apart from the rest of society, living in their own communities, joining German-specific social clubs, and still speaking German instead of English, which did not necessarily endear them to their neighbors.
Prohibition was opposed primarily by wealthy Germans, (especially the brewers among them;) Episcopalians, (who were afraid their sacramental wine would be banned;) and Catholics. The breweries also campaigned against Women’s Suffrage, on the grounds that pretty much all of the suffragettes were calling for Prohibition.
WWI broke the German community by making it suddenly a very bad idea to be publicly German, and people decided that using American grain to brew German beer instead of sending that grain to feed the fighting men on the front lines was very unpatriotic indeed. President Wilson championed the income tax, which allowed the Federal Government to run on something other than alcohol taxes, women received the right to vote, and Prohibition became the law of the land–at least until 1933, when everyone decided it just wasn’t working out so well.
But by that time, the drinking water problem had been mostly worked out, so people at least had a choice of beverages they could safely and legally imbibe.