Anthropology Friday: Japan pt 1

Sidney Lewis Gulick, 1919

Welcome to Anthropology Friday. Today we’ll be reading Sidney L. Gulick’s Evolution of the Japanese, Social and Psychic, published in 1903.

Gulick, son and grandson of missionaries, born in the Marshall Islands, received degrees from Dartmouth, Oberlin, and Yale, and was ordained as a Congregational (i.e., Puritan) minister. His own missionary work took him to Japan in 1888, where he lived for 25 years, observing first-hand many of the remarkable changes overtaking that nation.

In writing this book, Gulick doubtless had several objectives, among them describing and demystifying Japan–a nation about which very little reliable information existed and which was undergoing an incredible transformation–and refuting the notion of an essential racial character of “Asians” (an evolutionist notion that respectable American Christians forcefully opposed.)

After returning to the US, Gulick campaigned for world peace and tried hard to improve American/Japanese relations. Unfortunately, he died in 1945; the state of the world must have saddened him deeply as he passed away. It’s a pity he didn’t live long enough to see America and Japan become friends.

Japan, in case you are unfamiliar with its history, was largely closed to the rest of the world from about 1635 through 1853. During this time it developed in relative isolation, cut off from trade and technological developments going on in the rest of the world. In 1853, the Japanese appeared “backward” to the rest of the world because of their relatively low level of technological development, but this was largely an illusion–once exposed to the outside world, Japan industrialized at an incredible rate. (Probably the fastest industrialization undertaken by a native population without an outside force telling them what to do.)

But let’s let Gulick tell the story (quotes are in “” instead of blockquotes, as usual):

“Seldom, perhaps never, has the civilized world so suddenly and completely reversed an estimate of a nation as it has that with reference to Japan. Before the recent war, to the majority even of fairly educated men, Japan was little more than a name for a few small islands somewhere near China, whose people were peculiar and interesting. To-day there is probably not a man, or woman, or child attending school in any part of the civilized world, who does not know the main facts about the recent war: how the small country and the men of small stature, sarcastically described by their foes as “Wojen,” pygmy, attacked the army and navy of a country ten times their size.

“Such a universal change of opinion regarding a nation, especially regarding one so remote from the centers of Western civilization as Japan, could not have taken place in any previous generation. The telegraph, the daily paper, the intelligent reporters and writers of books and magazine articles, the rapid steam travel and the many travelers—all these have made possible this sudden acquisition of knowledge and startling reversal of opinion. …

“In important ways, therefore, Japan seems to be contradicting our theories of national growth. We have thought that no “heathen” nation could possibly gain, much less wield, unaided by Westerners, the forces of civilized Christendom. We have likewise held that national growth is a slow process, a gradual evolution, extending over scores and centuries of years. In both respects our theories seem to be at fault. This “little nation of little people,” which we have been so ready to condemn as “heathen” and “uncivilized,” and thus to despise, or to ignore, has in a single generation leaped into the forefront of the world’s attention.”

History

“How many of the stories of the Kojiki (written in 712 A.D.) and Nihongi (720 A.D.) are to be accepted is still a matter of dispute among scholars. Certain it is, however, that Japanese early history is veiled in a mythology which seems to center about three prominent points: Kyushu, in the south; Yamato, in the east central, and Izumo in the west central region. This mythological history narrates the circumstances of the victory of the southern descendants of the gods over the two central regions. And it has been conjectured that these three centers represent three waves of migration that brought the ancestors of the present inhabitants of Japan to these shores. The supposition is that they came quite independently and began their conflicts only after long periods of residence and multiplication.

“Though this early record is largely mythological, tradition shows us the progenitors of the modern Japanese people as conquerors from the west and south who drove the aborigines before them and gradually took possession of the entire land. That these conquerors were not all of the same stock is proved by the physical appearance of the Japanese to-day, and by their language. Through these the student traces an early mixture of races—the Malay, the Mongolian, and the Ural-Altaic. Whether the early crossing of these races bears vital relation to the plasticity of the Japanese is a question which tempts the scholar. …

“The national governmental system was materially affected by the need, throughout many centuries, of systematic methods of defense against the Ainu. The rise of the Shogunate dates back to 883 A.D., when the chief of the forces opposing the Ainu was appointed by the Emperor and bore the official title, “The Barbarian-expelling Generalissimo.”

During the Isolation Period

“Of Old Japan little more needs to be said. Without external commerce, there was little need for internal trade; ships were small; roads were footpaths; education was limited to the samurai, or military class, retainers of the daimyo, “feudal lords”; inter-clan travel was limited and discouraged; Confucian ethics was the moral standard. From the beginning of the seventeenth century Christianity was forbidden by edict, and was popularly known as the “evil way”; Japan was thought to be especially sacred, and the coming of foreigners was supposed to pollute the land and to be the cause of physical evils. Education, as in China, was limited to the Chinese classics. Mathematics, general history, and science, in the modern sense, were of course wholly unknown. Guns and powder were brought from the West in the sixteenth century by Spaniards and Portuguese, but were never improved. Ship-building was the same in the middle of the nineteenth century as in the middle of the sixteenth, perhaps even less advanced. Architecture had received its great impulse from the introduction of Buddhism in the ninth and tenth centuries and had made no material improvement thereafter. …

“But while there was little progress in the external and mechanical elements of civilization, there was progress in other respects. During the “great peace,” first arose great scholars. Culture became more general throughout the nation. Education was esteemed. The corrupt lives of the priests were condemned and an effort was made to reform life through the revival of a certain school of Confucian teachers known as “Shin-Gaku”—”Heart-Knowledge.” Art also made progress, both pictorial and manual.”

The Transition from Clannishness to Nationalism

“A natural outcome of the Restoration is the exuberant patriotism that is so characteristic a feature of New Japan. The very term “ai-koku-shin” is a new creation, almost as new as the thing. This word is an incidental proof of the general correctness of the contention of this chapter that true nationality is a recent product in Japan. The term, literally translated, is “love-country heart”; but the point for us to notice particularly is the term for country, “koku”; this word has never before meant the country as a whole, but only the territory of a clan. If I wish to ask a Japanese what part of Japan is his native home, I must use this word. And if a Japanese wishes to ask me which of the foreign lands I am a native of, he must use the same word. The truth is that Old Japan did not have any common word corresponding to the English term, “My country.” In ancient times, this could only mean, “My clan-territory.” But with the passing away of the clans the old word has taken on a new significance. The new word, “ai-koku-shin,” refers not to love of clan, but to love of the whole nation. The conception of national unity has at last seized upon the national mind and heart, and is giving the people an enthusiasm for the nation, regardless of the parts, which they never before knew. Japanese patriotism has only in this generation come to self-consciousness. This leads it to many a strange freak. It is vociferous and imperious, and often very impractical and Chauvinistic. It frequently takes the form of uncompromising disdain for the foreigner, and the most absolute loyalty to the Emperor of Japan; it demands the utmost respect of expression in regard to him and the form of government he has graciously granted the nation.”

Flexibility:

“We naturally begin with that characteristic of Japanese nature which would seem to be more truly congenital than any other to be mentioned later. I refer to their sensitiveness to environment. More quickly than most races do the Japanese seem to perceive and adapt themselves to changed conditions.

“The history of the past thirty years is a prolonged illustration of this characteristic. The desire to imitate foreign nations was not a real reason for the overthrow of feudalism, but there was, rather, a more or less conscious feeling, rapidly pervading the whole people, that the feudal system would be unable to maintain the national integrity. As intimated, the matter was not so much reasoned out as felt. But such a vast illustration is more difficult to appreciate than some individual instances, of which I have noted several.

“During a conversation with Drs. Forsythe and Dale, of Cambridge, England, I asked particularly as to their experience with the Japanese students who had been there to study. They both remarked on the fact that all Japanese students were easily influenced by those with whom they customarily associated; so much so that, within a short time, they acquired not only the cut of coats and trousers, but also the manner and accent, of those with whom they lived. It was amusing, they said, to see what transformations were wrought in those who went to the Continent for their long vacations. From France they returned with marked French manners and tones and clothes, while from Germany they brought the distinctive marks of German stiffness in manner and general bearing. It was noted as still more curious that the same student would illustrate both variations, provided he spent one summer in Germany and another in France.

“Japanese sensitiveness is manifested in many unexpected ways. An observant missionary lady once remarked that she had often wondered how such unruly, self-willed children as grow up under Japanese training, or its lack, finally become such respectable members of society. She concluded that instead of being punished out of their misbehaviors they were laughed out of them. The children are constantly told that if they do so and so they will be laughed at—a terrible thing. …

“The Japanese young man who is making a typewritten copy of these pages for me says that, when still young, he heard an address to children which he still remembers. The speaker asked what the most fearful thing in the world was. Many replies were given by the children—”snakes,” “wild beasts,” “fathers,” “gods,” “ghosts,” “demons,” “Satan,” “hell,” etc. These were admitted to be fearful, but the speaker told the children that one other thing was to be more feared than all else, namely, “to be laughed at.” This speech, with its vivid illustrations, made a lasting impression on the mind of the boy, and on reading what I had written he realized how powerful a motive fear of ridicule had been in his own life; also how large a part it plays in the moral education of the young in Japan. …

“Closely connected with this sensitiveness to environment are other qualities which make it effective. They are: great flexibility, adjustability, agility (both mental and physical), and the powers of keen attention to details and of exact imitation.

“As opposed to all this is the Chinese lack of flexibility. Contrast a Chinaman and a Japanese after each has been in America a year. The one to all appearances is an American; his hat, his clothing, his manner, seem so like those of an American that were it not for his small size, Mongolian type of face, and defective English, he could easily be mistaken for one. How different is it with the Chinaman! He retains his curious cue with a tenacity that is as intense as it is characteristic. His hat is the conventional one adopted by all Chinese immigrants. His clothing likewise, though far from Chinese, is nevertheless entirely un-American. …

“The Japanese desire to conform to the customs and appearances of those about him is due to what I have called sensitiveness; his success is due to the flexibility of his mental constitution.”

Gulick’s explanation for these differences:

“The difference between Japanese imitation and that of other nations lies in the fact that whereas the latter, as a rule, despise foreign races, and do not admit the superiority of alien civilizations as a whole, imitating only a detail here and there, often without acknowledgment and sometimes even without knowledge, the Japanese, on the other hand, have repeatedly been placed in such circumstances as to see the superiority of foreign civilizations as a whole, and to desire their general adoption. This has produced a spirit of imitation among all the individuals of the race. It has become a part of their social inheritance. … The Japanese go to the West in order to acquire all the West can give. The Chinaman goes steeled against its influences. … Under special circumstances, when a Chinaman has been liberated from the prepossession of his social inheritance, he has shown himself as capable of Occidentalization in clothing, speech, manner, and thought as a Japanese.”

EvX: Remember, Gulick is responding to the idea that there exists a singular “racial character” particular to Asians by contrasting Japanese and Chinese, and claiming that the social and environmental conditions in each country result in their differences.

“But a still more effective factor in the development of the characteristics under consideration is the nature of Japanese feudalism. Its emphasis on the complete subordination of the inferior to the superior was one of its conspicuous features. This was a factor always and everywhere at work in Japan. No individual was beyond its potent influence. Attention to details, absolute obedience, constant, conscious imitation, secretiveness, suspiciousness, were all highly developed by this social system. Each of these traits is a special form of sensitiveness to environment. From the most ancient times the initiative of superiors was essential to the wide adoption by the people of any new idea or custom. …

“Susceptibility to slight changes in the feelings of lords and masters and corresponding flexibility were important social traits, necessary products of the old social order. Those deficient in these regards would inevitably lose in the struggle for social precedence, if not in the actual struggle for existence. These characteristics would, accordingly, be highly developed.”

The Value of Viral Memes

Note: “Memes” on this blog is used as it is in the field of memetics, representing units of ideas that are passed from person to person, not in the sense of “funny cat pictures on the internet.”

“Mitochondrial memes” are memes that are passed vertically from parent to child, like “it’s important to eat your dinner before dessert” or “brush your teeth twice a day or your teeth will rot out.”

“Meme viruses” (I try to avoid the confusing phrase, “viral memes,”) are memes that are transmitted horizontally through society, like chain letters and TV news.

I’ve spent a fair amount of time warning about some of the potential negative results of meme viruses, but today I’d like to discuss one of their greatest strengths: you can transmit them to other people without using them yourself.

Let’s start with genetics. It is very easy to quickly evolve in a particular direction if a variety of relevant traits already exist in a population. For example, humans already vary in height, so if you wanted to, say, make everyone on Earth shorter, you would just have to stop all of the tall people from reproducing. The short people would create the next generation, and it would be short.
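
The point is easy to demonstrate with a toy simulation. Below is a minimal sketch with made-up numbers: children here simply resemble a random surviving parent plus noise, a cartoon of inheritance rather than a real quantitative-genetics model.

```python
import random

# Toy truncation selection on standing variation; all numbers are invented.
random.seed(0)

def next_generation(population, n=1000, keep_fraction=0.5):
    """Breed the next generation from the shortest fraction of the current one."""
    survivors = sorted(population)[: int(len(population) * keep_fraction)]
    # Children resemble a randomly chosen surviving parent, plus some noise.
    return [random.choice(survivors) + random.gauss(0, 5) for _ in range(n)]

population = [random.gauss(170, 10) for _ in range(1000)]  # heights in cm
for generation in range(5):
    mean = sum(population) / len(population)
    print(f"generation {generation}: mean height {mean:.1f} cm")
    population = next_generation(population)
```

Because the variation being selected on already exists, the mean falls by several centimeters every generation.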

But getting adult human height below three feet requires not just existing, normal human height variation, but exploiting random mutations. These are rare, and the people who have them normally incur huge reductions in fitness, as they often have problems with bone growth, intelligence, and giving birth.

Most random mutations simply result in an organism’s death. Very few are useful, and those that are have to beat out all of the other local genetic combinations to actually stick around.

Suppose you happen to be born with a very lucky genetic trait: a rare mutation that lets you survive more easily in an arctic environment.

But you were born in Sudan.

Your genetic trait could be really useful if you could somehow give it away to someone in Siberia, but no, you are stuck in Sudan and you are really hot all of the time and then you die of heatstroke.

With the evolution of complex thought, humans (nearly alone among animals) developed the ability to go beyond mere genetic abilities, instincts, and impulses, and impart stores of knowledge to the next generation. Humanity has been accumulating mitochondrial memes for millions of years, ever since the first human showed another human how to wield fire and create stone tools. (Note: the use of fire and stone tools predates the emergence of Homo sapiens by a long while, but not the Homo genus.)

But mitochondrial memes, to get passed on, need to offer some immediate benefit to their holders. Humans are smart enough–and the utility of information unpredictable enough–that we can hold onto some ideas that are not obviously useful, or even absurd ones, but the bulk of our efforts have to go toward information that helps us survive.

(By definition, mitochondrial memes aren’t written down; they have to be remembered.)

If an idea doesn’t offer some benefit to its holder, it is likely to be quickly forgotten–even if it could be very useful to someone else.

Suppose one day you happen to have a brilliant new idea for how to keep warm in a very cold environment–but you live in Sudan. If you can’t tell your idea to anyone who lives somewhere cold, your idea will never be useful. It will die with you.

But introduce writing, and ideas of no use to their holder can be recorded and transmitted to people who can use them. For example, in 1502, Leonardo da Vinci designed a 720-foot (220 m) bridge for Ottoman Sultan Beyazid II of Constantinople. The sultan never built Leonardo’s bridge, but in 2001, a bridge based on his design was finally built in Norway. Leonardo’s ideas for flying machines, while also not immediately useful, inspired generations of future engineers.

Viral memes don’t have to be immediately useful to stick around. They can be written down, tucked into a book, and picked up again a hundred years later and a thousand miles away by someone who can use them. A person living in Sudan can invent a better way to stay warm, write it down, and send it to someone in Siberia–and someone in Siberia can invent a better way to stay cool, write it down, and send it back.

Original Morse Telegraph machine, circa 1835

Many modern scientific and technological advances are based on the contributions of not one or two or ten inventors, but thousands, each contributing their unpredictable part to the overall whole. Electricity, for example, was a mere curiosity when Thales of Miletus wrote about the effects of rubbing amber to produce static electricity (the word “electricity” is actually derived from the Greek for “amber.”) Between 1600 and 1800, scientists began studying electricity in a more systematic way, but it still wasn’t useful. It was only with the invention of the telegraph, assembled from many different electrical parts and systems (first working model, 1816; first telegram sent in the US, 1838,) that electricity became useful. With the invention of electric lights and the electrical grids necessary to power them (1870s and ’80s,) electricity moved into people’s homes.

The advent of meme viruses has thus given humanity two gifts: 1. People can use technology like books and the internet to store more information than we can naturally, like external hard-drives for our brains; and 2. we can preserve and transmit ideas that aren’t immediately useful to ourselves to people who can use them.

Dangerous Memes

Homo sapiens is about 200,000–300,000 years old, depending on exactly where you draw the line between us and our immediate ancestors. Printing (and eventually mass literacy) only got going about 550 years ago, with the development of the Gutenberg press. TV, radio, movies, and the internet only became widespread within the past century, and the internet only within the past 25 years.

In other words, for well over 99% of human history, “mass media” didn’t exist.

How did illiterate peasants learn about the world, if not from books, TV, or Youtube videos? Naturally, from each other: parents passed knowledge to children; tribal elders taught their wisdom to other members of their tribes; teenagers were apprenticed to masters who already knew a trade, etc.

A hundred years ago, if you wanted to know how to build a wagon, raise a barn, or plant corn, you generally had to find someone who knew how to do so and ask them. Today, you ask the internet.

Getting all of your information from people you know is limiting, but it has two advantages: you can easily judge whether the source of your information is reliable, (you’re not going to take farming advice from your Uncle Bob whose crops always fail,) and most of the people giving you information have your best interests at heart.

Forgoing reproduction tends to be a pretty big hit to one’s reproductive success (source)

The internet’s strength is that it lets us talk to people from outside our own communities; its weakness is that this makes it much easier for people (say, Nigerian princes with extra bank accounts,) to get away with lying. They also have no particular interest one way or another in your survival–unlike your parents.

In a mitochondrial memetic environment (that is, an environment where you get most of your information from relatives,) memes that could kill you tend to get selected against: parents who encourage their children to eat poison tend not to have grandchildren. In such an environment, then, deadly memes die out, and the memes that remain evolve to support your survival.

By contrast, in a viral meme environment, (that is, an environment where ideas can easily pass from person to person without anyone having to give birth,) your personal survival is not all that important to the idea’s success.
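
The difference is easy to see in a toy simulation. This is a minimal sketch with invented parameters (the survival penalty, contact count, and transmission rate are all made up): a meme that harms its carriers dwindles when it can only pass from parent to child, but can spread anyway once it has even a modest horizontal channel.

```python
import random

# Toy contrast between vertical ("mitochondrial") and horizontal ("viral")
# transmission of a meme that harms its carriers. Parameters are invented.
random.seed(42)

SURVIVAL_PENALTY = 0.2   # carriers are 20% less likely to survive to reproduce
CONTACTS = 2             # strangers each carrier reaches (letters, TV, internet)
TRANSMISSION_RATE = 0.3  # chance each contact adopts the meme

def step(carriers, population, horizontal):
    # Vertical path: only carriers who survive pass the meme to their children.
    survivors = sum(1 for _ in range(carriers) if random.random() > SURVIVAL_PENALTY)
    new_carriers = survivors
    if horizontal:
        # Horizontal path: the meme reaches contacts whether or not the
        # carrier ultimately thrives -- the idea outlives the person.
        for _ in range(carriers * CONTACTS):
            if random.random() < TRANSMISSION_RATE:
                new_carriers += 1
    return min(new_carriers, population)

for horizontal in (False, True):
    carriers = 100
    for _ in range(10):
        carriers = step(carriers, population=10_000, horizontal=horizontal)
    label = "horizontal (viral)" if horizontal else "vertical (mitochondrial)"
    print(f"{label}: {carriers} carriers after 10 generations")
```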

Total Fertility Rate by Country–odd that the Guardian’s anti-fertility message wasn’t aimed at the people with the highest fertility

So one of the risks of viral memes is getting scammed: memetically infected by an idea that sounds good but actually benefits someone else at your expense.

In the mitochondrial environment, we expect people to be basically cautious; in the viral, less cautious.

Suppose we have two different groups (Group A and Group B) interacting. 25% of Group B is violent criminals, versus 5% of Group A. Folks in group A would quite logically want to avoid Group B. But 75% of Group B is not violent criminals, and would logically not want to be lumped in with criminals. (For that matter, neither do the 25% who are.)

If you think my numbers are unrealistic, consider that the NAACP says that African Americans are incarcerated at 5x the rate of whites, and if you look at specific subpops–say, black men between the ages of 15 and 35 vs. white women over the age of 40–the difference in incarceration rates is even larger (HuffPo claims that 33% of black men will go to prison at some point in their lifetimes.)

In an ideal world, we could easily sort out violent criminals from the rest of the population, allowing the innocent people to freely associate. In the real world, we have to make judgment calls. Lean a bit toward the side of caution, and you exclude more criminals, but also more innocents; lean the opposite direction and innocent people have an easier time finding jobs and houses, but more people get killed by criminals.
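
That tradeoff is, in effect, a threshold problem, and a few lines of code make it concrete. This is a minimal sketch with invented numbers: assume a noisy “suspiciousness” cue on which criminals score higher on average, then sweep the cutoff.

```python
import random

# Invented distributions: criminals average higher on a noisy cue, but the
# distributions overlap, so any cutoff misclassifies someone.
random.seed(0)
criminals = [random.gauss(2.0, 1.0) for _ in range(250)]  # 25% of Group B
innocents = [random.gauss(0.0, 1.0) for _ in range(750)]  # the other 75%

for threshold in (0.5, 1.0, 1.5, 2.0):
    caught = sum(score > threshold for score in criminals) / len(criminals)
    wronged = sum(score > threshold for score in innocents) / len(innocents)
    print(f"threshold {threshold}: excludes {caught:.0%} of criminals "
          f"and {wronged:.0%} of innocents")
```

Lowering the threshold excludes more criminals and more innocents at the same time; raising it does the reverse. No cutoff avoids both kinds of error.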

Let’s put it less abstractly: suppose you are walking down a dimly-lit street at night and see a suspicious looking person coming toward you. It costs you almost nothing to cross the street to avoid them, while not crossing the street could cost you your life. The person you avoided, if they are innocent, incurs only the expense of potentially having their feelings hurt; if they are a criminal, they have lost a victim.

Companies also want to avoid criminals, which makes it hard for ex-cons to get jobs (which is an issue if we want folks who are no longer in prison to have an opportunity to earn an honest living other than going on welfare.) Unfortunately, efforts to improve employment chances for ex-cons by preventing employers from inquiring directly about criminal history have resulted in employers using rougher heuristics to exclude felons, like simply not hiring young African American males. Since most companies have far more qualified job applicants than available jobs, the cost to them of excluding young African American males is fairly low–while the cost to African Americans is fairly high.

One of the interesting things about the past 200 years is the West’s historically unprecedented shift from racial apartheid/segregation and actual race-based slavery to full legal (if not always de facto) racial integration.

One of the causes of this shift was doubtless the transition from traditional production modes like farming and horticulture to the modern, industrial economy. Subsistence farming didn’t require a whole lot of employees. Medieval peasants didn’t change occupations very often: most folks ended up working in the same professions as their parents, grandparents, and great-grandparents (usually farming,) probably even on the same estate.

It was only with industrialization that people and their professions began uncoupling; a person could now hold multiple different jobs, in different fields, over the span of years.

Of course, there were beginnings of this before the 1800s–just as people read books before the 1800s–but accelerating technological development accelerated the trends.

But while capitalists want to hire the best possible workers for the lowest possible wages, this doesn’t get us all the way to the complete change we’ve witnessed in racial mores. After all, companies don’t want to hire criminals, either, and any population that produces a lot of criminals tends not to produce a whole lot of really competent workers.

However, the rise of mass communication has allowed us to listen to and empathize with far more people than ever before. When Martin Luther King marched on Washington and asked to be judged by the content of his character rather than the color of his skin, his request only reached national audiences because of modern media, because we now live in a society of meme viruses. And it worked: integration happened.

Also, crime went up dramatically.


Integration triggered a massive increase in crime, which only stopped because… well, we’re not sure, but a corresponding massive increase in the incarceration rate (and sentences) has probably stopped a lot of criminals from committing additional crimes.

Most of these homicides were black on black, but plenty of the victims were white, even as they sold their devalued homes and fled the violence. (Housing integration appears to have struck America’s “ethnic” neighborhoods of Italians, Irish, and Jews particularly hard, destroying coherent communities and, I assume, voting blocs.)

From the white perspective, integration was tremendously costly: people died. Segregation might not have been fair, and it might have killed black people, but it certainly prevented the murder of whites. Yet segregation, as discussed, did have some costs for whites: you are more limited in all of your transactions, both economic and personal. You can’t sell your house to just anyone you want. Can’t hire anyone you want. Can’t fall in love with anyone you want.

But obviously segregation is far more harmful to African Americans.

Despite all of the trouble integration has caused for whites, the majority claim to believe in it–even though their feet tell a different story. This at least superficial change in attitudes, I believe, was triggered by the nature of the viral memetic environment.

Within the mitochondrial meme environment, you listen to people who care about your survival and they pass on ideas intended to help you survive. They don’t typically pass on ideas that sacrifice your survival for the sake of others, at least not for long. Your parents will tell you that if you see someone suspicious, you should cross the street and get away.

In the viral environment, you interact far more with people who have their own interests in mind, not yours, and these folks would be perfectly happy for you to sacrifice your survival for their sake. The good folks at Penn State would like you to know that locking your car door when a black person passes by is a “microaggression:”

Former President Obama once said in his speech that he was followed when he was shopping in a store, heard the doors of cars locked as he was walking by, and a woman showed extremely nervousness as he got on an elevator with him (Obama, 2013). Those are examples of nonverbal microaggressions. It is disturbing to learn that those behaviors are often automatic that express “put-downs” of individuals in marginalized groups (Pierce et al., 1977). What if Obama were White, would he receive those unfair treatments?

(If Obama were white, like Hillary Clinton, he probably wouldn’t have been elected president.)

For some reason, black people shoplifting, carjacking, or purse-snatching are never described as “microaggressions;” a black person whose feelings are hurt has been microaggressed, but a white person afraid of being robbed or murdered has not been.

This post was actually inspired by an intra-leftist debate:

Shortly after the highly successful African-star-studded movie Black Panther debuted, certain folks, like Faisal Kutty, started complaining that the film is “Islamophobic” because of a scene where girls are rescued from a Boko Haram-like organization.

Never mind that Boko Haram is a real organization, that it actually kidnaps girls, that it has killed more people than ISIS, and that the people it murders are Africans. Even other Black African Muslims think Boko Haram is shit. (Though obviously BH has its supporters.)

Here we have two different groups of people with different interests: one, Muslims with no particular ties to Africa who don’t want people to associate them with Boko Haram, and two, Black Muslims who don’t want to get killed by folks like Boko Haram.

It is exceedingly disingenuous for folks like Faisal Kutty to criticize as immoral an accurate portrayal of a group that is actually slaughtering thousands of people just because he might accidentally be harmed by association. More attention on Boko Haram could save lives; less attention could result in more deaths–the dead just wouldn’t be Kutty, who is safe in Canada.

Without mass media, I don’t think this kind of appeal works: survival memes dominate and people take danger very seriously. “Some stranger in Canada might be inconvenienced over this” loses to “these people slaughter children.” With mass media, the viral environment allows appeals to set aside your own self-interest and ignore danger in favor of “fairness” and “equality” for everyone in the conversation to flourish.

So far this post has focused primarily on the interests of innocent people, but criminals have interests, too–and criminals would like you to make it easier for them to commit crime.

Steve Sailer highlighted the case of social justice activist and multiple award winner Simon Mol (quotes are from Mol’s Wikipedia article):

Simon Mol (6 November 1973 in Buea, Cameroon – 10 October 2008) was the pen name of Simon Moleke Njie, a Cameroon-born journalist, writer and anti-racist political activist. In 1999 he sought political asylum in Poland; it was granted in 2000, and he moved to Warsaw, where he became a well-known anti-racist campaigner. …

In 2005 he organized a conference with Black ambassadors in Poland to protest the claims in an article in Wiedza i Życie by Adam Leszczyński about AIDS problems in Africa, which quoted research stating that a majority of African women were unable to persuade their HIV positive husbands to wear condoms, and so later got caught HIV themselves. Mol accused Leszczyński of prejudice because of this publication.

Honorary member of the British International Pen Club Centre.

In 2006 Mol received the prestigious award “Oxfam Novib/PEN Award for Freedom of Expression”.

In February 2006, further to his partner’s request for him to take an HIV test, Mol declined and published a post on his blog explaining why not:

Character assassination isn’t a new phenomenon. However, it appears here the game respects no rules. It wouldn’t be superfluous to state that there is an ingrained, harsh and disturbing dislike for Africans here. The accusation of being HIV positive is the latest weapon that as an African your enemy can raise against you. This ideologically inspired weapon, is strengthened by the day with disturbing literature about Africa from supposed-experts on Africa, some of whom openly boast of traveling across Africa in two weeks and return home to write volumes. What some of these hastily compiled volumes have succeeded in breeding, is a social and psychological conviction that every African walking the street here is supposedly HIV positive, and woe betide anyone who dares to unravel the myth being put in place.

On the 3rd of January 2007 Mol was taken into custody by the Polish police and charged with infecting his sexual partners with HIV. …

According to the Rzeczpospolita newspaper, he was diagnosed with HIV back in 1999 while living in a refugee shelter, but Polish law does not force an HIV carrier to reveal his or her disease status.

According to the police inspector who was investigating his case, a witness stated that Mol refused to wear condoms during sex. An anonymous witness in one case said that he accused a girl who demanded he should wear them that she was racist because as he was Black she thought he must be infected with HIV. After sexual intercourse he used to say to his female partners that his sperm was sacred.

In an unusual move, his photo with an epidemiological warning, was ordered to be publicly displayed by the then Minister of Justice Zbigniew Ziobro. MediaWatch, a body that monitors alleged racism, quickly denounced this decision, asserting that it was a breach of ethics with racist implications, as the picture had been published before any court verdict. They saw it as evidence of institutional racism in Poland, also calling for international condemnation. …

After police published Mol’s photo and an alert before the start of court proceedings, Warsaw HIV testing centers were “invaded by young women”. A few said that they knew Mol. Some of the HIV tests have been positive. According to the police inspector who had been monitoring the tests and the case: “Some women very quickly started to suffer drug-resistant tonsillitis and fungal infections. They looked wasted, some lost as many as 15 kilograms and were deeply traumatized, impeding us taking the witness statements. 18 additional likely victims have been identified thereby”. Genetic tests of the virus from the infectees and Simon proved that it was specific to Cameroon.

In other words, Simon Mol was a sociopath who used the accusation of “racism” to murder dozens of women.

Criminals–of any race–are not nice people. They will absolutely use anything at their disposal to make it easier to commit crime. In the past, they posed as police officers, asked for help finding their lost dog, or just rang your doorbell. Today they can get intersectional feminists and international human rights organizations to argue on their behalf that locking your door or insisting on condoms is the real crime.

Critical criminology, folks.

Re Nichols: Times the Experts were Wrong, pt 3/3

Welcome to our final post of “Times the Experts were Wrong,” written in preparation for our review of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters. Professor Nichols, if you ever happen to read this, I hope it gives you some insight into where we, the common people, are coming from. If you don’t happen to read it, it still gives me a baseline before reading your book. (Please see part 1 for a discussion of relevant definitions.)

Part 3: Wars

WWI, Iraq, Vietnam, etc.

How many “experts” have lied to convince us to go to war? We were told we had to attack Iraq because they had weapons of mass destruction, but the promised weapons never materialized. Mother Jones (that source of all things pro-Trump) has a timeline:

November 1999: Chalabi-connected Iraqi defector “Curveball”—a convicted sex offender and low-level engineer who became the sole source for much of the case that Saddam had WMD, particularly mobile weapons labs—enters Munich seeking a German visa. German intel officers describe his information as highly suspect. US agents never debrief Curveball or perform background check. Nonetheless, Defense Intelligence Agency (DIA) and CIA will pass raw intel on to senior policymakers. …

11/6/00: Congress doubles funding for Iraqi opposition groups to more than $25 million; $18 million is earmarked for Chalabi’s Iraqi National Congress, which then pays defectors for anti-Iraq tales. …

Jan 2002: The FBI, which favors standard law enforcement interrogation practices, loses debate with CIA Director George Tenet, and Libi is transferred to CIA custody. Libi is then rendered to Egypt. “They duct-taped his mouth, cinched him up and sent him to Cairo,” an FBI agent told reporters. Under torture, Libi invents tale of Al Qaeda operatives receiving chemical weapons training from Iraq. “This is the problem with using the waterboard. They get so desperate that they begin telling you what they think you want to hear,” a CIA source later tells ABC. …

Feb 2002: DIA intelligence summary notes that Libi’s “confession” lacks details and suggests that he is most likely telling interrogators what he thinks will “retain their interest.” …

9/7/02: Bush claims a new UN International Atomic Energy Agency (IAEA) report states Iraq is six months from developing a nuclear weapon. There is no such report. …

9/8/02: Page 1 Times story by Judith Miller and Michael Gordon cites anonymous administration officials saying Saddam has repeatedly tried to acquire aluminum tubes “specially designed” to enrich uranium. …

Tubes “are only really suited for nuclear weapons programs…we don’t want the smoking gun to be a mushroom cloud.”—Rice on CNN …

“We do know, with absolute certainty, that he is using his procurement system to acquire the equipment he needs in order to enrich uranium to build a nuclear weapon.”—Cheney on Meet the Press

Oct 2002: National Intelligence Estimate produced. It warns that Iraq “is reconstituting its nuclear program” and “has now established large-scale, redundant and concealed BW agent production capabilities”—an assessment based largely on Curveball’s statements. But NIE also notes that the State Department has assigned “low confidence” to the notion of “whether in desperation Saddam would share chemical or biological weapons with Al Qaeda.” Cites State Department experts who concluded that “the tubes are not intended for use in Iraq’s nuclear weapons program.” Also says “claims of Iraqi pursuit of natural uranium in Africa” are “highly dubious.” Only six senators bother to read all 92 pages. …

10/4/02: Asked by Sen. Graham to make gist of NIE public, Tenet produces 25-page document titled “Iraq’s Weapons of Mass Destruction Programs.” It says Saddam has them and omits dissenting views contained in the classified NIE. …

2/5/03: In UN speech, Powell says, “Every statement I make today is backed up by sources, solid sources. These are not assertions. What we’re giving you are facts and conclusions based on solid intelligence.” Cites Libi’s claims and Curveball’s “eyewitness” accounts of mobile weapons labs. (German officer who supervised Curveball’s handler will later recall thinking, “Mein Gott!”) Powell also claims that Saddam’s son Qusay has ordered WMD removed from palace complexes; that key WMD files are being driven around Iraq by intelligence agents; that bioweapons warheads have been hidden in palm groves; that a water truck at an Iraqi military installation is a “decontamination vehicle” for chemical weapons; that Iraq has drones it can use for bioweapons attacks; and that WMD experts have been corralled into one of Saddam’s guest houses. All but the last of those claims had been flagged by the State Department’s own intelligence unit as “WEAK.”

I’m not going to quote the whole article, so if you’re fuzzy on the details, go read the whole darn thing.

If you had access to the actual documents from the CIA, DIA, British intelligence, interrogators, etc., you could have figured out that the “experts” were not unanimously behind the idea that Iraq was developing WMDs, but we mere plebes were dependent on what the government, Fox, and CNN told us the “experts” believed.

For the record, I was against the Iraq War from the beginning. I’m not sure what Nichols’s original position was, but in Just War, Not Prevention (2003) Nichols argued:

More to the point, Iraq itself long ago provided ample justifications for the United States and its allies to go to war that have nothing to do with prevention and everything to do with justice. To say that Saddam’s grasping for weapons of mass destruction is the final straw, and that it is utterly intolerable to allow Saddam or anyone like him to gain a nuclear weapon, is true but does not then invalidate every other reason for war by subsuming them under some sort of putative ban on prevention.

The record provides ample evidence of the justice of a war against Saddam Hussein’s regime. Iraq has shown itself to be a serial aggressor… a supreme enemy of human rights that has already used weapons of mass destruction against civilians, a consistent violator of both UN resolutions and the terms of the 1991 cease-fire treaty … a terrorist entity that has attempted to reach beyond its own borders to support and engage in illegal activities that have included the attempted assassination of a former U.S. president; and most important, a state that has relentlessly sought nuclear arms against all international demands that it cease such efforts.

Any one of these would be sufficient cause to remove Saddam and his regime … but taken together they are a brief for what can only be considered a just war. …

Those concerned that the United States is about to revise the international status quo might consider that Western inaction will allow the status quo to be revised in any case, only under the gun of a dictator commanding an arsenal of the most deadly materials on earth. These are the two alternatives, and sadly, there is no third choice.

Professor Nichols, I would like to pause here.

First: you think Trump is bad, you support the President under whom POWs were literally tortured, and you call yourself a military ethicist?

Second: you, an expert, bought into this “WMD” story (invented primarily by “Curveball,” an unreliable source,) while I, a mere plebe, knew it was a load of garbage.

Third: while I agree Saddam Hussein killed a hell of a lot of people–according to Wikipedia, Human Rights Watch estimates a quarter of a million Iraqis were killed or “disappeared” in the last 25 years of Ba’th party rule–the nine years of the Iraq War killed 150,000 to 460,000 people (depending on which survey you trust,) and, based on estimates from the Iraq Body Count, a further 100,000 have died since then. Meanwhile, instability in Iraq allowed the horrifically violent ISIS to sprout into existence. I Am Syria (I don’t know if they are reliable) estimates that over half a million Syrians have died so far because of the ISIS-fueled civil war rampaging there.

In other words, we unleashed a force that is twice as bad as Saddam in less than half the time–and paid a lovely 2.4 TRILLION dollars to accomplish this humanitarian feat! For that much money you could have just evacuated all of the Kurds and built them their own private islands to live on. You could have handed out $90,000 to every man, woman, and child in Iraq in exchange for “being friends with the US” and still had $150 BILLION left over to invest in things like “cancer treatments for children” and “high-speed rail infrastructure.”
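
A quick check of that back-of-envelope math, assuming an Iraqi population of roughly 25 million circa 2003 (the population figure is my assumption, not from the original sources):

```python
# Sanity check of the handout arithmetic above; the population is assumed.
war_cost = 2.4e12          # $2.4 trillion
population = 25_000_000    # Iraq, circa 2003 (assumption)
handout = 90_000           # dollars per person
leftover = war_cost - population * handout
print(f"${leftover:,.0f} left over")  # -> $150,000,000,000, i.e. $150 billion
```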

Seriously, you could have spent the entire 2.4 trillion on hookers and blow and we would have still come out ahead.

Back in 2015, you tried to advise the Republican frontrunners on how to answer questions about the Iraq War:

First, let’s just stipulate that the question is unfair.

It’s asking a group of candidates to re-enact a presidential order given 12 years ago, while Hillary Clinton isn’t even being asked about decisions in which she took part, much less about her husband’s many military actions. …

Instead, Republican candidates should change the debate. Leadership is not about what people would do with perfect information; it’s about what people do when faced with danger and uncertainty. So here’s an answer that every Republican, from Paul to Bush, could give:

“Knowing exactly what we know now, I would not have invaded when we did or in the way we did. But I do not regret that we deposed a dangerous maniac like Saddam Hussein, and I know the world is better for it. What I or George Bush or anyone else would have done with better information is irrelevant now, because the next president has to face the world as it is, not as we would like to imagine it. And that’s all I intend to say about second-guessing a tough foreign-policy decision from 12 years ago, especially since we should have more pressing questions about foreign policy for Hillary Clinton that are a lot more recent than that.”

While I agree that Hillary should have been questioned about her own military decisions, Iraq was a formally declared war that the entire Republican establishment, think tanks, newspapers, and experts like you supported. They did such a convincing job of selling the war that even most of the Democratic establishment got on board, though never quite as enthusiastically.

By contrast, there was never any real Democratic consensus on whether Obama should remove troops or increase troops, on whether Hillary should do this or that in Libya. Obama and Hillary might have hideously bungled things, but there was never enthusiastic, party-wide support for their policies.

This makes it very easy for any Dem to distance themselves from previous Dem policies: “Yeah, looks like that was a big whoopsie. Luckily half our party knew that at the time.”

But for better or worse, the Republicans–especially the Bushes–own the Iraq War.

The big problem here is not that the Republican candidates (aside from Trump and Rand Paul) were too dumb to come up with a good response to the question (though that certainly is a problem.) The real problem is that none of them had actually stopped to take a long, serious look at the Iraq War, ask whether it was a good idea, and then apologize.

The Iraq War deeply discredited the Republican party.

Ask yourself: What did Bush conserve? What have I conserved? Surely being a “conservative” means you want to conserve something, so what was it? Iraqi freedom? Certainly not. Mid East stability? Nope. American lives? No. American tax dollars? Definitely not.

The complete failure of the Republicans to do anything good while squandering 2.4 trillion dollars and thousands of American lives is what triggered the creation of the “alt” right and set the stage for someone like Trump–someone willing to make a formal break with past Republican policies on Iraq–to rise to power.

Iraq I, the prequel:

But Iraq wasn’t the first war we were deceived into fighting–remember the previous war in Iraq, the one with the other President Bush? The one where we were motivated to intervene over stories of poor Kuwaiti babies ripped from their incubators by cruel Iraqis?

The Nayirah testimony was a false testimony given before the Congressional Human Rights Caucus on October 10, 1990 by a 15-year-old girl who provided only her first name, Nayirah. The testimony was widely publicized, and was cited numerous times by United States senators and President George H. W. Bush in their rationale to back Kuwait in the Gulf War. In 1992, it was revealed that Nayirah’s last name was al-Ṣabaḥ (Arabic: نيره الصباح‎) and that she was the daughter of Saud Al-Sabah, the Kuwaiti ambassador to the United States. Furthermore, it was revealed that her testimony was organized as part of the Citizens for a Free Kuwait public relations campaign which was run by an American public relations firm Hill & Knowlton for the Kuwaiti government. Following this, al-Sabah’s testimony has come to be regarded as a classic example of modern atrocity propaganda.[1][2]

In her emotional testimony, Nayirah stated that after the Iraqi invasion of Kuwait she had witnessed Iraqi soldiers take babies out of incubators in a Kuwaiti hospital, take the incubators, and leave the babies to die.

Her story was initially corroborated by Amnesty International[3] and testimony from evacuees. Following the liberation of Kuwait, reporters were given access to the country. An ABC report found that “patients, including premature babies, did die, when many of Kuwait’s nurses and doctors… fled” but Iraqi troops “almost certainly had not stolen hospital incubators and left hundreds of Kuwaiti babies to die.”[4][5]

Kuwaiti babies died because Kuwaiti doctors and nurses abandoned them. Maybe the “experts” at the UN and in the US government should vet their sources a little better (like actually find out their last names) before starting wars based on the testimony of children?

Vietnam:

And then there was Vietnam. Cold War “experts” were certain it was very important for us to spend billions of dollars in the 1950s to prop up the French colony in Indochina. When the French gave up, fighting the war somehow became America’s problem. The Cold War doctrine of the “Domino Theory” held that the loss of even one obscure, third-world country to Communism would unleash an unstoppable chain-reaction of global Soviet conquest, and thus the only way to preserve democracy anywhere in the world was to oppose communism wherever it emerged.

Of course, one could not be a Cold War “expert” in 1955, as we had never fought a Cold War before. This bipolar world, led by a nuclear-armed communist faction on one side and a nuclear-armed democratic faction on the other, was entirely new.

On top of the difficulties of functioning within an entirely novel balance of powers (and weapons), almost no one in America spoke Vietnamese (and almost no one in Vietnam spoke English) in 1955. We couldn’t even ask the Vietnamese what they thought. At best, we could play a game of telephone with Vietnamese who spoke French and translators who spoke French and English, but the Vietnamese who had learned the language of their colonizers were not a representative sample of average citizens.

In other words, we had no idea what we were getting into.

I lost family in Vietnam, so maybe I take this a little personally, but I don’t think American soldiers exist just to enrich Halliburton or protect French colonial interests. And you must excuse me, but I think you “experts” grunting for war have an extremely bad track record that involves people in my family getting killed.

While we are at it, what is the expert consensus on Russiagate?

Well, Tablet Mag thinks it’s hogwash:

At the same time, there is a growing consensus among reporters and thinkers on the left and right—especially those who know anything about Russia, the surveillance apparatus, and intelligence bureaucracy—that the Russiagate-collusion theory that was supposed to end Trump’s presidency within six months has sprung more than a few holes. Worse, it has proved to be a cover for U.S. intelligence and law-enforcement bureaucracies to break the law, with what’s left of the press gleefully going along for the ride. Where Watergate was a story about a crime that came to define an entire generation’s oppositional attitude toward politicians and the country’s elite, Russiagate, they argue, has proved itself to be the reverse: It is a device that the American elite is using to define itself against its enemies—the rest of the country.

Yet for its advocates, the questionable veracity of the Russiagate story seems much less important than what has become its real purpose—elite virtue-signaling. Buy into a storyline that turns FBI and CIA bureaucrats and their hand-puppets in the press into heroes while legitimizing the use of a vast surveillance apparatus for partisan purposes, and you’re in. Dissent, and you’re out, or worse—you’re defending Trump.

“Russia done it, all the experts say so” sounds suspiciously like a great many other times “expert opinion” has been manipulated by the government, industry, or media to make it sound like expert consensus exists where it does not.

Let’s look at a couple of worst case scenarios:

  1. Nichols and his ilk are right, but we ignore his warnings, overlook a few dastardly Russian deeds, and don’t go to war with Russia.
  2. Nichols is wrong, but we trust him, blame Russia for things it didn’t do, and go to war with a nuclear superpower.

But let’s look at our final fail:

Failure to predict the fall of the Soviet Union

This is kind of ironic, given that Nichols is a Sovietologist, but one of the continuing questions in Political Science is “Why didn’t political scientists predict the fall of the Soviet Union?”

In retrospect, of course, we can point to the state of the Soviet economy, or glasnost, or growing unrest and dissent among Soviet citizens, but as Foreign Policy puts it:

In the years leading up to 1991, virtually no Western expert, scholar, official, or politician foresaw the impending collapse of the Soviet Union, and with it one-party dictatorship, the state-owned economy, and the Kremlin’s control over its domestic and Eastern European empires. …

Whence such strangely universal shortsightedness? The failure of Western experts to anticipate the Soviet Union’s collapse may in part be attributed to a sort of historical revisionism — call it anti-anti-communism — that tended to exaggerate the Soviet regime’s stability and legitimacy. Yet others who could hardly be considered soft on communism were just as puzzled by its demise. One of the architects of the U.S. strategy in the Cold War, George Kennan, wrote that, in reviewing the entire “history of international affairs in the modern era,” he found it “hard to think of any event more strange and startling, and at first glance inexplicable, than the sudden and total disintegration and disappearance … of the great power known successively as the Russian Empire and then the Soviet Union.”

I don’t think this is Political Science’s fault–even the Soviets don’t seem to have really seen it coming. Some things are just hard to predict.

Sometimes we overestimate our judgment. We leap before we look. We think there’s evidence where there isn’t or that the evidence is much stronger than it is.

And in the cases I’ve selected, maybe I’m the one who’s wrong. Maybe Vietnam was a worthwhile conflict, even if it was terrible for everyone involved. Maybe the Iraq War served a real purpose.

WWI was still a complete disaster. There is no logic where that war makes any sense at all.

When you advocate for war, step back a moment and ask how sure you are. If you were going to be the cannon fodder down on the front lines, would you still be so sure? Or would you be the one suddenly questioning the experts about whether this was really such a good idea?

Professor Nichols, if you have read this, I hope it has given you some food for thought.

Re Nichols: Times the Experts were Wrong, pt 2

Welcome back. In preparation for our review of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters, I have made a list of “times the experts were wrong.” Professor Nichols, if you ever happen to read this, I hope it gives you some insight into where we, the common people, are coming from. If you don’t happen to read it, it still gives me a baseline before reading your book. (Please see part 1 for a discussion of relevant definitions.)

Part 2: Law, Academia, and Science

Legal Testimony

If you’ve had any contact with the court system, you’re probably familiar with the use of “expert testimony.” Often both sides of a case bring in their own experts who give their expert testimony on the case–by necessity, contradictory testimony. For example, one expert in a patent case may testify that his microscopy data shows one thing, while a second testifies that a proper analysis of the same data actually shows the opposite. The jury is then asked to decide which expert’s analysis is correct.

If it sounds suspicious that both sides in a court case can find an “expert” to testify that their side is correct, that’s because it is. Take, for example, the government’s expert testimony in the trial of Mr. Carlos Simon-Timmerman, [note: link takes you to AVN, a site of questionable work-friendliness] accused of possessing child pornography:

“When trial started,” said Ramos-Vega, “the government presented the Lupe DVD and a few other images from the other DVDs that the government understood were also of child pornography.  The government presented the testimony of a Special Agent of Immigration and Customs Enforcement that deals with child pornography and child exploitation cases.  She testified that Lupe was ‘definitely’ under 18. The government then presented the testimony of a pediatrician who testified that she was 100 percent sure that Lupe was underage.”

The experts, ladies and gents.

After the prosecution rested its case, it was Ramos-Vega’s turn to present witnesses.

“The first witness we called was Lupe,” he said. “She took the stand and despite being very nervous testified so well and explained to the ladies and gentlemen of the jury that she was 19 years old when she performed in the videos for littlelupe.com.  She also allowed us to present into evidence copies of her documents showing her date of birth.”

So the Customs Special Agent and the pediatrician were both LYING UNDER OATH about the age of a porn star in order to put an innocent man in prison. There were multiple ways they could have confirmed Lupe’s age (such as checking with her official porn star information on file in the US, because apparently that’s an official thing that exists for exactly this purpose,) or contacting Lupe herself like Mr. Simon-Timmerman’s lawyer did.

Unfortunately, this is hardly the first time trial “experts” have lied:

The Washington Post published a story so horrifying this weekend that it would stop your breath: “The Justice Department and FBI have formally acknowledged that nearly every examiner in an elite FBI forensic unit gave flawed testimony in almost all trials in which they offered evidence against criminal defendants over more than a two-decade period before 2000.”

“Of 28 examiners with the FBI Laboratory’s microscopic hair comparison unit, 26 overstated forensic matches in ways that favored prosecutors in more than 95 percent of the 268 trials reviewed so far.” …

Santae Tribble served 28 years for a murder based on FBI testimony about a single strand of hair. He was exonerated in 2012. It was later revealed that one of the hairs presented at trial came from a dog.

Professor Nichols, you want to know, I assume, why we plebes are so distrustful of experts like you. Put yourself, for a moment, in the shoes of an ordinary person accused of a crime. You don’t have a forensics lab. Your budget for expert witnesses is pretty limited. Your lawyer is likely a public defender.

Do you trust that these experts are always right, even though they are often hired by people who have a lot more money than you do? Do you think there is no way these experts could be biased toward the people paying them, or that the side with more money to spend on experts and labs of its own couldn’t produce more evidence favorable to itself than the other side can?

Now let’s expand our scope: how do you think ordinary people think about climate scientists, medical drug studies, or military intelligence? Unlike drug companies, we commoners don’t get to hire our own experts. Do you think Procter & Gamble never produces research that is biased toward its own interests? Of course it does; that’s why researchers have to disclose any money they’ve received from drug companies.

From the poor man’s perspective, it looks like all research is funded by rich men, and none by poor men. It is sensible to worry, therefore, that the results of this research are inherently biased toward those who already have plenty of status and wealth.

The destruction of expertise: “Studies” Departments

Here is a paper published in a real, peer-reviewed academic journal:

Towards a truer multicultural science education: how whiteness impacts science education, by Paul T. Le (a doctoral candidate in the Department of Integrative and Systems Biology at the University of Colorado) and Cheryl Matias (an associate professor at the School of Education and Human Development, University of Colorado) (h/t Real Peer Review):

The hope for multicultural, culturally competent, and diverse perspectives in science education falls short if theoretical considerations of whiteness are not entertained. [Entertained by whom?] Since whiteness is characterized [by whom?] as a hegemonic racial dominance that has become so natural it is almost invisible, this paper identifies how whiteness operates in science education such that [awkward; “to such an extent that”] it falls short of its goal for cultural diversity. [“Cultural diversity” is not one of science education’s goals] Because literature in science education [Which literature? Do you mean textbooks?] has yet to fully entertain whiteness ideology, this paper offers one of the first theoretical postulations [of what?]. Drawing from the fields of education, legal studies, and sociology, [but not science?] this paper employs critical whiteness studies as both a theoretical lens and an analytic tool to re-interpret how whiteness might impact science education. Doing so allows the field to reconsider benign, routine, or normative practices and protocol that may influence how future scientists of Color experience the field. In sum, we seek to have the field consider the theoretical frames of whiteness and how it [use “whiteness” here instead of “it” because there is no singular object for “it” to refer to in this sentence] might influence how we engage in science education such that [“to such an extent that”] our hope for diversity never fully materializes.

Apologies for the red pen; you might think that someone at the “School of Education” could write a grammatical sentence and the people publishing peer-reviewed journals would employ competent editors, but apparently not.

If these are “experts,” then expertise is dead with a stake through its heart.

But the paper goes on!

The resounding belief that science is universal and objective hides the reality that whiteness has shaped the scientific paradigm.

See, you only think gravity pulls objects toward the earth at a rate of 9.8 m/s^2 because you’re white. When black people drop objects off the Leaning Tower of Pisa, they fall at 10 m/s^2. Science textbooks and educators only teaching the white rate and refusing to teach the black rate is why no black nation has successfully launched a man into space.

Our current discourse believes that science and how we approach experimentation and constructing scientific explanations is unbiased, and on the surface, it may seem justified (Kelly 2014). However, this way of knowing science in the absence of other ways of knowing only furthers whiteness an White supremacy through power and control of science knowledge. As a result, our students of Color are victims of deculturization, and their own worldviews are invalidated, such as described by Ladson-Bilings (1998a).

For example, some Aboriginal people in Australia believe that cancer is caused by curses cast by other people or a spiritual punishment for some misdeed the sufferer committed. Teaching them that cancer is caused by mutated cells that have begun reproducing out of control and can’t be caused by a curse is thus destroying a part of their culture. Since all cultures are equally valuable, we must teach that the Aboriginal theory of cancer-curses and the white theory of failed cellular apoptosis are equally true.

Or Le and Matias are full of shit. Le doesn’t have his PhD, yet, so he isn’t an official expert, but Matias is a professor with a CV full of published, peer-reviewed articles on similar themes.

You might say I’ve cherry-picked a particularly bad article, but give me 10 minutes and I’ll get you 100 more that are just as bad. Here’s one on “the construction of race in contemporary PE curriculum policy.”

Every single degree awarded and paper published on such garbage degrades the entire concept of “experts.” Sure, Nichols is a professor–and so is Matias. As far as our official system for determining expertise is concerned, Nichols, Matias, and Stephen Hawking are all “experts.”

And this matters, because the opinions of garbage experts get cited in places like the NY Times, and then picked up by other journalists and commentators as though they were some kind of valid proof backing up their points. Take this case, “Extensive Data Shows Punishing Reach of Racism for Black Boys”:

Black boys raised in America, even in the wealthiest families and living in some of the most well-to-do neighborhoods, still earn less in adulthood than white boys with similar backgrounds, according to a sweeping new study that traced the lives of millions of children.

White boys who grow up rich are likely to remain that way. Black boys raised at the top, however, are more likely to become poor than to stay wealthy in their own adult households.

(Oh, look, someone discovered regression to the mean.)

What happens when blue-check Twitter reports on this piece?

    1. You don’t need an “expert” to tell you that black men might get discriminated against.
    2. How do you become an “expert” in anti-racism? Do you have to pass the implicit bias test? Get a degree in anti-racist studies?
    3. Do you think, for whatever reason, that a guy who gets paid to do anti-racist research might come up with “racism” as an answer to almost any question posed?
    4. “The guy who gets paid to say that racism is the answer said the answer is racism” does not actually prove that racism is the answer, but it is being presented like it does.
    5. Blue check has failed to mention any obvious counters, like:
      a. Mysteriously, this “racism” only affects black men and not black women (this is why we’ve had a black female president but not a black male one, right?)
      b. Regression to the mean is a thing and we can measure it (briefly: The further you are from average for your group on any measure [height, intelligence, income, number of Daleks collected, etc.,] the more likely your kids are to be closer to average than you are. [This is why the kids of Nobel prize winners, while pretty smart on average, are much less likely to win Nobels than their parents.] Since on average blacks make a lot less money than whites, any wealthy black family is significantly further from the average black income than a white family with the same amount of money is from the average white income. Therefore at any high income level, we expect black kids to regress harder toward the black mean than white kids raised at the same level. La Griffe du Lion [a statistics expert] has an article that goes into much more depth and math on regression to the mean and its relevance. See also the toy simulation just after this list.)
      c. Crime rates. Black men commit more crime than black women or white men, and not only does prison time cut into employment, but most employers don’t want to employ people who’ve committed a crime. This makes it easier for black women to get jobs and build up wealth than black men. (The article itself does mention that “The sons of black families from the top 1 percent had about the same chance of being incarcerated on a given day as the sons of white families earning $36,000,” but yeah, it’s probably just totally irrational discrimination keeping black men out of jobs.)
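
Since regression to the mean trips up even statistically literate readers, here is a minimal toy simulation of the argument in point (b). It is a sketch, not a model of the actual study: the group means, the noise level, and the 0.5 “regression” coefficient are all invented for illustration:

```python
# Toy model of regression to the mean for two groups with different average
# incomes. Every number here is invented for illustration -- these are NOT
# estimates from the study discussed above.
import random

random.seed(42)

GROUP_MEANS = {"white": 60_000, "black": 40_000}  # hypothetical group means
REGRESSION = 0.5    # fraction of the parent's deviation the child retains
NOISE_SD = 10_000   # random variation in child outcomes

def child_income(group, parent_income):
    """Child's expected income sits partway between parent and group mean."""
    mean = GROUP_MEANS[group]
    return mean + REGRESSION * (parent_income - mean) + random.gauss(0, NOISE_SD)

# Two sets of parents with the SAME high income, one from each group:
parent_income = 100_000
for group in GROUP_MEANS:
    kids = [child_income(group, parent_income) for _ in range(100_000)]
    print(group, round(sum(kids) / len(kids)))

# Prints roughly: white 80000, black 70000. Identical parental incomes, but
# the group whose mean lies further below the parents' income regresses
# further -- a gap appears in this toy model with no discrimination term at all.
```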

“Experts” like this get used to trot out a simple, narrative-supporting line that the paper wants to push rather than to give any real or uncomfortable analysis of a complex issue. It’s dishonest reporting and contributes to the notion that “expert” doesn’t mean all that much.

Source

Leaded Gas:

Tetraethyllead (aka lead) was added to automobile fuels beginning in the 1920s as an anti-knock agent–it raised octane, letting engines run at higher compression. For half a century, automobiles belched brain-damaging lead into the atmosphere, until the Clean Air Act in the 70s forced gas companies to cut back.

Here’s a good article discussing the leaded gas and crime correlation.

Plenty of people knew lead is poisonous–we’ve known that since at least the time of the Romans–so how did it end up in our gas? Well, those nice scientists over at the auto manufacturers reassured us that lead in gasoline was perfectly safe, and then got themselves on a government panel intended to evaluate the safety of leaded gas and came to the same conclusion. Wired has a thorough history:

But fearing that such [anti-leaded gas] measures would spread, … the manufacturing companies demanded that the federal government take over the investigation and develop its own regulations. U.S. President Calvin Coolidge, a Republican and small-government conservative, moved rapidly in favor of the business interests.

… In May 1925, the U.S. Surgeon General called a national tetraethyl lead conference, to be followed by the formation of an investigative task force to study the problem. That same year, Midgley [the inventor of leaded gas] published his first health analysis of TEL, which acknowledged a minor health risk at most, insisting that the use of lead compounds, “compared with other chemical industries, is neither grave nor inescapable.”

It was obvious in advance that he’d basically written the conclusion of the federal task force. That panel only included selected industry scientists like Midgley. It had no place for Alexander Gettler or Charles Norris [scientists critical of leaded gas] or, in fact, anyone from any city where sales of the gas had been banned, or any agency involved in producing that first critical analysis of tetraethyl lead.

In January 1926, the public health service released its report, which concluded that there was “no danger” posed by adding TEL to gasoline … “no reason to prohibit the sale of leaded gasoline” as long as workers were well protected during the manufacturing process.

The task force did look briefly at risks associated with everyday exposure by drivers, automobile attendants, and gas station operators, and found that it was minimal. The researchers had indeed found lead residues in dusty corners of garages. In addition, all the drivers tested showed trace amounts of lead in their blood. But a low level of lead could be tolerated, the scientists announced. After all, none of the test subjects showed the extreme behaviors and breakdowns associated with places like the looney gas building. And the worker problem could be handled with some protective gear.

I’m not sure how many people were killed globally by leaded gas, but Wired notes:

It was some fifty years later – in 1986 – that the United States formally banned lead as a gasoline additive. By that time, according to some estimates, so much lead had been deposited into soils, streets, building surfaces, that an estimated 68 million children would register toxic levels of lead absorption and some 5,000 American adults would die annually of lead-induced heart disease.

The UN estimates that the elimination of lead in gas and paint has added $2.4 trillion annually to the global economy.

Leaded gas is a good example of a case where many experts did know it was poisonous (as did many non-experts,) but this wasn’t the story the public heard.

Pluto

Yes, this one is silly, but I have relatives who keep bringing it up. “Scientists used to say there are 9 planets, but now they say there are only 8! Scientists change what they think all the time!”

Congratulations, astronomers, they think you lost Pluto. Every single time I try to discuss science with these people, they bring up Pluto. Scientific consensus is meaningless in a world where planets just disappear. “Whoops! We miscounted!”

(No one ever really questioned Pluto’s planetary status before it was changed, but a few die-hards refuse to accept the new designation.)

Scientists weren’t actually wrong about Pluto (“planet” is just a category scientists made up and that they decided to redefine to make it more useful,) but the matter confused people and it seemed like scientific consensus was arbitrary and could change unexpectedly.

Unfortunately, normal people who don’t have close contact with science or scientists often struggle to understand exactly what science is and how it advances. They rely, sporadically, on intermediaries like The History Channel or pop science journalists to explain it to them, and these guys like to run headlines like “5 things Albert Einstein got Totally Wrong” (haha that Albert, what a dummy, amirite?)

So when you question why people distrust experts like you, Professor Nichols, consider whether the other “experts” they’ve encountered have been trustworthy or even correct, or if they’ve been liars and shills.

Re Nichols: Times the Experts were Wrong

In preparation for our review of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters, I wanted to make a list of “times the experts were wrong.” Professor Nichols, if you ever happen to read this, I hope it gives you some insight into where we, the common people, are coming from. If you don’t happen to read it, it still gives me a baseline before reading your book.

Nichols devotes a chapter to the subject–expert failures are, he claims, “rare but spectacular when they do happen, like plane crashes.” (I may be paraphrasing slightly.)

How often are the experts wrong? (And how would we measure that?)

For starters, we have to define what “experts” are. Nichols might define experts as, “anyone who has a PhD in a thing or has worked in that field for 10 years,” but the general layman is probably much laxer in his definitions.

Now, Nichols’s argument that “experts” are correct most of the time probably is correct, at least if we use a conservative definition of “expert”. We live in a society that is completely dependent on the collective expertise of thousands if not millions of people, and yet that society keeps running. For example, I do not know how to build a road, but road-building experts do, and our society has thousands of miles of functional roads. They’re not perfect, but they’re a huge improvement over dirt paths. I don’t know how to build a car, but car-building experts do, and so society is full of cars. From houses to skyscrapers, smartphones to weather satellites, electricity to plumbing: most of the time, these complicated systems get built and function perfectly well. Even airplanes, incredibly, don’t fall out of the sky most of the time (and according to Steven Pinker, they’re getting even better at it.)

But these seem like the kind of experts that most people don’t second-guess too often (“I think you should only put three wheels on the car–and make them titanium,”) nor is this the sort of questioning that I think Nichols is really concerned about. Rather, I think Nichols is concerned about people second-guessing experts like himself whose opinions bear not on easily observed, physical objects like cars and roads but on abstract policies like “What should our interest rates be?” or “Should we bomb Syria?”

We might distinguish here between practical experts employed by corporations, whose expertise must be “proven” via production of actual products that people actually use, and academic experts whose products are primarily ideas that people can’t touch, test, or interact with.

For ordinary people, though, we must include another form of experts: writers–of newspapers, magazines, TV programs, textbooks, even some well-respected bloggers. Most people don’t read academic journals or policy papers. They read Cosmo and watch daytime talk shows, not because they “hate experts” but because this is the level of information they can understand.

In other words, most people probably think Cosmo’s “style expert” and Donald Trump are as much “experts” as Tom Nichols. Trump is a “business expert” who is so expert he not only has a big tower with his name on it, they even let him hire and fire people on TV! Has anyone ever trusted Nichols’s expertise enough to give him a TV show about it?

Trump Tower is something people can touch–the kind of expertise that people trust. Nichols’s expertise is the Soviet Union (now Russia) and how the US should approach the threat of nuclear war and deterrence–not things you can easily build, touch, and test.

Nichols’s idea of “experts” is probably different from the normal person’s idea of “experts.” Nichols probably uses metrics like “How long has this guy been in the field?” and “Which journals has he been published in?” while normal people use metrics like “Did CNN call him an expert?” and “Did I read it in a magazine?” (I have actually witnessed people citing margarine advertisements as “nutrition advice.”)

If anything, I suspect the difference between “normal people’s idea of expert” and “Nichols’s idea of experts” is part of the tension Nichols is feeling, as for the first time, ordinary people like me who would in the past have been limited largely to discussing the latest newspaper headlines with friends can now pull up any academic’s CV and critique it online. “The people,” having been trained on daytime TV and butter ads, can now critique foreign policy advisers…

Let’s sort “people who distrust experts” into three main categories:

  1. Informed dissenters: People who have read a lot on a particular topic and have good reason to believe the expert consensus is wrong, eg, someone involved in nutrition research who began sounding warning bells about the dangers of partially hydrogenated fats in the ’80s.
  2. General contrarians: Other people are wrong. Music has been downhill ever since the Beatles. The schools are failing because teachers are dumb. Evolution isn’t real. Contrarians like to disagree with others and sometimes they’re correct.
  3. Tinfoil hatters: CHEMTRAILS POISON YOU. The Tinfoil hatters don’t think other people are dumb; they think others are actively conspiring against them.

People can fall into more than one category–in fact, being a General Contrarian by nature probably makes it much easier to be an Informed Dissenter. Gregory Cochran, for example, probably falls into both categories. (Scott Alexander, by contrast, is an informed dissenter but not contrarian.)

Tinfoil hatters are deprecated, but even they are sometimes correct. If a Jew in 1930’s Germany had said, “Gee, I think those Germans have it out for us,” they’d have been correct. A white South African today who thinks the black South Africans have it out for them is probably also correct.

So the first question is whether more people actually distrust experts, or if the spread of the internet has caused Nichols to interact with more people who distrust experts. For example, far more people in the 80s were vocally opposed to the entire concept of “evolution” than are today, but they didn’t have the internet to post on. Nichols, a professor at the US Naval War College and the Harvard Extension School, probably doesn’t interact in real life with nearly as many people who are actively hostile to the entire edifice of modern science as the Kansas State Board of Education does, and thus he may have been surprised to finally encounter these people online.

But let’s get on with our point: a few cases where “the experts” have failed:

Part 1: Medicine and Doctors

Trans Fats

Artificially created trans (or partially hydrogenated) fats entered the American diet in large quantities in the 1950s. Soon nutrition experts, dieticians, healthcare philanthropists, and the federal government itself were all touting the trans fat mantra: trans fats like margarine or Crisco were healthier and better for you than the animal fats like butter or lard traditionally used in cooking.

Unfortunately, the nutrition experts were wrong. Trans fats are deadly. According to a study published in 1993 by the Harvard School of Public Health, trans fats are probably responsible for about 100,000 deaths a year–or a million every decade. (And that’s not counting the people who had heart attacks and survived because of modern medical care.)

The first people to question the nutritional orthodoxy on trans fats (in any quantity) were probably the General Contrarians: “My grandparents ate lard and my parents ate lard and I grew up eating lard and we turned out just fine! We didn’t have ‘heart attacks’ back in the ’30s.” After a few informed dissenters started publishing studies questioning the nutritional orthodoxy, nutrition’s near-endless well of tinfoil hatters began promoting their findings (if any field is perfect for paranoia about poisons and contaminants, well, it’s food.)

And in this case, the tinfoil hatters were correct: corporations really were promoting the consumption of something they by then knew was killing people, just because it made them money.

Tobacco

If you’re old enough, you remember not only the days of Joe Camel, but also Camel’s ads heavily implying that doctors endorsed smoking. Dentists recommended Viceroys, the filtered cigarettes. Camels were supposed to “calm the nerves” and “aid the digestion.” Physicians recommended “mell-o-wells,” the “health cigar.” Some brands were even supposed to cure coughs and asthma.

Now, these weren’t endorsements from actual doctors–if anything, the desire to give cigarettes a healthy sheen was probably driven by the accumulating evidence that they weren’t healthy–but when my grandmother took up smoking, do you think she was reading medical journals? No, she trusted that nice doctor in that Camel ad.

Chesterfield, though, claimed that actual doctors had confirmed that their cigarettes had no adverse health effects.

In the 70s, the tobacco companies found doctors willing to testify not that tobacco was healthy, but that there was no proof–or not enough data–to accuse it of being unhealthy.

Even when called before Congress in the 90s, tobacco companies kept insisting their products weren’t damaging. If the CEO of Philip Morris isn’t an expert on cigarettes, I don’t know who is.

The CDC estimates that 480,000 Americans die due to cigarettes per year, making them one of our leading killers.

Freudianism, recovered memories, multiple personality disorder, and Satanic Daycares

In retrospect, Freudian Psychoanalysis is so absurd, it’s amazing it ever became a widely-believed, mainstream idea. And yet it was.

For example:

In the early 1890s, Freud used a form of treatment based on the one that Breuer had described to him, modified by what he called his “pressure technique” and his newly developed analytic technique of interpretation and reconstruction. According to Freud’s later accounts of this period, as a result of his use of this procedure most of his patients in the mid-1890s reported early childhood sexual abuse. He believed these stories, which he used as the basis for his seduction theory, but then he came to believe that they were fantasies. He explained these at first as having the function of “fending off” memories of infantile masturbation, but in later years he wrote that they represented Oedipal fantasies, stemming from innate drives that are sexual and destructive in nature.[121]

Another version of events focuses on Freud’s proposing that unconscious memories of infantile sexual abuse were at the root of the psychoneuroses in letters to Fliess in October 1895, before he reported that he had actually discovered such abuse among his patients.[122] In the first half of 1896, Freud published three papers, which led to his seduction theory, stating that he had uncovered, in all of his current patients, deeply repressed memories of sexual abuse in early childhood.[123] In these papers, Freud recorded that his patients were not consciously aware of these memories, and must therefore be present as unconscious memories if they were to result in hysterical symptoms or obsessional neurosis. The patients were subjected to considerable pressure to “reproduce” infantile sexual abuse “scenes” that Freud was convinced had been repressed into the unconscious.[124] Patients were generally unconvinced that their experiences of Freud’s clinical procedure indicated actual sexual abuse. He reported that even after a supposed “reproduction” of sexual scenes the patients assured him emphatically of their disbelief.[125]

To sum up: Freud became convinced that patients had suffered sexual abuse.

The patients replied emphatically that they had not.

Freud made up a bunch of sexual abuse scenarios.

The patients insisted they remembered nothing of the sort.

Freud decided the memories must just be repressed.

Later, Freud decided the sexual abuse never actually happened, but that the repressed, inverted memories were of children masturbating to the thought of having sex with their parents.

So not only was Freud’s theory derived from nothing–directly contradicted by the patients he supposedly based it on–he took it a step further and actually denied the stories of patients who had been sexually abused as children.

Freud’s techniques may have been kinder than the psychology of the 1800s, which AFAIK involved locking insane people in asylums and stomping them to death, but there remains a cruel perversity to insisting that people have memories of horrible experiences they swear they don’t, and then turning around and saying that horrible things they clearly remember never happened.

Eventually Freudian psychoanalysis and its promise of “recovering repressed memories” morphed into the recovered traumatic memory movement of the 1980s, in which psychologists used hypnosis to convince patients they had been the victims of a vast world-wide Satanic conspiracy and that they had multiple, independent personalities that could only be accessed via hypnosis.

The Satanic Daycare conspiracy hysteria resulted in the actual conviction and imprisonment of real people for crimes like riding broomsticks and sacrificing elephants, despite a total lack of local dead elephants. Judges, lawyers, juries, and prosecutors found the testimony of “expert” doctors and psychologists (and children) convincing enough to put people in prison for running an underground, global network of “Satanic Daycares” that were supposedly raping and killing children. Eventually the hysteria got so bad that the FBI got involved, investigated, and found a big fat nothing. No sacrificial altars. No secret basements full of Satanic paraphernalia and torture devices. No dead elephants or giraffes. No magic brooms. No dead infants.

Insurance companies began investigating the extremely expensive claims of psychologists treating women with “multiple personality disorder” (many of whom had so degenerated while in the psychologists’ care that they had gone from employed, competent people to hospitalized mental patients.) Amazingly, immediately after insurance companies decided the whole business was a scam and stopped paying for the treatment, the patients got better. Several doctors were sued for malpractice, and MPD was removed from the DSM, the official list of psychological conditions. (It has been replaced with DID, or dissociative identity disorder.)

I wrote about the whole sordid business at length in Satanic Daycares: the scandal that should have never been, Part Two, and Part Three.

(Ironically, people attack psychiatry’s use of medications like Prozac, but if anything, these are the most evidence-based parts of mental care. At least you can collect data on things like “Does Prozac work better than placebo for making people feel better?” unlike Freudian psychoanalysis, which contained so many levels of “repression” and “transference” that there was always a ready excuse for why it wasn’t working–or for why “the patient got worse” was actually exactly what was supposed to happen.)

All Doctors pre-1900

One of West Hunter’s frequent themes is just how bad pre-modern medicine was:

Between 1839 and 1847, the First Clinic at the Vienna General Hospital had 20,204 births and 1,989 maternal deaths. The Second Clinic, attended by midwives, had 17,791 births and 691 maternal deaths. An MD’s care conferred an extra 6% chance of death. Births at home were even safer, with maternal mortality averaging about 0.5%.

In that period, MDs caused about 1200 extra deaths. …

We know that wounded men in the Civil War had a better chance of surviving when they managed to hide from Army surgeons. Think how many people succumbed to bloodletting, over the centuries.
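
To spell out the arithmetic in that quote: 1,989 deaths out of 20,204 births is about 9.8% maternal mortality under the doctors, versus 691 out of 17,791, or about 3.9%, under the midwives. The gap of roughly 5.9 percentage points is the “extra 6% chance of death,” and applied to the First Clinic’s 20,204 births it works out to about 1,200 excess deaths.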

Ever wondered why Christian Scientists, who are otherwise quite pro-science, avoid doctors? It’s because their founder, Mary Baker Eddy (born in 1821) was often sick as a child. Her concerned parents dragged her to every doctor they could find, but poor Mary found that she got better when she stopped going to the doctors.

West Hunter gives a relevant description of pre-modern medicine:

Back in the good old days, Charles II, age 53, had a fit one Sunday evening, while fondling two of his mistresses.

Monday they bled him (cupping and scarifying) of eight ounces of blood. Followed by an antimony emetic, vitriol in peony water, purgative pills, and a clyster. Followed by another clyster after two hours. Then syrup of blackthorn, more antimony, and rock salt. Next, more laxatives, white hellebore root up the nostrils. Powdered cowslip flowers. More purgatives. Then Spanish Fly. They shaved his head and stuck blistering plasters all over it, plastered the soles of his feet with tar and pigeon-dung, then said good-night.

Tuesday. Ten more ounces of blood, a gargle of elm in syrup of mallow, and a julep of black cherry, peony, crushed pearls, and white sugar candy.

Wednesday. Things looked good: only senna pods infused in spring water, along with white wine and nutmeg.

Thursday. More fits. They gave him a spirituous draft made from the skull of a man who had died a violent death. Peruvian bark, repeatedly, interspersed with more human skull. Didn’t work.

Friday. The king was worse. He tells them not to let poor Nelly starve. They try the Oriental Bezoar Stone, and more bleeding. Dies at noon.

Homeopathy has a similar history: old medicines were so often poisonous that even if some of them worked, on average, you were probably better off eating sugar pills (which did nothing) than taking “real” medicines. But since people can’t market “pills with nothing in them,” homeopathy’s strange logic of “diluting medicine makes it stronger” was used to give the pills a veneer of doing something. (Freudian psychotherapy, to the extent that it “helped” anyone, was probably similar. Not that the practitioner himself brought anything to the table, but the idea of “I am having treatment so I will get better” plus the opportunity to talk about your problems probably helped some people.)

Today, “alternative” medical treatments like homeopathy and “faith healing” are less effective than conventional medicine, but for most of the past 2,000 years or so, you’d have been better off distrusting the “experts” (ie doctors) than trusting them.

It was only in the 20th century that doctors (or researchers) developed enough technology–vaccines, antibiotics, the germ theory of disease, nutrition, insulin, trauma care, etc.–that doctors began saving more lives than they cost, but the business was still fraught:

[Chart omitted; see the source PDF linked in the original post.]

Disclaimer: I have had the whole birth trifecta: natural birth without medication, vaginal birth with medication, and c-section. Natural birth was horrifically painful and left me traumatized. The c-section, while medically necessary, was almost as terrible. Recovery from natural (and medicated) birth was almost instant–within minutes I felt better; within days I was back on my feet and regaining mobility. The c-section left me in pain for a month, trying to nurse a new baby and care for my other children while on pain killers that made me feel awful and put me to sleep. Without the pain killers, I could barely sit up and get out of bed.

Medically necessary c-sections save lives, perhaps mine. I support them, but I do NOT support medically unnecessary c-sections.

The “international healthcare community” recommends a c-section rate of 10-15% (maybe 19%.) The US rate is over 30%. Half of our c-sections are unnecessary traumas inflicted on women.

In cases where c-sections are not medically necessary (low-risk pregnancies), c-sections carry more than triple the risk of maternal death (13 per 100,000 for c-sections vs. 3.5 per 100,000 for vaginal births.) Medically necessary c-sections, of course, save more lives than they take.

Given 1,258,581 c-sections in the US in 2016, if half of those were unnecessary, then I estimate 60 women per year died from unnecessary c-sections. That’s not the kind of death rate Semmelweis was fighting against when he tried to convince doctors they needed to wash their hands between dissecting corpses and delivering babies. (For his efforts he was branded “a guy who didn’t believe the wisdom of experts” and “crazy,” and was eventually put in an insane asylum and literally stomped to death by the guards. Freudianism looks really good by comparison.)
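
For anyone checking my math on that estimate: half of 1,258,581 is about 629,000 unnecessary c-sections; the excess risk is 13 − 3.5 = 9.5 deaths per 100,000; and 629,000 × 9.5 / 100,000 ≈ 60 deaths per year.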

C-sections have other effects besides just death: they are more expensive, can get infected, and delay recovery. (I’ve also seen data linking them to an increased chance of post-partum depression.) For women who want to have more children, a c-section increases the chances of problems during subsequent pregnancies and deliveries.

Why do we do so many c-sections? Because in the event of misfortune, a doctor is more likely to get sued if he didn’t do a c-section (“He could have done more to save the baby’s life but chose to ignore the signs of fetal distress!”) than if he does do one (“We tried everything we could to save mother and baby.”) Note that this is not what’s in the mother’s best interests, but in the doctor’s.

Although I am obviously not a fan of natural childbirth (I favor epidurals,) I am sympathetic to the movement’s principal logic: avoiding unnecessary c-sections by avoiding the doctors who give them. These women are anti-experts, and I can’t exactly blame them.

At the intersection of the “natural food” and “natural birth” communities we find the anti-vaxers.

Now, I am unabashedly pro-vaccine (though I reserve the right to criticize any particular vaccine,) but I still understand where the anti-vax crew is coming from. If doctors were wrong about blood-letting, are wrong about many c-sections (or push them on unsuspecting women to protect their own bottom lines), and were just plain wrong for decades about dangerous but lucrative artificial fats that they actively pushed people to eat, who’s to say they’re right about everything else? Maybe some of the other chemicals we’re being injected with are actually harmful.

We can point to (and I do) massive improvements in public health and life expectancies as a result of vaccinations, but (anti-vaxers counter) how do we know these outcomes weren’t caused by other things, like the development of water treatment systems and sewers that ensured people weren’t drinking fecal-contaminated water anymore?

(I am also pro-not drinking contaminated water.)

Like concerns about impurities in one’s food, concerns about vaccinations make a certain instinctual sense: it is kind of creepy to inject people (mostly infants) with a serum composed of, apparently, dead germs and “chemicals.” The idea that exposing yourself to germs will somehow make you healthier is counter-intuitive, and hypodermic needles are a well-publicized disease vector.

So even though I think anti-vaxers are wrong, I don’t think they’re completely irrational.


This is the end of Part 1. We’ll continue with Part 2 on Wed.

Totemism and Exogamy, pt. 3/3: Mundas, Khonds, and Herero

Welcome to our final installment of James Frazer’s Totemism and Exogamy, published in 1910. Here are some hopefully interesting excerpts (as usual, quotes are in “” instead of blocks):

Mundas:

Birsa Munda, 1875–1900, “Indian tribal freedom fighter, religious leader, and folk hero who belonged to the Munda tribe.”

“Another large Dravidian tribe of Chota Nagpur who retain totemism and exogamy are the Mundas. Physically they are among the finest of the aboriginal tribes of the plateau. The men are about five feet six in height, their bodies lithe and muscular, their skin of the darkest brown or almost black, their features coarse, with broad flat noses, low foreheads, and thick lips. Thus from the physical point of view the Mundas are pure Dravidians. Yet curiously  enough they speak a language which differs radically from the true Dravidian. … This interesting family of language is now known to be akin to the Mon-Khmer languages of Further India as well as to the Nicobarese and the dialects of certain wild tribes of Malacca. It is perhaps the language which has been longest spoken in India, and may well have been universally diffused over the whole of that country as well as Malacca before the tide of invasion swept it away from vast areas and left it outstanding only in a few places like islands or solitary towers rising from an ocean of alien tongues. …

“Another well-known Dravidian tribe of Bengal among whom totemism combined with exogamy has been discovered are the Khonds, Kondhs, or Kandhs, who inhabit a hilly tract called Kandhmals in Boad, one of the tributary states of Orissa in the extreme south of Bengal. …Their country is wild and mountainous, consisting of a labyrinth of ranges covered with dense forests of sal trees. They are a shy and timid folk, who love their wild mountain gorges and the stillness of jungle life, but eschew contact with the low-landers and flee to the most inaccessible recesses of their rugged highlands at the least alarm. They subsist by hunting and a primitive sort of agriculture, clearing patches of land for cultivation in the forest during the cold weather and firing it in the heat of summer. The seed is sown among the ashes of the burnt forest when the first rains have damped it. After the second year these rude tillers of the soil abandon the land and make a fresh clearing in the woods.

“The cruel human sacrifices which they used to offer to the Earth Goddess in order to ensure the fertility of their fields have earned for the Khonds an unenviable notoriety among the hill tribes of India. These sacrifices were at last put down by the efforts of British officers.”

The text says no more on the subject, but Wikipedia recounts:

Traditionally the Kondh religious beliefs were syncretic, combining totemism, animism, Ancestor worship, shamanism and nature worship. The Kondhs gave highest importance to the Earth goddess, who is held to be the creator and sustainer of the world. Earlier Human Sacrifices called “Meriah” were offered by the Kondh to propitiate the Earth Goddess. In the Kondh society, a breach of accepted religious conduct by any member of their society invited the wrath of spirits in the form of lack of rain fall, soaking of streams, destruction of forest produce, and other natural calamities. Hence, the customary laws, norms, taboos, and values were greatly adhered to and enforced with high to heavy punishments, depending upon the seriousness of the crimes committed. The practise of traditional religion has almost become extinct today.

Meriah sacrifice post

Castes and Tribes of Southern India (1909), assembled by K. Rangachari, recounts:

In another report, Colonel Campbell describes how the miserable victim is dragged along the fields, surrounded by a crowd of half intoxicated Khonds, who, shouting and screaming, rush upon him, and with their knives cut the flesh piecemeal from the bones, avoiding the head and bowels, till the living skeleton, dying from loss of blood, is relieved from torture, when its remains are burnt, and the ashes mixed with the new grain to preserve it from insects. Yet again, he describes a sacrifice which was peculiar to the Khonds of Jeypore. It is, he writes, always succeeded by the sacrifice of three human beings, two to the sun to the east and west of the village, and one in the centre, with the usual barbarities of the Meriah. A stout wooden post about six feet long is firmly fixed in the ground, at the foot of it a narrow grave is dug, and to the top of the post the victim is firmly fastened by the long hair of his head. Four assistants hold his out-stretched arms and legs, the body being suspended horizontally over the grave, with the face towards the earth. The officiating Junna or priest, standing on the right side, repeats the following invocation, at intervals hacking with his sacrificial knife the back part of the shrieking victims neck. O ! mighty Manicksoro, this is your festal day. To the Khonds the offering is Meriah, to kings Junna. On account of this sacrifice, you have given to kings kingdoms, guns and swords. The sacrifice we now offer you must eat, and we pray that our battle-axes may be converted into swords, our bows and arrows into gunpowder and balls ; and, if we have any quarrels with other tribes, give us the victory.

Let’s return to Frazer:

“While totemism combined with exogamy is widely spread among the aboriginal tribes of India, it is remarkable that no single indubitable case of it has been recorded, so far as I know, in all the rest of the vast continent of Asia. In the preceding chapters we have traced this curious system of society and superstition from Australia through the islands of Torres Straits, New Guinea, Melanesia, Polynesia, Indonesia, and India. On the eastern frontier of India totemism stops abruptly, and in our totemic survey of the world we shall not meet with any clear evidence of it again till we pass to Africa or America. If we leave India out of account, Asia, like Europe, is practically a blank in a totemic map of the world.”

EvX: Too bad there’s no MAP. A map would have been useful.

Herero woman

Africa:

“When we pass from Asia to Africa the evidence for the existence of totemism and exogamy again becomes comparatively copious ; for the system is found in vogue among Bantu tribes both of Southern and of Central Africa as well as among some of the pure negroes of the West Coast. We begin with the Herero, Ovaherero, or Damaras as they used to be called, who inhabit German South-West Africa.

“The Herero are a tall finely-built race of nomadic herdsmen belonging to the Bantu stock, who seem to have migrated into their present country from the north and east some hundred and fifty or two hundred years ago. The desert character of the country and its seclusion from the world long combined to preserve the primitive manners of the inhabitants. A scanty and precarious rainfall compels them to shift their dwellings from place to place in order to find pasture for their cattle ; and an arid, absolutely rainless coast of dreary sandhills affords no allurement to the passing mariner to land on the inhospitable shore. … But when the first rains, accompanied by thunderstorms of tremendous violence, have fallen, the whole scene changes as by magic. The wastes are converted into meadows of living green, gay with a profusion of beautiful flowers and fragrant with a wealth of aromatic grasses and herbs … Now is the time when the cattle roam at large on the limitless prairies, and beasts of all kinds descend from their summer haunts in the mountains, bringing life and animation where the silence and solitude of death had reigned before. …

“In their native state the Herero are a purely pastoral people, though round about the mission stations some of them have learned to till the ground. They possess, or used to possess, immense herds of cattle and flocks of sheep and goats. These are the pride and joy of their hearts, almost their idols. Their riches are measured by their cattle ; he who has none is of no account in the tribe. Men of the highest standing count it an honour to tend the kine ; the sons of the most powerful chiefs are obliged to lead for a time the life of simple herdsmen. They subsist chiefly on the milk of their herds, which they commonly drink sour. From a motive of superstition they never wash the milk vessels, believing firmly that if they did so the cows would yield no more milk. Of the flesh they make but little use, for they seldom kill any of their cattle, and never a cow, a calf, or a lamb. Even oxen and wethers are only slaughtered on solemn and festal occasions, such as visits, burials, and the like. Such slaughter is a great event in a village, and young and old flock from far and near to partake of the meat.

“Their huts are of a round beehive shape, about ten feet in diameter. …

“A special interest attaches to the Herero because they are the first people we have met with in our survey who undoubtedly combine totemism with a purely pastoral life ; hitherto the totemic tribes whom we have encountered have been for the most part either hunters or husbandmen…”

EvX: The text claims that the Herero do not wash the vessels they use for holding and storing milk, but if I recall correctly, they actually use urine to this effect, due to their area being quite dry. (Frazer may not have considered urine a cleaning agent, or may have simply been ignorant on this matter.)

Cathedral Round-Up: Should I read Nichols or Pinker?

Harvard Mag had interesting interviews/reviews of both Tom Nichols’s “Death of Expertise” and Steven Pinker’s “Enlightenment Now“.

From the article about Nichols:

Several years ago, Tom Nichols started writing a book about ignorance and unreason in American public discourse—and then he watched it come to life all around him, in ways starker than he had imagined. A political scientist who has taught for more than a decade in the Harvard Extension School, he had begun noticing what he perceived as a new and accelerating—and dangerous—hostility toward established knowledge. People were no longer merely uninformed, Nichols says, but “aggressively wrong” and unwilling to learn. They actively resisted facts that might alter their preexisting beliefs. They insisted that all opinions, however uninformed, be treated as equally serious. And they rejected professional know-how, he says, with such anger. That shook him.

Skepticism toward intellectual authority is bone-deep in the American character, as much a part of the nation’s origin story as the founders’ Enlightenment principles. Overall, that skepticism is a healthy impulse, Nichols believes. But what he was observing was something else, something malignant and deliberate, a collapse of functional citizenship.

What are people aggressively wrong about, and what does he think is causing the collapse of functional citizenship?

The Death of Expertise resonated deeply with readers. … Readers regularly approach Nichols with stories of their own disregarded expertise: doctors, lawyers, plumbers, electricians who’ve gotten used to being second-guessed by customers and clients and patients who know little or nothing about their work. “So many people over the past year have walked up to me and said, ‘You wrote what I was thinking,’” he says.

Sounds like everyone’s getting mansplained these days.

The Death of Expertise began as a cri de coeur on his now-defunct blog in late 2013. This was during the Edward Snowden revelations, which to Nichols’s eye, and that of other intelligence experts, looked unmistakably like a Russian operation. “I was trying to tell people, ‘Look, trust me, I’m a Russia guy; there’s a Russian hand behind this.’ ” But he found more arguments than takers. “Young people wanted to believe Snowden was a hero.”

I don’t have a particular opinion on Snowden because I haven’t studied the issue, but let’s pretend you were in the USSR and one day a guy in the government spilled a bunch of secrets about how many people Stalin was having shot and how many millions were starving to death in Holodomor (the Ukrainian genocide.) (Suppose also that the media were sufficiently free to allow the stories to spread.)

Immediately you’d have two camps: the “This guy is a capitalist spy sent to discredit our dear leader with a hideous smear campaign” and “This guy is totally legit, the people need to know!”

Do you see why “Snowden is a Russian” sounds like the government desperately trying to cover its ass?

Now let’s suppose the guy who exposed Stalin actually was a capitalist spy. Maybe he really did hate communism and wanted to bring down the USSR. Would it matter? As long as the stuff he said was true, would you want to know anyway? I know that if I found out about Holodomor, I wouldn’t care about the identity of the guy who released the information besides calling him a hero.

I think a lot of Trump supporters feel similarly about Trump. They don’t actually care whether Russia helped Trump or not; they think Trump is helping them, and that’s what they care about.

In other words, it’s not so much “I don’t believe you” as “I have other priorities.”

In December, at a JFK Library event on reality and truth in public discourse, a moderator asked him a version of “How does this end?” … “In the longer term, I’m worried about the end of the republic,” he answered. Immense cynicism among the voting public—incited in part by the White House—combined with “staggering” ignorance, he said, is incredibly dangerous. In that environment, anything is possible. “When people have almost no political literacy, you cannot sustain the practices that sustain a democratic republic.” The next day, sitting in front of his fireplace in Rhode Island, where he lives with his wife, Lynn, and daughter, Hope, he added, “We’re in a very perilous place right now.”

Staggering ignorance about what, I wonder. Given our increased access to information, I suspect that the average person today both knows and can easily find the answers to far more questions than the average person of the 80s, 50s, or 1800s.

I mean, in the 80s, we still had significant numbers of people who believed in: faith healing; televangelists; six-day creationism; “pyramid power”; crop circles; ESP; UFOs; astrology; multiple personality disorder; a global Satanic daycare conspiracy; recovered memories; Freudianism; and the economic viability of the USSR. (People today still believe in the last one.)

On the one hand, I think part of what Nichols is feeling is just the old distrust of experts projected onto the internet. People used to harass their local school boards about teaching ‘evilution’; today they harass each other on Twitter over Benghazi or birtherism or Russia collusion or whatever latest thing.

We could, of course, see a general decline in intellectual abilities as the population of the US itself is drawn increasingly from low-IQ backgrounds and low-IQ people (appear to) outbreed the high-IQ ones, but I have yet to see whether this has had time to manifest as a change in the amount of general knowledge people can use and display, especially given our manifestly easier time actually accessing knowledge. I am tempted to think that perhaps the internet forced Nichols outside of his Harvard bubble and he encountered dumb people for the first time in his life.

On the other hand, however, I do feel a definite sense of malaise in America. It’s not about IQ, but how we feel about each other. We don’t seem to like each other very much. We don’t trust each other. Trust in government is low. Trust in each other is low. People have fewer close friends and confidants.

We have material prosperity, yes, despite our economic woes, but there is a spiritual rot.

Both sides are recognizing this, but the left doesn’t understand what is causing it.

They can point at Trump. They can point at angry hordes of Trump voters. “Something has changed,” they say. “The voters don’t trust us anymore.” But they don’t know why.

Here’s what I think happened:

The myth that is “America” got broken.

A country isn’t just a set of laws with a tract of land. It can be that, but if so, it won’t command a lot of sentimental feeling. You don’t die to defend a “set of laws.” A country needs a people.

“People” can be a lot of things. They don’t have to be racially homogeneous. “Jews” are a people, and they are not racially homogeneous. “Turks” are a people, and they are not genetically homogeneous. But fundamentally, people have to see themselves as “a people” with a common culture and identity.

America has two main historical groups: whites and blacks. Before mass immigration kicked off in 1965, whites were about 88% of the country and blacks were about 10%. Indians, Asians, Hispanics, and everyone else rounded out that last 2%. And say what you will, but whites thought of themselves as the American culture, because they were the majority.

America absorbed newcomers. People came, got married, had children: their children became Americans. The process takes time, but it works.

Today, though, “America” is fractured. It is ethnically fractured–California and Texas, for example, are now majority non-white. There is nothing particularly wrong with the folks who’ve moved in; they just aren’t from one of America’s two main historical ethnic groups. They are their own groups, with their own histories. England is a place with a people and a history; Turkey is a place with a people and a history. They are two different places with different peoples and different histories. It is religiously fractured–far fewer people belong to one of America’s historically prominent religions. It is politically fractured–more people now report being uncomfortable with their child dating a member of the other political party than one of a different race.

Now we see things like this: After final vote, city will remove racist Pioneer Monument Statue:

As anticipated, the San Francisco Arts Commission voted unanimously Monday to remove the “Early Days” statue from Civic Center’s Pioneer Monument, placing the century-plus old bronze figures in storage until a long-term decision about their fate can be made.

The decision caps off a six-month long debate, after some San Franciscans approached the commission in August 2017 to complain about the statue, which features a pious but patronizing scene of a Spanish missionary helping a beaten Indian to his feet and pointing him toward heaven.

In February the city’s Historic Preservation Commission voted unanimously to recommend removing “Early Days” despite some commissioners expressing reservations about whether the sculpture has additional value as an exposé of 19th-century racism.

Your statues are racist. Your history is racist. Your people is racist.

What do they think the reaction to this will look like?

 

But before we get too dark, let’s take a look at Pinker’s latest work, Enlightenment Now:

It is not intuitive that a case needs to be made for “Reason, Science, Humanism, and Progress,” stable values that have long defined our modernity. And most expect any attack on those values to come from the far right: from foes of progressivism, from anti-science religious movements, from closed minds. Yet Steven Pinker argues there is a second, more profound assault on the Enlightenment’s legacy of progress, coming from within intellectual and artistic spheres: a crisis of confidence, as progress’s supporters see so many disasters, setbacks, emergencies, new wars re-opening old wounds, new structures replicating old iniquities, new destructive side-effects of progress’s best intentions. …

Pinker’s volume moves systematically through various metrics that reflect progress, charting improvements across the last half-century-plus in areas from racism, sexism, homophobia, and bullying, to car accidents, oil spills, poverty, leisure, female empowerment, and so on. …

the case Pinker seeks to make is at once so basic and so difficult that a firehose of evidence may be needed—optimism is a hard sell in this historical moment. … Pinker credits the surge in such sentiments since the 1960s to several factors. He points to certain religious trends, because a focus on the afterlife can be in tension with the project of improving this world, or caring deeply about it. He points to nationalism and other movements that subordinate goods of the individual or even goods of all to the goods of a particular group. He points to what he calls neo-Romantic forms of environmentalism, not all environmentalisms but specifically those that subordinate the human species to the ecosystem and seek a green future, not through technological advances, but through renouncing current technology and ways of living. He also points to a broader fascination with narratives of decline …

I like the way Pinker thinks and appreciate his use of actual data to support his points.

To these decades-old causes, one may add the fact that humankind’s flaws have never been so visible as in the twenty-first century. … our failures are more visible than ever through the digital media’s ceaseless and accelerating torrent of grim news and fervent calls to action, which have pushed many to emotional exhaustion. Within the last two years, though not before, numerous students have commented in my classroom that sexism/racism/inequality “is worse today than it’s ever been.” The historian’s answer, “No, it used to be much worse, let me tell you about life before 1950…,” can be disheartening, especially when students’ rage and pain are justified and real. In such situations, Pinker’s vast supply of clear, methodical data may be a better tool to reignite hope than my painful anecdotes of pre-modern life.

Maybe Nichols is on to something about people today being astoundingly ignorant…

Pinker’s celebration of science is no holds barred: he calls it an achievement surpassing the masterworks of art, music, and literature, a source of sublime beauty, health, wealth, and freedom.

I agree with Pinker on science, but Nichols’s worldview may be the one that needs plumbing.

Which book do you want me to read/review?

The Progressive Mind Virus Spreads to… India?

As ANI (Asian News International) reports on Twitter (h/t Rohit):

For those of you reading this in the future, after the 15 minutes of manufactured furor have subsided, #MarchForOurLives is an anti-gun/pro-gun-control movement in the US. Gun laws in India are notably much stricter than gun laws in the US, and yet–

The thing that looks like a mushroom is the internal part of a uterus; you can see the rest of the drawing faintly around it. As noted, this is completely backwards from the reality in India, where it is nearly impossible to buy a gun but abortions are extremely common and completely legal. So where did the marchers in Mumbai get this sign?

Well, it’s a meme, found on Twitter, Instagram, T-shirts, and of course on signs at pussyhat rallies in the US. It’s not even true in the US, but at least it kind of makes sense given our frequent debates over both guns and abortions. Certainly there are some people in the US who think abortions should be completely illegal. India, by contrast, is a nation where slowing the growth rate to prevent famine is a high priority and abortions are quite legal.

I am reminded of that time Michelle Obama tweeted #BringBackOurGirls in support of Nigerians kidnapped by Boko Haram:

This is the signature of a mind-virus: it makes you repeat things that make no sense in context. It makes you spread the virus even though it does not make logical sense for you, personally, to spread it. Michelle Obama is married to a man who controlled, at the time, the world’s largest military, including an enormous stockpile of nuclear weapons, and yet she was tweeting ineffective hashtags to be part of the #movement.

Likewise, the state of gun (and abortion) laws in India is nothing like their state in the US, yet Indians are getting sucked into spreading our viral memes.

Horizontal meme transfer–like social media–promotes the spread of memetic viruses.
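The disease analogy here can be made concrete with a toy model. Below is a minimal sketch in Python (all function names and parameters are my own illustrative inventions, not drawn from any real data) contrasting vertical transmission, where a meme passes only from parent to child once per generation, with horizontal transmission, where a meme passes between random peers many times per “generation”:

```python
import random

def vertical_spread(pop_size=1000, carrier_frac=0.01, generations=10):
    """Vertical transmission: a meme passes only from parent to child,
    so prevalence changes at most once per generation (here carriers
    are assumed to out-reproduce non-carriers by a modest 10%)."""
    carriers = int(pop_size * carrier_frac)
    history = [carriers]
    for _ in range(generations):
        carriers = min(pop_size, int(carriers * 1.1))
        history.append(carriers)
    return history

def horizontal_spread(pop_size=1000, carrier_frac=0.01, rounds=10,
                      contacts_per_round=5, p_transmit=0.2):
    """Horizontal transmission: each round, every carrier exposes a few
    random peers, any of whom may adopt the meme (roughly a simple
    susceptible-infected epidemic model)."""
    infected = set(random.sample(range(pop_size), int(pop_size * carrier_frac)))
    history = [len(infected)]
    for _ in range(rounds):
        newly_infected = set()
        for _carrier in infected:
            for _ in range(contacts_per_round):
                peer = random.randrange(pop_size)
                if peer not in infected and random.random() < p_transmit:
                    newly_infected.add(peer)
        infected |= newly_infected
        history.append(len(infected))
    return history

if __name__ == "__main__":
    random.seed(42)
    print("vertical:  ", vertical_spread())
    print("horizontal:", horizontal_spread())
```

Starting from the same 1% of carriers, ten generations of vertical transmission barely move the needle, while ten rounds of horizontal contact can carry the meme to most of the population. Nothing in this toy model is calibrated to real meme dynamics; it just illustrates why a medium built on constant peer-to-peer contact favors whichever memes spread best, regardless of whether they make sense for the people spreading them.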

Totemism and Exogamy pt 2/3: Plagues, Polyandry, and Infanticide

Welcome back to James Frazer’s Totemism and Exogamy, published in 1910. Here are some hopefully interesting excerpts (as usual, quotes are in “” instead of blocks):

“When an ox or a buffalo dies, the Madigas gather round it like vultures, strip off the skin and tan it, and batten on the loathsome carrion. Their habits are squalid in the extreme and the stench of their hamlets is revolting. They practice various forms of fervent but misguided piety, lying on beds of thorns, distending the mouth with a mass of mud as large as a cricket-ball, bunging up their eyes with the same stuff, and so forth, thereby rendering themselves perhaps well-pleasing to their gods but highly disgusting to all sensible and cleanly men.

“An unmarried, but not necessarily chaste, woman of the caste personifies the favourite goddess Matangi, whose name she bears and of whom she is supposed to be an incarnation. Drunk with toddy and enthusiasm, decked with leaves of the margosa tree (Melia Azadirachta), her face reddened with turmeric, this female incarnation of the deity dances frantically, abuses her adorers in foul language, and bespatters them with her spittle, which is believed to purge them from all uncleanness of body and soul. Even high-class Reddis, purse-proud Komatis, and pious Brahmans receive the filthy eructations of this tipsy maniac with joy and gratitude as outpourings of the divine spirit.

“When an epidemic is raging, the Madigas behead a buffalo before the image of their village goddess Uramma and a man carries the blood-reeking head in procession on his own head round the village, his neck swathed in a new cloth which has been soaked in the buffalo’s blood. This is supposed to draw a cordon round the dwellings and to prevent the irruption of evil spirits. The villagers subscribe to defray the expense of the procession. If any man refuses to pay, the bloody head is not carried round his house, and the freethinker or niggard is left to the tender mercies of the devils.

“The office of bearer of the head is an ill-omened and dangerous one; for huge demons perch on the tops of tall trees ready to swoop down on him and carry him and his bleeding burden away. To guard against this catastrophe ropes are tied to his body and arms, and men hang on like grim death to the ends of them.
… ”

15 So the Lord sent a pestilence upon Israel from the morning even to the time appointed: and there died of the people from Dan even to Beersheba seventy thousand men. …

18 And Gad came that day to David, and said unto him, Go up, rear an altar unto the Lord in the threshing floor of Araunah the Jebusite. …

So David bought the threshingfloor and the oxen for fifty shekels of silver.

25 And David built there an altar unto the Lord, and offered burnt offerings and peace offerings. So the Lord was intreated for the land, and the plague was stayed from Israel. –2 Samuel, 24

EvX: There’s not a whole lot of information on Wikipedia about the Madigas aside from the fact that they are one of India’s Scheduled Castes and were “historically marginalized and oppressed.” Since the tanning of leather is (or at least was) a really rank process, leather-tanning communities have historically faced a fair amount of discrimination.

The use of sacrifice to end plagues is a fascinating part of older religions (and most unfortunate for the sacrificed). I’ve long thought that Beowulf was really a story about a plague (personified as Grendel/Grendel’s mother) appeased by sacrificing a warrior and throwing his body into a lake or bog. Of course, the warrior isn’t supposed to “die” but to travel to the spirit realm and slay the evil spirit causing the plague:

There came unhidden
tidings true to the tribes of men,
in sorrowful songs, how ceaselessly Grendel
harassed Hrothgar, what hate he bore him,
what murder and massacre, many a year,
feud unfading, — refused consent
to deal with any of Daneland’s earls,
make pact of peace, or compound for gold: …
But the evil one ambushed old and young
death-shadow dark, and dogged them still,
lured, or lurked in the livelong night
of misty moorlands: men may say not
where the haunts of these Hell-Runes be. …
Many nobles
sat assembled, and searched out counsel
how it were best for bold-hearted men
against harassing terror to try their hand.
Whiles they vowed in their heathen fanes
altar-offerings, asked with words
that the slayer-of-souls would succor give them
for the pain of their people. …
Beowulf spoke: … with Hrunting [sword] I
seek doom of glory, or Death shall take me.”
After these words the Weder-Geat lord
boldly hastened, biding never
answer at all: the ocean floods
closed o’er the hero. Long while of the day
fled ere he felt the floor of the sea.
Soon found the fiend who the flood-domain
sword-hungry held these hundred winters,
greedy and grim, that some guest from above,
some man, was raiding her monster-realm.

In Plague and the End of Antiquity: The Pandemic of 541-750, Stoclet quotes Sturluson’s Ynglinga saga:

Domald took the heritage after his father Visbur, and ruled over the land. As in his time there was great famine and distress, the Swedes made great offerings of sacrifice at Upsal. The first autumn they sacrificed oxen, but the succeeding season was not improved thereby. The following autumn they sacrificed men, but the succeeding year was rather worse. The third autumn, when the offer of sacrifice should begin, a great multitude of Swedes came to Upsal; and now the chiefs held consultations with each other, and all agreed that the times of scarcity were on account of their king Domald, and they resolved to offer him for good seasons, and to assault and kill him, and sprinkle the stalls of the gods with his blood. And they did so.

Stoclet continues:

Anyone familiar with Arthur Maurice Hocart’s anthropological writings on kingship will know that the ancient Swedes of Snorri Sturluson’s Ynglinga saga were anything but unique in believing that a strong connection existed between king and cosmos. This connection underlies a recurring explanation for plague, namely, that it was a direct consequence of the king’s sexual misconduct, specifically in its most extreme form of incest.

In King David’s case, though, the plague was caused by his taking a census.

(Actually, since King David’s census required the movement of the army throughout his kingdom in order to force compliance with the census-takers, maybe the census actually did cause a plague–there’s no doubt the movement of troops during WWI contributed to the Influenza Epidemic of 1918, after all.)

 

Toda Village, 1837, by Richard Barron

Todas:

“The Todas are a small tribe, now less than a thousand in number, who inhabit the lofty and isolated tableland of the Neilgherry Hills. They are a purely pastoral people, devoting themselves to the care of their herds of buffaloes and despising agriculture and nearly all manual labour as beneath their dignity. Their origin and affinities are unknown; little more than vague conjecture has been advanced to connect them with any other race of Southern India.

They are a tall, well-built, athletic people, with a rich brown complexion, a profusion of jet black hair, a large, full, speaking eye, a Roman nose, and fine teeth. The men are strong and very agile, with hairy bodies and thick beards. Their countenances are open and expressive; their bearing bold and free; their manners grave and dignified; their disposition very cheerful and friendly. In intelligence they are said to be not inferior to any average body of educated Europeans. In temperament they are most pacific, never engaging in warfare and not even possessing weapons, except bows and arrows and clubs, which they use only for purposes of ceremony. Yet they are a proud race and hold their heads high above all their neighbours.

“The country which they inhabit has by its isolation sheltered them from the inroads of more turbulent and warlike peoples and has allowed them to lead their quiet dream-like lives in all the silence and rural simplicity of an Indian Arcadia. For the land which is their home stands six or seven thousand feet above the sea and falls away abruptly or even precipitously on every side to the hot plains beneath. …

A Toda temple in Muthunadu Mund near Ooty, India.

“Generally a village nestles in a beautiful wooded hollow near a running stream. It is composed of a few huts surrounded by a wall with two or three narrow openings in it wide enough to admit a man but not a buffalo. The huts are of a peculiar construction. Imagine a great barrel split lengthwise and half of it set lengthwise with the cut edges resting on the ground, and you will get a fair idea of a Toda hut. … Near the village is commonly a dairy with a pen for the buffaloes at night and a smaller pen for the calves.

“The daily life of the Toda men is spent chiefly in tending the buffaloes and in doing the work of the dairy. … Women are entirely excluded from the work of the dairy; they may neither milk the cows nor churn the butter. Besides the common buffaloes there are sacred buffaloes with their own sacred dairies, where the sacred milk is churned by sacred dairymen. These hallowed dairies are the temples and the holy dairymen are the priests, almost the gods, of the simple pastoral folk.

“The dairyman leads a dedicated life… If he is married he must leave his wife and not go near her or visit his home during the term of his incumbency, however many years it may last. No person may so much as touch him without reducing his holiness to the level of a common man. He may not cross a river by a bridge but must wade through the water at the ford, and only certain fords may be used by him. If a death occurs in the clan he may not attend the funeral unless he resigns his sacred office.

“However, there are different degrees of sanctity among the sacred dairymen. …

“The Todas have the institution of exogamy without the institution of totemism. The whole tribe is divided into two endogamous groups, the Tartharol and the Teivaliol. Regular marriage is not allowed between these groups, though irregular unions are permitted… Each of these primary divisions is subdivided into a number of exogamous clans; no man or woman may marry a member of his or her own clan, but must marry into another clan. But while marriage is prohibited between members of the same clan, it would seem that sexual intercourse is not prohibited and indeed commonly takes place between them. …

Toda woman and two men (though Wikipedia doesn’t claim that these are her husbands).

“The Todas have a completely organised and definite system of polyandry, and in the vast majority of polyandrous marriages the husbands are own brothers. Indeed, when a woman marries a man, it is understood that she becomes the wife of his brothers at the same time. …

“When the joint husbands are not own brothers, they may either live with the wife in one family, or they may dwell in different villages. In the latter case the usual custom is for the wife to reside with each husband in turn for a month … When the joint husbands are own brothers they live together in amity; in such a family quarrels are said to be unknown. The Todas scout as ridiculous the idea that there should ever be disputes or jealousies between the brother-husbands. When a child is born in a family of this sort, all the brothers are equally regarded as its fathers; though if a man be asked the name of his father, he will generally mention one man of the group, probably the most prominent or important of them. …

“When the joint husbands are not brothers, they arrange among themselves who is to be the putative father of each child as it is born, and the chosen one accepts the responsibility by performing a certain ceremony …

“The ceremony takes place about the seventh month of the woman’s pregnancy and begins on the evening before the day of the new moon. Husband and wife repair to a wood, where he cuts a niche in a tree and places a lighted lamp in the niche. The two then search the wood till they find the wood called puv (Sophora glauca) and the grass called nark (Andropogon schoenanthus). A bow is made from the wood by stripping off the bark and stretching it across the bent stick so as to form the bowstring. The grass is fitted to the little bow to stand for an arrow. Husband and wife then return to the tree. … The wife then sits down under the tree in front of the lamp, which glimmers in the gloaming or the dark from its niche, on a level with her eyes as she is seated on the ground. The husband next gives her the bow and arrow, and she asks him what they are called. He mentions the name of the bow and arrow, which differs for each clan. …

If this were a Freudian blog, I’d tell you the arrow is a penis.

“On receiving the bow and arrow the woman raises them to her forehead, and then holding them in her right hand she gazes steadily at the burning lamp for an hour or until the light flickers and goes out. The man afterwards lights a fire under the tree and cooks jaggery and rice in a new pot. When the food is ready, husband and wife partake of it together. … Afterwards the relatives return from the village and all pass the night in the wood, the relatives keeping a little way off from the married pair. …

“This remarkable ceremony is always performed in or about the seventh month of a woman’s first pregnancy, whether her husbands are brothers or not. … When the joint husbands are brothers, it is the eldest brother who gives the little bow and arrow. The fatherhood of the child, or rather the social recognition of it, depends entirely on the performance of this ceremony, so much so that he who gives the bow and arrow is counted the father of the child even if he be known to have had no former connection with the woman; and on the other hand if no living man has performed the ceremony, the child will be fathered on a dead man. An indelible disgrace attaches to a child for whom the ceremony has not been performed.”

EvX: Frazer goes on to describe a number of similar customs, including ones involving beans (such as the throwing of beans and grain at a bride), but seems to have missed Cupid’s use of the bow and arrow to induce love.

Lest you think that polyandry among the Todas and their lack of sexual jealousy means they live in some kind of free-love, feminist paradise:

“The custom of polyandry among the Todas is facilitated, if not caused, by a considerable excess of men over women, and that excess has been in turn to a great extent brought about by the practice of killing the female children at birth. It seems clear that female infanticide has always been and still is practised by the Todas, although in recent years under English influence it has become much less frequent.”