The “rural purge” in American TV was the cancellation, between 1969 and 1972, of “everything with a tree in it.” Wikipedia lists 26 shows that were purged, everything from Lassie to Gunsmoke to Red Skelton. Some of these shows were probably declining anyway and would have been cancelled sooner or later, but most, like Hee Haw (#16 in the ratings) or Red Skelton (#7), were doing quite well.
According to Wikipedia, CBS’s original plans called for Gunsmoke–a TV and radio success since 1952–to be canceled at the end of the 1970-71 season, but Gunsmoke kept defying them by doing things like coming in #5 and #4 in the Nielsen Ratings; it didn’t get canceled until the end of the 1974-75 season, when it came in #28.
The entire cast was stunned by the cancellation, as they were unaware that CBS was considering it. According to Arness, “We didn’t do a final, wrap-up show. We finished the 20th year, we all expected to go on for another season, or two or three. The (network) never told anybody they were thinking of canceling.” The cast and crew read the news in the trade papers.
Gunsmoke was replaced with Rhoda and Phyllis. Rhoda did well for two seasons; then it cratered. By season 5, it had sunk to #43 and was cancelled. Phyllis made it for an impressive two whole seasons, ending at #40.
If ratings didn’t drive the purge, then what did?
Advertisers.
Advertisers wanted TV shows that appealed to young people with money to spend and tastes to shape, not old people whose tastes and incomes were already fixed. (The same dynamic that led tobacco companies to try to market cigarettes to children.) Everything that appealed to the wrong demographics–old people, rural people, poor people–got the axe. They were replaced with “relevant” shows like All in the Family, The Mary Tyler Moore Show, and The Brady Bunch Variety Hour, which everyone agreed was awful.
Amusingly, Congress–which tends to be full of old people–was upset about all of its favorite shows getting cancelled and replaced with programs aimed at 20-something single women and hippies. According to Wikipedia:
The backlash from the purge prompted CBS to commission, perhaps somewhat facetiously, a rural family drama for its Fall 1972 schedule, but the network scheduled it in what it thought would be a death slot against popular series The Flip Wilson Show and The Mod Squad, allegedly hoping the show would underperform and head to a quick cancellation. Instead, The Waltons went on to run for nine seasons, reaching as high as second in the Nielsens and finishing in the top 30 for seven of its nine years on air.
Mary Tyler Moore only lasted for 7 seasons.
I suspect something similar happened in the late 80s/early 90s as the grittiness of our degraded cities, reflected in shows like Taxi, Sesame Street, and Welcome Back Kotter, began to distress viewers instead of inspiring them, and networks began focusing on suburban comedies like The Cosby Show and Full House, but I have yet to find any articles on the subject. (This trend may have reversed again once Giuliani cleaned up NYC, resulting in shows like Seinfeld and Friends.)
“If you aren’t a liberal when you’re young, you have no heart; if you’re not a conservative when you’re old, you have no brains.” — Variously misattributed
While I wouldn’t describe younger me as a total idiot, there are certainly a great many things that I know now that I didn’t know back when Joe Camel ads were a thing or when I attended candlelit vigils for Darfur. That’s part of growing up and getting older: hopefully you learn something.
What happens when most TV programming for 40 or 50 years is intended to appeal primarily to people who don’t yet know much about the world?
It took quite a bit of rumination to come up with something better than “obviously they have trouble telling when the wool is being pulled over their eyes,” and “small children tend not to notice the pattern of kids’ TV shows making the black character the smartest one.” (I’d make a list except I don’t care that much, but I’ll note that even LazyTown, an Icelandic TV show, does it.)
So the non-obvious effect: People massively overestimate the percentage of the country that agrees with liberal values, and then are shocked by reality.
Basically, TV–a few cable stations excepted–functions like a great big liberal bubble.
27% of Americans–20% of Democrats and 44% of Republicans–favor deporting illegal immigrants. 39% of Americans favor amending the Constitution to end birthright citizenship; 41% believe immigrants are, on net, a burden.
17% believe the Bundy-led militia takeover of a building in Oregon was just.
60% of Republican primary voters think the US should ban Muslims from entering the US; 45% of Democrats agree, so long as you don’t mention that it was Trump’s idea.
44% of Americans between the ages of 18 and 29 don’t know which country America gained its independence from. (Also, while 85% of men know the answer, only 69% of women managed the same feat.)
57% of Americans see the Confederate flag as a symbol of Southern pride, rather than racism; 43% oppose removing the Confederate flag from government buildings.
31% of Americans believe it is immoral to be transgender; 59% believe trans people should use the bathroom that corresponds to their birth gender.
If your perception of “normal” is based on the sitcoms you see on TV, chances are good that virtually all of these stats are surprising, because they don’t feature many young Earth creationists who fly the Confederate flag and want to start WWIII with Russia.
But given the structure of our electoral system–Republicans pick a candidate; Democrats pick a candidate; everyone gets together and we vote for the Republican or the Democrat–there’s a very good chance that about half the time, the president will actually agree with a variety of the positions listed above (except he’s pretty much guaranteed to know who we fought in the Revolution.) If you think that’s an absolutely horrible outcome, I recommend either advocating for massively changing the structure of the electoral system, or investigating some form of Neocameralism.
One of the more amusing experiences of the past few months has been watching people–both Democrats and Republicans–express outrage, shock, and confusion at Donald Trump’s success. Who could have predicted that “kick out illegal immigrants” might attract more voters than “Let’s all die in a war with Russia”? (Clearly not anyone paid to understand which issues appeal to voters.)
“It’s time we punched the Russians in the nose.”–Presidential candidate Gov. John Kasich
“Not only would I be prepared to do it, I would do it,” blurted Christie. “… Yes, we would shoot down the planes of Russian pilots if in fact they were stupid enough to think that this president was the same feckless weakling … we have in the Oval Office … right now.”
Carly Fiorina would impose a no-fly zone and not even talk to Putin until we’ve conducted “military exercises in the Baltic States.”
There are a lot of people who would describe Trump as “literally Hitler” (which seems a little unfair to a guy whose daughter is Orthodox Jewish) for wanting to deport illegal immigrants, temporarily halt Muslim immigration, and create some kind of Muslim registry, at least until we have fewer incidents like the San Bernardino Christmas party shooting.
But on the scale of human suffering, I guarantee that a war with Russia would be far, far worse, and yet no one is protesting against that possibility, screaming that those candidates are going to start WWIII, nor even mildly concerned that the “mainstream” Republican candidates are so completely off their rockers, they make Trump look like a pacifist. (Here the Rand Paul supporters would like to point out that their candidate is also sane.)
Perhaps political commentators have become accustomed to Republicans starting wars and the threat of nuclear armageddon. Killing foreigners is a normal part of the Republican agenda–but trying to keep them out of the country? That’s completely novel. (Or at least, we haven’t done it since the 20s.)
Or perhaps people are mad because Trump is vocally anti-liberal and garners much of his support from people who hate liberals, and liberals had not realized just how many people really hate them.
Liberals must not get out very much.
I didn’t actually intend this to turn into a Trump post, but the subject is popular these days. Had I written this in 2004, I’d have discussed the two young women I had just spoken with who swore up and down that George Bush couldn’t win reelection because “no one likes him.”
(As always, this blog makes no official political endorsements.)
If you aren’t familiar with the “replication crisis” in social psychology, start here, here, and here.
I consider the courses I took in college on quantitative and qualitative methods the most important of my undergraduate years. I learned thereby a great many important things about how not to conduct an experiment and how to think about experimental methodology (not to mention statistics.)
If I were putting together a list of “general education” requirements I wanted all students to take in order to declare them well-educated and ready to go out into the world, it’d be a course on Quantitative and Qualitative Methods. (Much like current “gen ed” and “distribution requirements,” the level of mathematical ability required would likely vary by field, though no one should be obtaining a college degree without some degree of numerical competence.)
But the real problem with the social science fields is not lack of rigorous statistical background, but overwhelming ideological conformity, enforced by the elders of the fields–advisers, hiring committees, textbook writers, journal editors, etc.–who all believe in the same ideology and so have come to see their field as “proving” their ideology.
Ideology drives both the publication biases and the wishful thinking that underlie this crisis. For example, everyone in “Women’s studies” is a feminist who believes that “science” proves that women are oppressed because everyone they know has done studies “proving” it. You’re not going to find a lot of Women’s Studies professors aiming for tenure on the basis of their successful publication of a bunch of studies that failed to find any evidence of bias against women. Findings like that => no publication => no tenure. And besides, feminist professors see it as their moral duty to prove that discrimination exists, not to waste their time on studies that just happened not to be good enough to find the effect.
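The filter just described (significant finding → publication → tenure) can be sketched numerically. The following is a minimal toy simulation, not a model of any particular literature: it assumes a true effect of exactly zero, 1,000 studies of 30 subjects each, and a journal that publishes only results significant at roughly p < .05.

```python
import random
import statistics

random.seed(0)

def run_study(true_effect=0.0, n=30):
    """Simulate one study: estimate a mean effect from n noisy observations."""
    samples = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = statistics.mean(samples)
    se = statistics.stdev(samples) / n ** 0.5
    significant = abs(mean / se) > 1.96  # rough two-tailed p < .05 cutoff
    return mean, significant

# 1,000 studies of an effect that is actually zero.
results = [run_study() for _ in range(1000)]

published = [m for m, sig in results if sig]  # only "positive" findings
everything = [m for m, _ in results]          # includes the file drawer

print(f"mean effect, all studies:       {statistics.mean(everything):+.3f}")
print(f"mean |effect|, published only:  {statistics.mean(abs(m) for m in published):.3f}")
print(f"share of studies 'significant': {len(published) / len(results):.1%}")
```

Even with nothing to find, roughly 5% of studies clear the significance bar by chance, and those published results report effect sizes far larger than the (zero) truth; selective publication alone manufactures a literature full of "effects."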
In the Social Sciences more generally, we get this “post modern” mish-mash of everything from Marxists to Freudians to folks who like Foucault and Said, where the goal is to mush up long-winded descriptions of otherwise simple phenomena into endless Chomsky Sentences.
(Just reading the Wikipedia pages on a variety of Social Science oriented topics reveals how very little real research or knowledge is generated in these fields, and how much is based on individual theorists’ personal views. It is often obvious that virtually anyone not long steeped in the academic literature of these fields would not come up with these theories, but with something far more mundane and sensible. Economists, for all their political bias, at least provide a counterpoint to many of these theories.)
Obviously different fields study different aspects of phenomena, but entire fields should not become reduced to trying to prove one political ideology or another. If they are, they should label themselves explicitly, rather than make a pretense of neutrality.
When ideology rather than correctness becomes the standard for publication (not to mention hiring and tenure), the natural result is incorrectness.
More statistical knowledge is not, by itself, going to resolve the problem. The fields must first recognize that they have an ideological bias problem, and then work to remedy it by letting in and publishing work by researchers outside the social science ideological mainstream. It is very easy to think your ideas sound rigorous when you are only debating with people who already agree with you; it is much more difficult to defend your views against people who disagree, or come from very different intellectual backgrounds.
They could start with–hahahaha–letting in a Republican.
Since “Do Native Americans have Neanderthal DNA?” (or something similar) is the most popular search that leads people to my blog, I have begun to suspect that a clarification is in order.
Native Americans (Indians) are not Neanderthals. They are not half or quarter or otherwise significantly Neanderthal. If they were, they would have very noticeable fertility problems in mixed-race relationships.
They may have slightly more Neanderthal admixture than other groups, but that is extremely speculative; I don’t know of any scientists who have said so. We’re talking here about quite small amounts, like 0.5%, most of which appears to code for things like immune response and possibly some adaptations for handling long, cold winters. None of this appears to code for physical traits like skull shape, which have been under different selective pressures over the past 40,000 years.
As much as I would love to discover a group with significant Neanderthal DNA, that’s just not something we’ve found in anyone alive today.
Today I have an excerpt from Aborigine Myths and Legends, by William Ramsay Smith, c. 1930. (As usual, I am dispensing with block quotes for the sake of readability. I have added pictures.)
At Manly, about six miles from Sydney, there are to be seen aboriginal carvings cut into the flat surface of the rock. Among them there is a figure of a male aboriginal with both arms outstretched and holding in one hand a waddy. Another form represents a shark. In another group, there are four male figures, with a boomerang above the head of one and a fish between the legs of another. And, again, there are two figures, almost oval in shape, and one of the ovals has small circles cut out around the edge. All these, as well as the carvings, have a meaning. Each of the objects, whether an animal, bird, reptile, or fish, represents the totem of a tribe.
Aboriginal engraving, Manly, Australia, courtesy of Lonely Planet
Tribe Totems.–As in the case of the Manly figure, where a fish is placed between the legs of a person, the fish is the totem of the tribe living in that locality. Before a tribe can occupy a hunting-ground it must select a totem–a fish, animal, bird, or reptile–anything, in fact, that has an existence. It may be sun, moon, wind, lightning, or thunder. Thus, in some instances, one may see a figure representing the sun or a half-moon. Sometimes one notices figures of a kangaroo and an emu, or two other forms. The kangaroo might be the totem of the tribe of the chief, and the emu might be the totem of his wife’s tribe.
This totemism plays an important part in the social life of the aboriginals. If, for example, a person has committed an offense, or has broken tribal law, he becomes a fugitive. He may travel to some distant part of the country. … He creeps along stealthily, listening intently for any sound, peering through the dense foliage in every bay or cove to see whether his path is clear, noticing every footprint on the way, reading every mark on the tree-trunks and on the surface of rocks, and scanning every mark to see whether there is hope of protection and friendship. To be seen would mean death to him. By and by the keen eye of the fugitive catches sight of the figure of his mother’s totem. Casting aside all fear, he walks boldly along the beaten track that leads to the camp, and presents himself to the chief. He produces a string of kangaroo teeth, made in bead fashion, and a bunch of emu feathers… . This is a sign that he belongs to the Kangaroo totem tribe, and that his mother belongs to the Emu totem tribe. He is received into either of these tribes, and becomes one with them, and participates in all their privileges.
Nulla-nulla created by and for sale from Jagalingu.
At Manly one may notice two figures: a wallaby footprint and a kangaroo, a man figure and a weapon–it may be a boomerang or a nulla-nulla. This means that the Wallaby totem tribe occupied that country, and the Kangaroo totem tribe came and did battle with the Wallaby totem tribe and drove them away and took possession. … From the different figures carved on the surface of one rock one may infer that tribes of different totems shown in the figures occupied that locality.
There are other figures hewn in the rock. The oval with the small circles, referred to above, may represent the sun in its course; in other words, it may show that the aboriginals had knowledge of the earth’s motion. There are old men in each tribe who study the heavens at night; and at certain times of the year every night at intervals they will give a call, “The earth has already turned.” [Footnote: The aboriginals appear to have believed that the earth went round, because there is a saying which means, “The earth has turned itself about.”] This may be done with the idea of teaching the younger generation something about astronomy.
[EvX comments: while interesting, the earth-centric model of the universe is so immediately obvious and the heliocentric so difficult to prove that I am skeptical of such claims.]
Finishing up with our discussion (in response to a reader’s question):
Why are people snobs about intelligence?
Is math ability better than verbal?
Do people only care about intelligence in the context of making money?
Now, this is the point in the conversation where somebody tends to say something like, “My cousin / little sister /uncle is retarded, but they are still a beautiful, wonderful person and I love them as much as everyone else, and therefore it is mean to say that smart people are higher status than dumb people.”
It is good that you love your family. You should love your family. I am sure your relatives are lovely people, and you enjoy their company, and would be worse off without them.
But by the same token, I am grateful for the fact that I have never had polio, smallpox, or Ebola. I am thankful that I did not die in childbirth (my own or my children’s.) I am thankful for life-saving surgeries, medications, and mass-vaccination campaigns that have massively reduced the quantity of human suffering, and I happily praise the doctors and scientists who made all of this possible.
That is why doctors and scientists are higher status than dumb people, and why math-smart people (who tend to end up in science) believe that they should have more status than verbal-smart people.
But on to #3–what is this “intelligence” and “money” connection? (And why does our questioner think it is so bad?)
The obvious answer is no, people don’t only care about intelligence in the context of making money. People also care about enjoying music and reading good books and having fun with their friends, having pleasant conversations and not dying of cancer.
But people are practical creatures, and their first priority is making sure that they and their children will eat tomorrow.
In a purely meritocratic society, more intelligent people will tend to end up in professions that require more intellect and more years of training, which will in turn allow them to demand higher wages. (So will people with rare physical talents, like athleticism and musical ability.) Unintelligent people, by contrast, will end up in the jobs that require the least thought and least training, where they will soon be replaced by robots.
The incentive to pay your doctor more than your trash collector is obvious.
The truly bright and creative, of course, will go beyond merely being employed and actually start companies, invent products/processes, and generally reshape the world around them, all of which results in making even more money.
The truly dull, by contrast, even when they can get jobs, tend to be impulsive and bad at planning, which results in the loss of what little money they have.
We do not live in a purely meritocratic society. No one does. But we make efforts to that end, which is why public schools exist and employers are officially not supposed to consider things like race and gender when hiring people. The result is a society that is reasonably close to meritocratic.
And in fact, the correlation between IQ and wealth/income is remarkably robust.
There are a few outliers–the Gulf oil states are far richer than their IQs would predict, due to oil; China is poorer than its IQ predicts, which may be due to the lingering effects of communism or due to some quirk in the nature of Chinese intelligence (either way, I expect a very China-dominant future)–but otherwise, IQ predicts average per capita GDP quite well.
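For readers curious what such a correlation actually looks like when computed, here is a minimal sketch. The country figures are rough, illustrative ballpark values, not authoritative data; they are chosen only to include the two kinds of outlier mentioned above. GDP is logged because income differences are multiplicative.

```python
import math

# Rough, illustrative (country, average IQ, GDP per capita in USD) tuples.
# Ballpark demonstration values only, not authoritative figures.
data = [
    ("Singapore",     108, 55000),
    ("Japan",         105, 39000),
    ("Germany",       100, 42000),
    ("United States",  98, 56000),
    ("China",         105,  8000),  # under-performs its IQ (the outlier noted above)
    ("Qatar",          80, 74000),  # oil wealth outlier noted above
    ("Mexico",         88,  9000),
    ("India",          82,  1600),
    ("Nigeria",        71,  2700),
]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

iqs = [iq for _, iq, _ in data]
log_gdp = [math.log(g) for _, _, g in data]

r = pearson(iqs, log_gdp)
print(f"r(IQ, log GDP per capita) = {r:.2f}")
```

Even with the two deliberate outliers dragging it down, the correlation comes out clearly positive; drop China and Qatar from the list and it climbs considerably higher.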
Here people tend to bring up a few common objections:
1. I know a guy who is smart but poor, and a guy who is dumb but rich! Two anecdotes are totally sufficient to completely disprove a general trend calculated from millions of data points.
Yes, obviously some really smart people have no desire to go into high-paying fields, and devote their lives to art, music, volunteering with the poor, raising children, or just chilling with their friends. Some smart people have health problems, are unfairly discriminated against, live in areas with few jobs, or are otherwise unable to reach their potentials. Some dumb people luck into wealth or a high-paying job.
It would be a strange world indeed if IQ were absolute destiny.
But the existence of outliers does not negate the overall trends–smarter people tend to get jobs in higher-paying fields and manage their money more effectively; dumb people tend to get jobs in lower-paying fields and manage their money ineffectively.
2. Maybe everyone is equally smart, but just expresses it in different ways. (Corollary form: IQ is just a measure of how good you are at taking IQ tests.)
Either we mean something when we say “intelligence,” or we do not. If we want to define “intelligence” so that everyone is equally smart, then yes, everyone is equally smart. If we want to know if some people are better than others at doing math, then we find that some people are better than others at doing math. Are some people better than others at reading? Yes. Are some people better than others at football? Yes.
If you transported me tomorrow to a hunter-gatherer community, and they gave me a test of the skills necessary for survival there, I’d flunk (and die.) They’d conclude that I was an idiot who couldn’t gather her way out of a paper bag.
Very well, then.
But neither of us lives in a hunter-gatherer society, nor do we particularly care about the skills necessary to survive in one. If I want to know the kinds of intelligence that are necessary for success in industrial societies–the kind of success that may have led to the existence of industrial societies–then I’m looking at normal old “intelligence” as people conventionally use the term, measured by IQ scores, the SAT, vague impressions, or report cards.
3. “You’ve got causality backwards–people with money send their kids to expensive prep schools, which results in them learning more, which results in higher IQ scores. These “smart” kids then use family connections/prestige to land good jobs, resulting in higher wealth.”
Twin and adoption studies consistently show that the heritability of IQ and of behavioral traits is high, reaching into the 0.8-0.9+ range. This means that, out of a group of people, 80-90% of the overall differences between them (known as the “variance” in statistical parlance) can be attributed to genetic differences between them. This becomes most evident in adulthood, when genes have had a chance to fully express themselves. I have summed this up in a neat set of rules:
Heredity: 70-80%
Shared environment: 0%
Something else [random chance]: 20-30%
In other words, adopted kids end up with the IQ scores you’d predict from looking at their biological parents, not their legal parents. Barring extremes of poverty or abuse, the way your parents raise you–including the quality of the schools you attend–has very little long-term effect on IQ.
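The adoption prediction can be sketched with a toy simulation. This is an illustration of the rules above, not a fitted model: it assumes a heritability of 0.75 (the midpoint of the 70-80% rule), an IQ standard deviation of 15, regression toward the mean via the standard breeder's-equation approximation, and zero contribution from the adoptive home.

```python
import random
import statistics

random.seed(1)

H2 = 0.75  # assumed heritability of adult IQ (midpoint of the rule above)

def adult_iq(biological_midparent_iq):
    """Predicted adult IQ of a child: regression toward the population mean
    on the genetic component, plus non-shared noise. The adoptive (shared)
    environment contributes ~0, per the rules above."""
    genetic = 100 + H2 * (biological_midparent_iq - 100)
    noise_sd = 15 * (1 - H2) ** 0.5  # remaining variance is non-shared
    return genetic + random.gauss(0, noise_sd)

# 10,000 adoptees whose biological midparent IQ is 115:
adoptees = [adult_iq(115) for _ in range(10000)]
print(f"mean adult IQ of adoptees: {statistics.mean(adoptees):.1f}")
# Should land near 111 (= 100 + 0.75 * 15), regardless of adoptive-home quality.
```

Under these assumptions the adoptees' mean tracks their biological parents (regressed toward 100), which is the pattern the adoption studies report.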
On a related note, massively increased school expenditures since the ’80s have done very little to raise test scores:
IQ doesn’t lend itself to much environmental manipulation – indeed, interventions that attempt to boost IQ have all met with failure. As well, IQ remains predictive even when measured in youth. It is predictive even when one controls for things like socioeconomic status (say during childhood). Indeed, the best control for this, looking at different siblings within a family, finds that IQ is predictive of real world outcomes between siblings – the sibling with the higher IQ tends to do better.
Everybody wants to know why some groups or countries outperform other groups or countries, but no one likes to be told that they–or a group that they belong to–are less intelligent than others. No one wants to be in the red; everyone wants to blame their troubles on someone else.
Thus there is a great deal of debate: some people want to prove that the wealth and poverty of nations depend on IQ, and some people want to prove that they do not. No matter your personal opinions on the matter, it’s pretty hard to have a discussion about IQ without the debate resurfacing.
Now, I fully believe that rich people enroll their kids in expensive test-prep classes, which result in small increases in SAT scores over students who’ve never seen the test before (an effect that wears off once classes are over.) It may also be that people from countries where schools barely exist look at a test and have no idea what you want them to do with it, regardless of intelligence. But if parental income were the entire story, rich whites, blacks, Hispanics, and Asians ought to all get similar SAT scores, (with the exception of verbal scores for ESL-students,) and poor whites, blacks, Hispanics, and Asians ought to all get similar, lower scores. Instead, the children of wealthy Black parents have worse SAT scores than the children of poor whites and Asians. (Except Asian verbal scores, which are pretty bad at the low end–probably an ESL-artifact.)
Regardless, a certain kind of intelligence appears to be useful for building certain kinds of societies.
Conclusion:
Yes, there are lots of reasons to value intelligence, like making art and enjoying a good book. And there are many lifestyles that people enjoy that do not require making lots of money, nor do they have much to do with capitalism. But there exists, nonetheless, a fairly reliable correlation–at the group level–between average IQ and income/wealth/development level. Most people care about this not because they want to exploit each other and destroy the environment, but because they want to be well-fed, healthy, and happy.
Continuing with yesterday’s discussion (in response to a reader’s question):
Why are people snobs about intelligence?
Is math ability better than verbal?
Do people only care about intelligence in the context of making money?
1. People are snobs. Not all of them, obviously–just a lot of them.
So we’re going to have to back this up a step and ask why are people snobs, period.
Paying attention to social status–both one’s own and others’–is probably instinctual. We process social status in our prefrontal cortexes–the part of our brain generally involved in complex thought, imagination, long-term planning, personality, not being a psychopath, etc. Our brains respond positively to images of high-status items–activating reward-feedback loops that make us feel good–and negatively to images of low-status items–activating feedback loops that make us feel bad.
The mental effect is stronger when we perform high-status actions in front of others:
…researchers asked a person if the following statement was an accurate description of themselves: “I wouldn’t hesitate to go out of my way to help someone in trouble.” Some of the participants answered the question without anyone else seeing their response. Others knowingly revealed their answer to two strangers who were watching in a room next to them via video feed. The result? When the test subjects revealed an affirmative answer to an audience, their [medial prefrontal cortexes] lit up more strongly than when they kept their answers to themselves. Furthermore, when the participants revealed their positive answers not to strangers, but to those they personally held in high regard, their MPFCs and reward striatums activated even more strongly. This confirms something you’ve assuredly noticed in your own life: while we generally care about the opinions of others, we particularly care about the opinions of people who really matter to us.
(Note what constitutes a high-status activity.)
But this alone does not prove that paying attention to social status is instinctual. After all, I can also point to the part of your brain that processes written words (the Visual Word Form Area,) and yet I don’t assert that literacy is an instinct. For that matter, anything we think about has to be processed in our brains somewhere, whether instinct or not.
Better evidence comes from anthropology and zoology. According to Wikipedia, “All societies have a form of social status,” even hunter-gatherers. If something shows up in every single human society, that’s a pretty good sign that it is probably instinctual–and if it isn’t, it is so useful a thing that no society exists without it.
Even animals have social status–“Social status hierarchies have been documented in a wide range of animals: apes, baboons, wolves, cows/bulls, hens, even fish, and ants.” We may also add horses, many monkey species, elephants, killer whales, reindeer, and probably just about all animals that live in large groups.
Among animals, social status is generally determined by a combination of physical dominance, age, relationship, and intelligence. Killer whale pods, for example, are led by the eldest female in the family; leadership in elephant herds is passed down from a deceased matriarch to her eldest daughter, even if the matriarch has surviving sisters. Male lions assert dominance by being larger and stronger than other lions.
In all of these cases, the social structure exists because it benefits the group, even if it harms some of the individuals in it. If having no social structure were beneficial for wolves, then wolf packs without alpha wolves would out-compete packs with alphas. This is the essence of natural selection.
Among humans, social status comes in two main forms, which I will call “earned” and “background.”
“Earned” social status stems from things you do, like rescuing people from burning buildings, inventing quantum physics, or stealing wallets. High status activities are generally things that benefit others, and low-status activities are generally those that harm others. This is why teachers are praised and thieves are put in prison.
Earned social status is a good thing, because it rewards people for being helpful.
“Background” social status is basically stuff you were born into or have no control over, like your race, gender, the part of the country you grew up in, your accent, name, family reputation, health/disability, etc.
Americans generally believe that you should not judge people based on background social status, but they do it, anyway.
Interestingly, high-status people are not generally violent. (Just compare crime rates by neighborhood SES.) Outside of military conquest, violence is the domain of the low-class and those afraid they are slipping in social class, not the high class. Compare Angela Merkel to the average German far-right protester. Obviously the protester would win in a fist-fight, but Merkel is still in charge. High class people go out of their way to donate to charity, do volunteer work, and talk about how much they love refugees. In the traditional societies of the Pacific Northwest, they held potlatches at which they distributed accumulated wealth to their neighbors; in our society, the wealthy donate millions to education. Ideally, in a well-functioning system, status is the thanks rich people get for doing things that benefit the community instead of spending their billions on gold-plated toilets.
The Arabian babbler … spends most of its life in small groups of three to 20 members. These groups lay their eggs in a communal nest and defend a small territory of trees and shrubs that provide much-needed safety from predators.
When it’s living as part of a group, a babbler does fairly well for itself. But babblers who get kicked out of a group have much bleaker prospects. These “non-territorials” are typically badgered away from other territories and forced out into the open, where they often fall prey to hawks, falcons, and other raptors. So it really pays to be part of a group. … Within a group, babblers assort themselves into a linear and fairly rigid dominance hierarchy, i.e., a pecking order. When push comes to shove, adult males always dominate adult females — but mostly males compete with males and females with females. Very occasionally, an intense “all-out” fight will erupt between two babblers of adjacent rank, typically the two highest-ranked males or the two highest-ranked females. …
Most of the time, however, babblers get along pretty well with each other. In fact, they spend a lot of effort actively helping one another and taking risks for the benefit of the group. They’ll often donate food to other group members, for example, or to the communal nestlings. They’ll also attack foreign babblers and predators who have intruded on the group’s territory, assuming personal risk in an effort to keep others safe. One particularly helpful activity is “guard duty,” in which one babbler stands sentinel at the top of a tree, watching for predators while the rest of the group scrounges for food. The babbler on guard duty not only foregoes food, but also assumes a greater risk of being preyed upon, e.g., by a hawk or falcon. …
Unlike chickens, who compete to secure more food and better roosting sites for themselves, babblers compete to give food away and to take the worst roosting sites. Each tries to be more helpful than the next. And because it’s a competition, higher-ranked (more dominant) babblers typically win, i.e., by using their dominance to interfere with the helpful activities of lower-ranked babblers. This competition is fiercest between babblers of adjacent rank. So the alpha male, for example, is especially eager to be more helpful than the beta male, but doesn’t compete nearly as much with the gamma male. Similar dynamics occur within the female ranks.
In the eighteenth and early nineteenth centuries, wealthy private individuals substantially supported the military, with particular wealthy men buying stuff for a particular regiment or fort.
Noblemen paid high prices for military commands, and these posts were no sinecure. You got the obligation to substantially supply the logistics for your men, the duty to obey stupid orders that would very likely lead to your death, the duty to lead your men from in front while wearing a costume designed to make you particularly conspicuous, and the duty to engage in honorable personal combat, man to man, with your opposite number who was also leading his troops from in front.
A vestige of this tradition remains in that every English prince has been sent to war and has placed himself very much in harm’s way.
It seems obvious to me that a soldier being led by a member of the ruling class who is soaking up the bullets from in front is a lot more likely to be loyal and brave than a soldier sent into battle by distant rulers safely in Washington who despise him as a sexist homophobic racist murderer. A soldier who sees his commander, a member of the ruling classes, fighting right in front of him, is reflexively likely to fight.
(Note, however, that magnanimity is not the same as niceness. The only people who are nice to everyone are store clerks and waitresses, and they’re only nice because they have to be or they’ll get fired.)
Most people are generally aware of each other’s social statuses, using contextual clues like clothing and accents to make quick, rough estimates. These contextual clues are generally completely neutral–they just happen to correlate with other behaviors.
For example, there is nothing objectively good or bad for society about wearing your pants belted beneath your buttocks, aside from it being an awkward way to wear your pants. But the style correlates with other behaviors, like crime, drug use, aggression, low paternal investment, and unemployment, all of which are detrimental to society, and so the mere sight of underwear spilling out of a man’s pants automatically assigns him low status. There is nothing causal in this relationship–being a criminal does not make you bad at buckling your pants, nor does wearing your pants around your knees somehow inspire you to do drugs. But these things correlate, and humans are very good at learning patterns.
Likewise, there is nothing objectively better about operas than Disney movies, no real difference between a cup of coffee brewed in the microwave and one from Starbucks; a Harley Davidson and a Vespa are both motorcycles; and you can carry stuff around in just about any bag or backpack, but only the hoity-toity can afford something as objectively hideous as a $26,000 Louis Vuitton backpack.
All of these things are fairly arbitrary and culturally dependent–the way you belt your pants can’t convey social status in a society where people don’t wear pants; your taste in movies couldn’t matter before movies were invented. Among hunter-gatherers, social status is based on things like one’s skill at hunting, and if I showed up to the next PTA meeting wearing a top hat and monocle, I wouldn’t get any status points at all.
We tend to aggregate the different social status markers into three broad classes (middle, upper, and lower). As Scott Alexander says in his post about Siderea’s essay on class in America, which divides the US into 10% Underclass, 65% Working Class, 23.5% Gentry Class, and 1.5% Elite:
Siderea notes that Church’s analysis independently reached about the same conclusion as Paul Fussell’s famous guide. I’m not entirely sure how you’d judge this (everybody’s going to include lower, middle, and upper classes), but eyeballing Fussell it does look a lot like Church, so let’s grant this.
It also doesn’t sound too different from Marx. Elites sound like capitalists, Gentry like bourgeoisie, Labor like the proletariat, and the Underclass like the lumpenproletariat. Or maybe I’m making up patterns where they don’t exist; why should the class system of 21st century America be the same as that of 19th century industrial Europe?
There’s one more discussion of class I remember being influenced by, and that’s Unqualified Reservations’ Castes of the United States. Another one that you should read but that I’ll summarize in case you don’t:
1. Dalits are the underclass, … 2. Vaisyas are standard middle-class people … 3. Brahmins are very educated people … 4. Optimates are very rich WASPs … now they’re either extinct or endangered, having been pretty much absorbed into the Brahmins. …
Michael Church’s system (henceforth MC) and the Unqualified Reservation system (henceforth UR) are similar in some ways. MC’s Underclass matches Dalits, MC’s Labor matches Vaisyas, MC’s Gentry matches Brahmins, and MC’s Elite matches Optimates. This is a promising start. It’s a fourth independent pair of eyes that’s found the same thing as all the others. (commenters bring up Joel Kotkin and Archdruid Report as similar convergent perspectives).
I suspect the tendency to try to describe society as consisting of three broad classes (with the admission that other, perhaps tiny classes that don’t exactly fit into the others might exist) is actually just an artifact of being a three-biased society that likes to group things in threes (the Trinity, three-beat joke structure, three bears, Three Musketeers, three notes in a chord, etc.) This three-bias isn’t a human universal (or so I have read) but has probably been handed down to us from the Indo-Europeans, (“Many Indo-European societies know a threefold division of priests, a warrior class, and a class of peasants or husbandmen. Georges Dumézil has suggested such a division for Proto-Indo-European society,”) so we’re so used to it that we don’t even notice ourselves doing it.
(For more information on our culture’s three-bias and different number biases in other cultures, see Alan Dundes’s Interpreting Folklore, though I should note that I read it back in high school and so my memory of it is fuzzy.)
(Also, everyone is probably at least subconsciously cribbing Marx, who was probably cribbing from some earlier guy who cribbed from another earlier guy, who set out with the intention of demonstrating that society–divided into nobles, serfs, and villagers–reflected the Trinity, just like those Medieval maps that show the world divided into three parts or the conception of Heaven, Hell, and Purgatory.)
At any rate, I am skeptical of any system that lumps 65% of people into one social class and 0.5% of people into a different social class as being potentially too finely grained at one end of the scale and not finely enough at the other. Determining the exact number of social classes in American society may ultimately be futile–perhaps there really are three (or four) highly distinct groups, or perhaps social classes transition smoothly from one to the next with no sharp divisions.
I lean toward the latter theory, with broad social classes as merely a convenient shorthand for extremely broad generalizations about society. If you look any closer, you tend to find that people do draw finer-grained distinctions between themselves and others than “65% Working Class” would imply. For example, a friend who works in agriculture in Greater Appalachia once referred dismissively to other people they had to deal with as “red necks.” I might not be able to tell what differentiates them, but clearly my friend could. Similarly, I am informed that there are different sorts of homelessness, from true street living to surviving in shelters, and that lifetime homeless people are a different breed altogether. I might call them all “homeless,” but to the homeless, these distinctions are important.
Is social class evil?
This question was suggested by a different friend.
I suspect that social class is basically, for the most part, neutral-to-useful. I base this on the fact that most people do not work very hard to erase markers of class distinction, but instead actively embrace particular class markers. (Besides, you can’t get rid of it, anyway.)
It is not all that hard to learn the norms and values of a different social class and strategically employ them. Black people frequently switch between speaking African American Vernacular English at home and standard English at work; I can discuss religion with Christian conservatives and malevolent AI risk with nerds; you can purchase a Harley Davidson t-shirt as easily as a French beret and scarf.
(I am reminded here of an experiment in which researchers were looking to document cab drivers refusing to pick up black passengers; they found that when the black passengers were dressed nicely, drivers would pick them up, but when they wore “ghetto” clothes, the cabs wouldn’t. Cabbies: responding more to perceived class than race.)
And yet, people don’t–for the most part–mass adopt the social markers of the upper class just to fool them. They love their motorcycle t-shirts, their pumpkin lattes, even their regional accents. Class markers are an important part of people’s cultural / tribal identities.
But what about class conflicts?
Because every class has its own norms and values, every class is, to some degree, disagreeing with the other classes. People for whom frugality and thrift are virtues will naturally think that people who drink overpriced coffee are lacking in moral character. People for whom anti-racism is the highest virtue will naturally think that Trump voters are despicable racists. A Southern Baptist sees atheists as morally depraved fetus murderers; nerds and jocks are famously opposed to each other; and people who believe that you should graduate from college, become established in your career, get married, and then have 0-1.5 children disapprove of people who drop out of high school, have a bunch of children with a bunch of different people, and go on welfare.
A moderate sense of pride in one’s own culture is probably good and healthy, but spending too much energy hating other groups is probably negative–you may end up needlessly hurting people whose cooperation you would have benefited from, reducing everyone’s well-being.
(A good chunk of our political system’s dysfunctions are probably due to some social classes believing that other social classes despise them and are voting against their interests, and so counter-voting to screw over the first social class. I know at least one person who switched allegiance from Hillary to Trump almost entirely to stick it to liberals they think look down on them for classist reasons.)
Ultimately, though, social class is with us whether we like it or not. Even if a full generation of orphan children were raised with no knowledge of their origins and completely equal treatment by society at large, each would end up marrying/associating with people who have personalities similar to their own (and remember that genetics plays a large role in personality). Just as current social classes in America are ethnically different (Southern whites are drawn from different European populations than Northern whites, for example), so would the society resulting from our orphanage experiment differentiate into groups similar in genetics and personality.
Why do Americans generally proclaim their opposition to judging others based on background status, and then act classist, anyway? There are two main reasons.
As already discussed, different classes have real disagreements with each other. Even if I think I shouldn’t judge others, I can’t put aside my moral disgust at certain behaviors just because they happen to correlate with different classes.
It sounds good to say nice, magnanimous things that make you sound more socially sensitive and aware than others, like, “I wouldn’t hesitate to go out of my way to help someone in trouble.” So people like to say these things whether they really mean them or not.
In reality, people are far less magnanimous than they like to claim they are in front of their friends. People like to say that we should help the homeless and save the whales and feed all of the starving children in Africa, but few people actually go out of their way to do such things.
There is a reason Mother Teresa is considered a saint, not an archetype.
In real life, not only does magnanimity have a cost (which the rich can better afford), but if you don’t live up to your claims, people will notice. If you talk a good talk about loving others but actually mistreat them, people will decide that you’re a hypocrite. On the internet, you can post memes for free without having to back them up with real action, causing discussions to descend into competitive virtue-signaling in which no one wants to be the first person to admit that they actually are occasionally self-interested. (Cory Doctorow has a relevant discussion about how “reputation economies”–especially internet-based ones–can go horribly wrong.)
Unfortunately, people often confuse background and achieved status.
American society officially has no hereditary social classes–no nobility, no professions limited legally to certain ethnicities, no serfs, no Dalits, no castes, etc. Officially, if you can do the job, you are supposed to get it.
Most of us believe, at least abstractly, that you shouldn’t judge or discriminate against others for background status factors they have no control over, like where they were born, the accent they speak with, or their skin tone. If I have two resumes, one from someone named Lakeesha, and the other from someone named Ian William Esquire III, I am supposed to consider each on their merits, rather than the connotations their names invoke.
But because “status” is complicated, people often go beyond advocating against “background” status and also advocate that we shouldn’t accord social status for any reason. That is, full social equality.
This is not possible and would be deeply immoral in practice.
When you need heart surgery, you really hope that the guy cutting you open is a top-notch heart surgeon. When you’re flying in an airplane, you hope that both the pilot and the guys who built the plane are highly skilled. Chefs must be good at cooking and authors good at writing.
These are all forms of earned status, and they are good.
Smart people are valuable to society because they do nice things like save you from heart attacks or invent cell-phones. This is not “winning at capitalism;” this is benefiting everyone around them. In this context, I’m happy to let smart people have high status.
In a hunter-gatherer society, smart people are the ones who know the most about where animals live and how to track them, how to get water during a drought, and where that 1-inch stem they spotted last season that means a tasty underground tuber is located. Among nomads, smart people are the ones with the biggest mental maps of the territory, the folks who know the safest and quickest routes from good summer pasture to good winter pasture, how to save an animal from dying and how to heal a sick person. Among pre-literate people, smart people composed epic poems that entertained their neighbors for many winters’ nights, and among literate ones, the smart people became scribes and accountants. Even the communists valued smart people, when they weren’t chopping their heads off for being bourgeois scum.
So even if we say, abstractly, “I value all people, no matter how smart they are,” the smart people do more of the stuff that benefits society than the dumb people, which means they end up with higher social status.
So, yes, high IQ is a high social status marker, and low IQ is a low social status marker, and thus at least some people will be snobs about signaling their IQ and their disdain for dumb people.
BUT.
I am speaking here very abstractly. There are plenty of “high status” people who are not benefiting society at all. Plenty of people who use their status to destroy society while simultaneously enriching themselves. And yes, someone can come into a community, strip out all of its resources and leave behind pollution and unemployment, and happily call it “capitalism” and enjoy high status as a result.
I would be very happy if we could stop engaging in competitive holiness spirals and stop lionizing people who became wealthy by destroying communities. I don’t want capitalism at the expense of having a pleasant place to live in.
ὃν οἱ θεοὶ φιλοῦσιν, ἀποθνῄσκει νέος — he whom the gods love dies young. (Menander)
Harpending wasn’t particularly young, nor was his death unexpected, but I am still sad; I have enjoyed his work for years, and there will be no more. Steve Sailer has a nice eulogy.
In less tragic HBD-osphere news, it looks like Peter Frost has stopped writing his blog, Evo and Proud, due to Canadian laws prohibiting free speech. (There has been much discussion of this on Frost’s posts that were carried over on Unz; ultimately, the antisemitism of many Unz commentators made it too dangerous for Frost to continue blogging, even though his posts actually had nothing to do with Judaism.)
Back to our subject: This is an attempt to answer–coherently–a friend’s inquiry.
Why are people snobs about intelligence?
Is math ability better than verbal?
Do people only care about intelligence in the context of making money?
We’re going to tackle the easiest question first, #2. No, math ability is not actually better than verbal ability.
Imagine two people. Person A–we’ll call her Alice–has exceptional verbal ability. She probably has a job as a journalist, novelist, poet, or screenwriter. She understands other people’s emotions and excels at interacting verbally with others. But she sucks at math. Not just sucks; she struggles to count to ten.
Alice is going to have a rough time handling money. In fact, Alice will probably be completely dependent on the people around her to handle money for her. Otherwise, however, Alice will probably have a pretty pleasant life.
Of course, if Alice happened to live in a hunter-gatherer society where people don’t use numbers over 5, she would not stand out at all. Alice could be a highly respected oral poet or storyteller–perhaps her society’s version of an encyclopedia, considered wise and knowledgeable about a whole range of things.
Now consider Person B–we’ll call her Betty. Betty has exceptional math ability, but can only say a handful of words and cannot intuit other people’s emotions.
Betty is screwed.
Here’s the twist: #2 is a trick question.
Verbal and mathematical ability are strongly correlated in pretty much everyone who hasn’t had brain damage (so long as you are looking at people from the same society). Yes, people like to talk about “multiple intelligences,” like “kinesthetic” and “musical” intelligence. It turns out that most of these are correlated. (The one exception may be kinesthetic, about which I have heard conflicting reports. I swear I read a study somewhere which found that sports players are smarter than sports watchers, but all I’m finding now are reports that athletes are pretty dumb.)
Yes, many–perhaps most–people are better at one skill than another. This effect is generally small–we’re talking about people who get A+ in English and only B+s in math, not people who get A+ in English but Fs in math.
The effect may be more pronounced for people at the extremes of high-IQ–that is, someone who is three standard deviations above the norm in math may be only slightly above average in verbal, and vice versa–but professional authors are not generally innumerate, nor are mathematicians and scientists unable to read and write. (In fact, their professions require constantly writing papers for publication and reading the papers published by their colleagues.)
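The pattern at the extremes is what you would expect from simple regression toward the mean: if math and verbal z-scores correlate at r, then someone selected for an extreme math score will, on average, sit at only about r times that extreme on the verbal scale. A minimal simulation sketch, assuming a purely illustrative correlation of r = 0.7 (a made-up number, not a measured one):

```python
import random
import statistics

# Illustrative only: model math and verbal z-scores as bivariate normal
# with an assumed correlation r = 0.7 (not real test data).
random.seed(0)
r = 0.7

pairs = []
for _ in range(200_000):
    shared = random.gauss(0, 1)   # common "general ability" component
    noise = random.gauss(0, 1)    # verbal-specific component
    math_z = shared
    verbal_z = r * shared + (1 - r ** 2) ** 0.5 * noise
    pairs.append((math_z, verbal_z))

# Average verbal score among people 3+ SD above the mean in math:
tail = [v for m, v in pairs if m >= 3]
print(statistics.mean(tail))  # well above average, but noticeably below 3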
All forms of “intelligence” probably rely, at a basic level, on bodily well-being. Your brain is a physical object inside your body, and if you do not have the material bits necessary for well-being, your brain will suffer. When you haven’t slept in a long time, your ability to think goes down the tubes. If you haven’t eaten in several days (or perhaps just this morning), you will find it difficult to think. If you are sick or in pain, again, you will have trouble thinking.
Healthy people have an easier time thinking, and this applies across the board to all forms of thought–mathematical, verbal, emotional, kinesthetic, musical, etc.
“Health” here doesn’t just include things we normally associate with it, like eating enough vegetables and swearing to the dentist that this time, you’re really going to floss. It probably also includes minute genetic variations in how efficient your body is at building and repairing tissues; chemicals or viruses you were exposed to in-utero; epigenetics, etc.
So where does this notion that math and science are better than English and feelings come from, anyway?
A. Math (and science) are disciplines with (fairly) objective answers. If I ask you, “What’s 2+2?” we can determine pretty easily whether you got it correct. This makes mathematical ability difficult to fudge and easy to verify.
Verbal disciplines, by contrast, are notoriously fuzzy:
riverrun, past Eve and Adam’s, from swerve of shore to bend
I scowl with frustration at myself in the mirror. Damn my hair – it just won’t behave, and damn Katherine Kavanagh for being ill and subjecting me to this ordeal. I should be studying for my final exams, which are next week, yet here I am trying to brush my hair into submission. I must not sleep with it wet. I must not sleep with it wet. Reciting this mantra several times, I attempt, once more, to bring it under control with the brush. I roll my eyes in exasperation and gaze at the pale, brown-haired girl with blue eyes too big for her face staring back at me, and give up. My only option is to restrain my wayward hair in a ponytail and hope that I look semi presentable.
Best-seller, or Mary Sue dreck?
And what does this mean:
Within that conflictual economy of colonial discourse which Edward Said describes as the tension between the synchronic panoptical vision of domination – the demand for identity, stasis – and the counterpressure of the diachrony of history – change, difference – mimicry represents an ironic compromise. If I may adapt Samuel Weber’s formulation of the marginalizing vision of castration, then colonial mimicry is the desire for a reformed, recognizable Other, as a subject of a difference that is almost the same, but not quite. Which is to say, that the discourse of mimicry is constructed around an ambivalence; in order to be effective, mimicry must continually produce its slippage, its excess, its difference. (source)
If we’re going to argue about who’s smartest, it’s much easier if we can assign a number to everyone and declare that the person with the biggest number wins. The SAT makes a valiant effort at quantifying verbal knowledge like the number of words you can accurately use, but it is very hard to articulate what makes a text so great that Harvard University would hire the guy who wrote it.
B. The products of science have immediately obvious, useful applications, while the products of verbal abilities appear more superficial and superfluous.
Where would we be today without the polio vaccine, internal combustion engines, or the transistor? What language would we be writing in if no one had cracked the Enigma code, or if the Nazis had not made Albert Einstein a persona non grata? How many of us would have computers, TVs, or microwaves? And let’s not forget all of the science that has gone into breeding and raising massively more caloric strains of wheat, corn, chicken, beef, etc., to assuage the world’s hunger.
We now live in a country where too much food is our greatest health problem!
If I had to pick between the polio vaccine and War and Peace, I’d pick the vaccine, even if every minute spent with Tolstoy is a minute of happiness. (Except when *spoilers spoilers* and then I cry.)
But literature is not the only product of verbal ability; we wouldn’t be able to tell other people about our scientific discoveries if it weren’t for language.
Highly verbal people are good at communication and so help keep the gears of modern society turning, which is probably why La Griffe du Lion found that national per capita GDP correlated more closely with verbal IQ scores than composite or mathematical scores.
Of course, as noted, these scores are highly correlated–so the whole business is really kind of moot.
So where does this notion come from?
In reality, high-verbal people tend to be more respected and better paid than high-math people. No, not novelists–novelists get paid crap. But average pay for lawyers–high verbal–is much better than average pay for mathematicians. Scientists are poorly paid compared to other folks with similar IQs and do badly on the dating market; normal people frequently bond over their lack of math ability.
“Math is hard. Let’s go shopping!” — Barbie
Even at the elementary level, math and science are given short shrift. How many schools have a “library” for math and science exploration in the same way they have a “library” for books? I have seen the lower elementary curriculum; kindergarteners are expected to read small books and write full sentences, but by the end of the year, they are only expected to count to 20 and add/subtract numbers up to 5 (e.g., 1+4, 2+3, 3-2, etc.)
The claim that math/science abilities are more important than verbal abilities probably stems primarily from high-math/science people who recognize their fields’ contributions to so many important parts of modern life and are annoyed (or angry) about the lack of recognition they receive.
So I was reading about the building of the trans-continental railroad (and Napoleon) and wondering to myself why so many of our politicians seem utterly lacking in leadership skills like actual competence or ability to get things done.
Napoleon rose to the top of the French military (as far as I know) by winning battles. Railroad tycoons got to be railroad tycoons by building railroads. Steve Jobs got to be famous by … innovating in product design? Thomas Edison invented the lightbulb and understood the necessity of building a universal power grid so he could sell them to everyone.
George Washington was a leader, not a politician. He got into office because everyone involved decided, based on the job he’d done leading the army during the Revolutionary War, that he’d be a good national leader in peacetime.
But systems have unexpected consequences–you get what you select for, not what you intend to select for. The founders wanted voters to simply come to a rational agreement about who would be the country’s best leader. Since then, we have accrued dozens of layers of complications–political parties and primary votes; super pacs and campaign ads. I have no doubt the founders would have despised it all.
In our case, the electoral system now selects for people who are good at winning elections.
This is part three of a discussion about the development of various strains of animist religion generally grouped under the term “Voodoo.”
Animism is, more or less, the belief that things other than people–animals, plants, tools–are infused with souls or spirits, which can be put to various practical or magical uses via magic/sacrifice. Mild forms of this belief include sacrificing cigarettes or alcohol to deities; extreme forms involve eating other people to gain magic powers. The Voodoo traditions that developed historically in the US shade from the explicitly multi-deity worshiping religion of New Orleans to the “root doctors” and folk medicine beliefs of the Deep South and shade into some of the Christian charismatic movements, at least in style.
For folks who already believed that various bodily parts could be used for black magic, fear of the Night Doctors seems natural:
Night Doctors, also known as Night Riders, Night Witches, Ku Klux Doctors, and Student Doctors are bogeymen of African American folklore, with some factual basis. Emerging from the realities of grave robbing, enforced and punitive medical experimentation, and intimidation rumours spread maliciously by many Southern whites, the Night Doctors purpose was to further prevent slaves, Free Men, and black workers leaving for the North of the United States of America … African American folklore told of white doctors who would abduct, kill, and dissect, performing a plethora of experiments, referred to as “Night Doctors”. …
New Orleans had an interesting variation on the Night Doctors called the “Needle Men”. Thought to be medical students from Charity Hospital (now the Medical Center of Louisiana at New Orleans), the eponymous Needle Men, would poke unsuspecting individuals in the arm, resulting in death.
‘I sure don’t go out much at this time of year. You takes a chance just walkin’ on the streets. Them Needle Mens is everywhere. They always comes ’round in the fall, and they’s ’round to about March. You see, them Needle Mens is medical students from the Charity Hospital tryin’ to git your body to work on. That’s ’cause stiffs is very scarce at this time of the year. But them mens ain’t workin’ on my body. No, sir! If they ever sticks their needles in your arm you is jest a plain goner. All they gotta do is jest brush by you, and there you is; you is been stuck. ‘Course I believes it!’ …
In 1924 there was a Needle Men scare in the Carrollton section of the city. It was reported that these ‘fiends’ slunk about the darkest streets, sprang from behind trees or from vacant lots overgrown with weeds, jabbed women with their needles and fled. Cruel skeptics insinuated the ‘victims’ were suffering from a combination of imagination and Prohibition gin, but indignant females, of all colors, swore to the existence of these particular Needle Men.
(A man with a bayonet was arrested and the attacks stopped.)
Only a few years ago Needle Men appeared, according to reports, and began stabbing young women while they were seated in moving-picture theatres, rendering them partially unconscious and carrying them off into white slavery and a fate ‘worse than death.’ For months in New Orleans downtown cinemas, women were screaming and fainting and crying out they had been jabbed with a needle. But so far as can be ascertained, the period offered no more disappearances than usual, nor is it known that any New Orleans women strayed down the primrose path via this particular route.
Similar to the Needle Men, at least in intent, are the Black Bottle Men. The Black Bottle is reputed to be a potent dose administered to the innocent and unknowing on entry to the Charity Hospital. Instant death is certain to follow, the body then to be rendered up to the students for carving.
The explanation for this is simple. Every person entering Charity Hospital is given a dose of cascara upon admission. Pure cascara is nearly black and when magnesia is added, as is the custom, it becomes a deep brown, the change in color causing Negroes to fear it is a death-dealing drug.
In the 1800s, it was common for teaching hospitals–especially in the South–to use black cadavers in their dissections, a practice which undoubtedly inspired fear and distrust among the black population. People tend not to take kindly to grave-robbing, after all, and where there’s a market for dead bodies, enterprising folks of dubious ethics have occasionally taken it upon themselves to create a supply.
Likewise, new surgical and medical techniques were often tested on black subjects prior to use on whites–whether voluntary or paid, the subjects of medical experimentation are generally drawn from the classes of society most desperate and least able to defend themselves.
But the folks handing out cascara–a laxative–at the charity hospital probably had nothing but good intentions. Fear oft outweighs actual danger; by the 1920s, the only people going around randomly killing blacks in New Orleans were probably serial killers.
And a curious quote from the Wikipedia article:
A woman from the book The Immortal Life of Henrietta Lacks states that: “You’d be surprised how many people disappeared in East Baltimore when I was a girl. I’m telling you, I lived here in the fifties when they got Henrietta, and we weren’t allowed to go anywhere near Hopkins. When it got dark and we were young, we had to be on the steps, or Hopkins might get us.” (bold mine)
“Got Henrietta”?
The woman was treated for cancer, not abducted.
According to the Henrietta Lacks Wikipedia page, Johns Hopkins was the only hospital around that would treat black patients. No good deed goes unpunished.
“Research Melano-Tan Injections. … You need to see this. Whites are gearing up for the next stage in their desire for blackness and their war against us. It is very real. Why do you think we’re being kidnapped?” …
Whites are kidnapping us, melting down our organs, cutting open our skulls and eating our pineal gland in order to become “powerful” and injecting our melanin into themselves.
This wasn’t written in 1924 or 1950, but in 2013.
A commentator writes:
The blood bank won’t stop calling me and always trying to tell me that there is a blood shortage. They ask if, you could like to have your blood go to sicklwe [sic] cell children. I have found that they put a little mark on it so it is used for particular people.
They also run genetic test on your blood when you donate it and they have you sign away rights kind of alla Henretta Lacks. I’ve had one of them say that my blood helped people get well faster and that I should donate as much as possible. This has freaked me out and I haven’t been back since. I’m thinking of changing my phone number as I know they can track you through your cell phone using GPS. I don’t plan on being snatched. (my bold)
Henrietta Lacks again.
I always told my family that whenever you see missing black kids that are posted on the news or on the board at Walmart, they are in underground slave camps and will be used for experiments. They are preying on the weak.
And another on blood:
Not to forget mentioning the Rhesus Negative Blood type. Having learnt a lot about it (my mother is this blood type and I’m sure a lot of you know that it is the oldest blood type on this planet), it is no surprise to me that this, besides the melanin, could be why Caucasian ‘celebs’ are snatching up Black babies QUICK. It’s the cure. Even if the Caucasians are calling it the ‘Anunnaki blood type’.
They are trying to graft themselves back in because – and this could not be stressed enough, they’re fully aware that their time is up.
To be fair, I have also seen white people write crazy things on the subject of Rh-/Rh+ blood, because the name seems to confuse people who can’t use Wikipedia. Crazy thought is all over the place if you look for it.
This particular strain of crazy is of interest, though, because it leads us to Henrietta Lacks.
Henrietta Lacks was only 31 years old when she developed adenocarcinoma of the cervix. This was 1951, and cancer treatments weren’t very effective–hell, now it’s 2016, and treatment for cervical cancer still isn’t very good–and Mrs. Lacks soon died.
During a biopsy, cancerous cells were removed from her cervix and later given to researcher George Gey, who discovered that, unlike normal cells which died quickly in the laboratory, these cells had the remarkable ability to keep reproducing. Thus the “immortal” line of HeLa cells was born.
In 1954, Jonas Salk used HeLa cells to develop the polio vaccine. (My dad had polio. Anyone who thinks using HeLa cells was unethical can fuck off and die in a fire.) Since then, HeLa cells have been used in thousands of experiments and been involved in too many medical advances to list.
Publicity surrounding Mrs. Lacks’s undying cell line–including the book–has led to a flurry of posthumous awards, including an honorary degree of public service from Morgan State University and induction into the Maryland Women’s Hall of Fame.
Meanwhile, George Gey, the guy who actually cultured the cell line, has been pretty much forgotten.
In 2013, Mrs. Lacks’s descendants were given “some control over scientists’ access to the cells’ DNA code”–two family members now sit on a six-member committee that decides which projects get to access the genetic code. Additionally, they are supposed to “receive acknowledgement in the scientific papers that result.” (Acknowledgment for what? Being related to the cell line?)
Since they share some of Lacks’s DNA, they want to be able to control what people can predict about their own DNA. I can kind of see this one, if you think the DNA might predict something embarrassing, like compulsive farting or halitosis, but we are 50+ years down the line and talking about Mrs. Lacks’s decreasingly related descendants, who do not share most of their DNA with her. Additionally, I have no idea what qualifications these folks have for determining which research should and shouldn’t go forward, and no reason to believe the decisions will actually be based on, “Butts Disease sounds embarrassing, let’s not do that one,” vs. “I don’t like this guy’s research, so screw him.”
(We are assured, of course, that “It’s not about money,” though when people say that, it often is. The Washington Post opines:
It’s not about money. Though many have made a lot off the cells of Henrietta Lacks, her surviving family members won’t see any of it. But her descendants will finally gain some control over how pieces of the poor black woman who died in Baltimore in 1951 are used in medical research. When scientists and doctors crave the key to the genetic code that unlocked treatments and vaccines, two family members will have a seat at the table where the decisions are made.
It is said that the samples taken from Henrietta set the groundwork for the multi-million dollar biomedical research industry, as they allowed researchers to analyze the cells in a way that they couldn’t on living humans. To date, Henrietta’s relatives have yet to see a dime of the millions of dollars made off of her cells, but as of yesterday, they’ve gained a little more control over scientists who are given access to the cells and what they’re allowed to do with them.)
But I am not a medical ethicist; if this sort of thing makes people happier and more likely to donate cells and DNA that will contribute to medical and scientific research, then I am all for it. Nor do I have any reason to believe that the family was motivated by animist concerns–I know nothing about them. It is only interesting how Mrs. Lacks’s case gets framed in the same language and context as Night Doctors and Needle Men.