Quick note on jobs and education

This whole Yang Gang phenomenon is shaping up to be quite amusing. So far I’ve seen Yang supported by little old liberal grandmas and alt-right memers. I’d better start up some posts on modern monetary theory.

In the meanwhile, just some quick thoughts on how we need to restructure our thinking about education:

The entire education => jobs model has got to change. Not in format–much of the way things are physically taught in the classroom is fine–but in how we think about the process (and thus fund it).

People have the idea that education is 1. job training and 2. something that ends when you graduate.

#2 is important: it implies that education ENDS, and since it ends, you can afford to shell out an enormous quantity of cash for it. But this is increasingly misguided, as many laid-off journalists recently discovered.

The difficulty is that humans are producing knowledge and innovation at an exponential rate, so whatever knowledge was adequate to enter a field 20 years ago is no longer adequate–and in the meanwhile, technology has likely radically altered the field, often beyond recognition.

Modern education must be ongoing, because fields/tech/knowledge are shifting too quickly for a single college degree to equip you for 45 years of work.

Is there any point to a degree (or other form of certification)? Yes. It can still function to allow a person into a work community. It just shouldn’t be seen as the end of education, and thus should not cost nearly as much as it does.

Modern education should proceed in bursts. After a short training period, you begin to work, to see if you are a good fit for the particular community (profession) you’ve chosen, or need to transfer to a different community and learn there. Better to figure this out before you spend tens or hundreds of thousands of dollars on a degree. Job, pay, education–all need to be unified and delivered in small bits throughout your life.

 

So what do you think about the Yang Gang?

When we become our own worst enemies

In times of danger, tribes merge; in times of peace, they split.

From Scientific Reports, De novo origins of multicellularity in response to predation:

Here we show that de novo origins of simple multicellularity can evolve in response to predation. We subjected outcrossed populations of the unicellular green alga Chlamydomonas reinhardtii to selection by the filter-feeding predator Paramecium tetraurelia. Two of five experimental populations evolved multicellular structures not observed in unselected control populations within ~750 asexual generations. Considerable variation exists in the evolved multicellular life cycles, with both cell number and propagule size varying among isolates. Survival assays show that evolved multicellular traits provide effective protection against predation. These results support the hypothesis that selection imposed by predators may have played a role in some origins of multicellularity.

If we evolve multicellularity in response to predation, then the inverse–a loss of multicellularity, a splitting apart–can happen when predation is removed.

The Democrats have faced a bit of controversy lately over the comments of Ilhan Omar (for the non-Americans in the audience, Ilhan Omar is a recently elected representative of Somali Muslim origins.) As Politico reports: 

Then, after being seated on the House Foreign Affairs Committee, Omar was lampooned for a 2012 tweet in which she wrote during an Israeli military campaign in the Gaza Strip, “Israel has hypnotized the world, may Allah awaken the people and help them see the evil doings of Israel.”  

Omar then made an idiotic non-apology — “she claimed ignorance of the anti-Semitic trope that conceives of Jewish hypnosis.”

Whether Omar knew it was a trope or not is irrelevant to the question of whether or not Omar was saying something anti-Semitic–and even that is not necessarily grounds for an apology, because people apologize when they actually feel contrite about something. Omar most likely doesn’t.

Muslims have their interests; Jews have different interests. The existence of Israel is a big deal for Jews–it helps ensure that nasty incidents like the Holocaust don’t repeat. The existence of Israel is also a big deal for Palestinians, many of whom, I assume, would be living in the area if Jews weren’t. 

Conflicts over land are nothing new in human history, and it doesn’t require a degree in astrophysics to realize that sometimes groups have conflicting interests. Americans who are neither Jewish nor Muslim also have their own interests (many desire, for example, that Israel continue existing for their own religious reasons–not hypnosis.)

The left’s coalition requires different groups to work together (to ally) in their own self-interest, which works if they have bigger enemies to fear. It doesn’t work if they are strong enough to stand on their own feet (or if someone is too dumb to recognize the value of teamwork.) The ideological justification for allying is “intersectionality,” a term which has been bastardized well beyond its original meaning, but is now used to mean “all forms of oppression are really the same thing, so if you oppose one oppression, you must oppose them all.” So if you are against wife beating, you must also be vegan; if you are opposed to the police shooting unarmed black men, you must also be in favor of hijabs. “Interlocking systems of oppression” work to identify a single enemy, a necessary component for unifying people into something like a voting bloc or a military.

And it works as long as there actually is a single enemy. 

It falls apart when you don’t have a single enemy, which is of course the world as it actually stands, because lots of groups have different interests and would like each other’s stuff. There isn’t actually anything magically special about cis-hetero-white-Christian-omnivorous-etc-men that makes them any more or less the oppressors of others. Over in Africa, Africans get oppressed by their fellow Africans. In Islamic countries, chickens get eaten by Muslims. In China, Christianity isn’t even remotely significant. 

In a related story, some British schools have recently seen their pro-LGBT curricula attacked by Muslim parents, who are, despite intersectionalist theory, actually pretty anti-homosexuality.

There is no real way to decide between these two points of view. The vast, vast majority of Muslims believe that homosexuality is a sin, and a school that goes out of its way to teach something counter to that is obviously running up against the students’ and parents’ right to their beliefs. Yet gay people also believe, with equal fervor, that homosexuality is morally respectable and they have a right to advocate on their own behalf and have a perfectly sensible desire to reach out to gay Muslims. 

The difficulty with victory is you don’t need your allies anymore; like the US and the USSR at the end of WWII, victorious allies are apt to turn on each other, fighting for what remains of the spoils. This is true of everyone, not just the left–it is just more interesting when it happens on the left because I’ve been pointing out for years that this would happen.

Of course, some people react to this and say, “clearly the solution to our group splitting apart is to split our group apart; once our group is split, we will all have the same interests and no one will ever fight, just as children never fight with their siblings–hey knock it off in there STOP PUNCHING YOUR BROTHER you have to SHARE THAT TOY–“

Lack of predation => splitting doesn’t just stop at any particular level. 

The other difficulty with splitting is that we live in a shrinking world. Up until the 1950s, the entire world had fewer than 3 billion people; today we have more than twice that many, and we’re still growing. Our cities are bigger, communities are expanding, transportation is better and faster, and more people have the money necessary to move to new places. More people than ever before are on the internet, watching TV, or otherwise interacting.

Devour, get devoured, or make something new?

Feel Something

My Name is Ruin, by Gary Numan

Me: In my zone, listening to music
Husband: Look at this dumb shit someone said on the internet
Me: What? Brains?

So far, everything I have listened to on this album is excellent.

By the way, Mongolia still isn’t sorry–The Hu, Yuve Yuve Yu

Mongolia is going to fuck your shit up and take your women, apparently.

Nirvana: Smells Like Teen Spirit

Guys, I have discovered the point of music. It’s sex.

Alice in Chains: Them Bones

In retrospect, I guess it’s not a surprise that a lot of grunge musicians died of drugs or suicide.

Smashing Pumpkins: Bullet with Butterfly Wings

As long as you can still scream, you can still feel.

I don’t know if we can scream anymore.

Placebo–literally, “I please”–Sucker Love:

Their lead singer is a good example of a male playing up his effeminate qualities in order to get laid.
Husband: You can’t just say that without explanation.
Me: Have you seen the lead singer? I guarantee he gets tons of sex.
Husband: Is he gay?
Me: Whatever he’s into, he gets plenty of it.

There’s a lesson here for effeminate men thinking “Hey, would it be easier for me if I became a girl?”

No. It wouldn’t. Be you. Own who you are and find the people who are attracted to you.

AFI: Miss Murder

Gary Numan aside, it seems like the music scene has changed in fundamental ways over the past few decades. I don’t think there is anyone in the business today whose suicide would affect teens the way Kurt Cobain’s did, simply because no one today is that widely loved. It’s not that society is more divided (though perhaps it is); we just don’t listen to music like we used to.

Of course popular music is still around, and still of varying (usually low) quality.

To hazard a guess, if music is really about reproducing, then the change in music is related to the decline in birth rates. A typical modern human mating ritual involves going to a club, listening to a band or some very loud recorded music, getting drunk, and meeting someone you’d like to have sex with. These clubs also provide a place for new bands to get started. But if fewer people go out, clubs close, people meet fewer other people, people are lonelier, birth rates drop, new bands have a harder time getting noticed, and the industry changes.

On a final note:

This is why certain traits persist in the population.

Is Spring Cleaning an Instinct?

The Mole spring cleaning in The Wind in the Willows

For the past three days, I have been seized with a passion for cleaning and organizing the house that my husband describes as “a little scary.” So far I’ve found a missing hairbrush, the video camera (it was in a lunchbox under some papers under some toys), and the floor; reorganized the bedroom, built a mini-chest of drawers out of cardboard, and returned my mother’s plates–and I’m not even pregnant.

A mere week ago, my limbs hurt whenever I moved. I wasn’t sad or depressed, but it simply felt like pushing boulders every time I needed to walk over to the kitchen.

I woke up this morning with high spirits, sore arms from carrying laundry, and a question: is spring cleaning an instinct?

You don’t hear much about fall cleaning or winter cleaning. No one bothers with night cleaning or rainy day cleaning. Only spring receives special mention for its burst of cleaning.

Over on Bustle, Rachel Krantz links a sudden urge to clean to the menstrual cycle:

… science says there is actually a hormonal reason for all this: when you’re PMSing, you are often overcome by an urge to clean house — literally and figuratively.

Why? The answer lies in the way estrogen and progesterone levels affect your brain. Before our periods, our estrogen levels drop — causing serotonin levels to drop right along with it. …

But the drops in estrogen and serotonin aren’t the only things that spur the desire to clean up. Before your period, your progesterone levels also drop, which combines the impulse to clean with an instinct to “nest.” We see this tendency manifest itself more dramatically in pregnant women, who in their later months of pregnancy have low progesterone levels — which often lead them to go into a frenzy of cleaning house and nesting in order to prepare for the baby.

The PMS-related drop in progesterone is a less-intense version of the same phenomenon. 

The Window Genie blog reflects on the effects of long winter days on melatonin, which makes us sleepy:

Well, it’s no myth; winter causes us to be inherently less active and motivated. That’s right; your brain creates melatonin when there is less sunlight on cold dreary days, making you sleepy! Come spring, Mother Nature provides us a natural energy boost by giving us warmer weather and extra sunlight. The dreary days of snow are (hopefully) over and our natural instinct is to explore and interact with others. Although it may seem like a western tradition, cultures from all over the world have been spring cleaning for thousands of years.

Hopefully I can use this newfound energy to write more, because my posting has been deficient of late.

Window Genie (which I suspect is really a window-cleaning service) also notes that spring-cleaning is a cross-cultural phenomenon. I was just commenting on this myself, in a flurry of dish-washing. Do the Jews not clean thoroughly before Passover? Don’t they go through the house, removing all of the bits of old bread, vacuuming and sweeping and dusting to get out even the slightest bit of crumbs or stray yeast? Some even purchase a special feather and spoon kit to dust up the last few crumbs from the corners of the cupboards, then burn them. Burning seems a bit extreme, yet enjoyable–your cleaning is thoroughly done when you’ve burned the last of it.

I would be surprised if “spring cleaning” exists in places that effectively don’t have spring because their weather is warm all-year-long. Likely they have some other traditions, like “Dry season dusting” or “annual migration.” (I find moving an especially effective way to motivate oneself to throw out excess belongings.)

It’s no secret that sales of cleaning and organizing products ramp up in spring, but the claim that our seasonal affection for washing is merely “cultural” is highly suspect–mere “culture” is an extremely ineffective way of getting me to do the laundry.

The claim that Spring Cleaning started in ancient Iran is even more nonsensical. This is simply mistaking the presence of written records in one place and not another for evidence that a tradition is older there. There is no cultural connection between modern American housewives vacuuming their carpets and ancient Iranian cleaning habits.

I do wish people wouldn’t say such idiotic things; I certainly didn’t work through dinner last night because of a love of Zoroaster. It is far more likely that I and the Persians–and millions of other people–simply find ourselves motivated by the same instincts, for we are all humans, and humans, like all higher animals, make and arrange our shelters to suit our needs and convenience. The spider has her web, the snake his hole, the bee her hive. Chimps build nests and humans, even in the warmest of climates, build homes.

These homes must be kept clean, occasionally refreshed and rid of dust and disease-bearing parasites.

Like the circle of the seasons, let us end with the beginning, from The Wind in the Willows:

The Mole had been working very hard all morning, spring-cleaning his little home. First with brooms, then with dusters; then on ladders and steps and chairs, with a brush and a pail of whitewash; till he had dust in his throat and eyes and splashes of whitewash all over his black fur, and an aching back and weary arms. Spring was moving in the air above and in the earth below and around him, penetrating even his dark and lowly little house with its spirit of divine discontent and longing. It was small wonder, then, that he suddenly flung down his brush on the floor, said “Bother!” and “Oh blow!” and also “Hang spring cleaning!” and bolted out of the house without even waiting to put on his coat. Something up above was calling to him…

 

 

What does “Heritable” mean? 

“Heritable” (or “heritability”) has a specific and unfortunately non-obvious definition in genetics.

The word sounds like a synonym for “inheritable,” rather like your grandmother’s collection of musical clocks. Musical clocks are inheritable; fruit, since it rots, is not very inheritable.

This is not what “heritable” means.

“Heritability,” in genetics, is a measure of the percent of phenotypic variation within a population that can be attributed to genetics.

Let me clarify that in normal speak. “Phenotype” is something you can actually see about an organism, like how tall it is or the nest it builds. “Phenotypic variation” means things like “variation in height” or “variation in nest size.”

Let’s suppose we have two varieties of corn: a giant strain and a dwarf strain. If we plant them in a 100% even field with the same nutrients, water, sunlight, etc. at every point in the field, then close to 100% of the variation in the resulting corn plants is genetic (some is just random chance, of course.)

In this population, then, height is nearly 100% heritable.

Let’s repeat the experiment, but this time, we sow our corn in an irregular field. Some patches have good soil; some have bad. Some spots are too dry or too wet. Some are sunny; others shaded. Etc.

Here it gets interesting, because aside from a bit of random chance in the distribution of seeds and environmental response, in most areas of the irregular field, our “tall” corn is still taller than the “short” corn. In the shady areas, neither variety gets enough sun, but the tall corn still grows taller. In the nutrient-poor areas, neither variety gets enough nutrients, but the tall still grows taller. But when we compare all of the corn all over the field, dwarf corn grown in the best areas grows taller than giant corn grown in the worst areas.

Our analysis of the irregular field leads us to conclude that water, sunlight, nutrients, and genes are all important in determining how tall corn gets.

Height in the irregular field is still heritable–genes are still important–but it is not 100% heritable, because other stuff is important, too.
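In symbols, this is just a ratio of variances–the standard broad-sense definition, written here with the conventional symbols rather than anything specific to the corn example:

$$ H^2 = \frac{V_G}{V_P} = \frac{V_G}{V_G + V_E} $$

where V_G is the variation due to genes, V_E the variation due to environment (soil, water, sun), and V_P the total phenotypic variation we actually measure. In the uniform field V_E is close to zero, so H^2 is close to 1; in the irregular field V_E is large, so H^2 falls well below 1.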

What does it mean to be 10, 40, or 80% heritable?

If height is 10% heritable, then most of the variation in height you see is due to non-genetic factors, like nutrition. Genes still have an effect–people with tall genes will still, on average, be taller–but environmental effects really dominate–perhaps some people who should have been tall are severely malnourished.

In modern, first world countries, height is about 80% heritable–that is, since most people in first world countries get plenty of food and don’t catch infections that stunt their growth, most of the variation we see is genetic. In some third world countries, however, the heritability of height drops to 65%. These are places where many people do not get the nutrients they need to achieve their full genetic potential.

How do you achieve 0% heritability?

A trait is 0% heritable not if you can’t inherit it, but if genetics explains none of the variation in the sample. Suppose we seeded an irregular field entirely with identical, cloned corn. The height of the resulting corn would vary from area to area depending on nutrients, sunlight, water, etc. Since the original seeds were 100% genetically identical, all of the variation is environmental. Genes are, of course, important to height–if the relevant genes disappeared from the corn, it would stop growing–but they explain none of the variation in this population.

The heritability of a trait decreases, therefore, as genetic uniformity increases or the environment becomes more unequal. Heritability increases as genetics become more varied or the environment becomes more equal. 
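To make that concrete, here is a minimal simulation sketch in Python (the numbers and the simulate_heritability helper are invented for illustration, not taken from any real data): each plant’s height is a genetic contribution plus an environmental one, and we report the fraction of the total variance the genetic part explains.

```python
# Minimal sketch: how measured heritability shifts with genetic vs. environmental variation.
# All numbers are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

def simulate_heritability(genetic_sd, env_sd, n=100_000):
    """Fraction of phenotypic variance attributable to the genetic contribution."""
    genetic_effect = rng.normal(0, genetic_sd, n)   # each plant's genetic height contribution
    env_effect = rng.normal(0, env_sd, n)           # soil, water, sunlight, etc.
    phenotype = 100 + genetic_effect + env_effect   # observed height (arbitrary units)
    return np.var(genetic_effect) / np.var(phenotype)

print(simulate_heritability(genetic_sd=10, env_sd=1))   # uniform field: ~0.99
print(simulate_heritability(genetic_sd=10, env_sd=10))  # irregular field: ~0.5
print(simulate_heritability(genetic_sd=0, env_sd=10))   # cloned corn: 0.0
```

The same trait, driven by the same biology, comes out as roughly 99%, 50%, or 0% heritable depending only on how much the genes and the environment vary in the population being measured.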

Note that the genes involved do not need to code directly for the trait being measured. The taller people in a population, for example, might have lactase persistence genes, which let them extract more calories from the milk they drink than their neighbors. Or they might be thieves who steal food from their neighbors.

I remember a case where investigators were trying to discover why most of the boys at an orphanage had developed pellagra, then a mystery disease, but some hadn’t. It turns out that the boys who hadn’t developed it were sneaking into the kitchen at night and stealing food.

Pellagra is a nutritional deficiency caused by lack of niacin, aka B3. Poor Southerners used to come down with it from eating diets composed solely of (un-nixtamalized) corn for months on end.

The ultimate cause of pellagra is environmental–lack of niacin–but who comes down with pellagra is at least partially determined by genes, because genes influence your likelihood of eating nothing but corn for 6 months straight. Sociopaths who steal the occasional ham, for example, won’t get pellagra, but sociopaths who get caught and sent to badly run prisons increase their odds of getting it. In general, smart people who work hard and earn lots of money significantly decrease their chance of getting it, but smart black people enslaved against their will are more likely to get it. So pellagra is heritable–even though it is ultimately a nutritional deficiency.

What’s the point of heritability?

If you’re breeding corn (or cattle,) it helps to know whether, given good conditions, you can hope to change a trait. Traits with low heritability even under good conditions simply can’t be affected very much by breeding, while traits with high heritability can.

In humans, heritability helps us seek out the ultimate causes of diseases. On a social level, it can help measure how fair a society is, or whether the things we are doing to try to make society better are actually working.

For example, people would love to find a way to make children smarter. From Baby Einstein to Head Start, people have tried all sorts of things to raise IQ. But beyond making sure that everyone has enough to eat, no nutrient deficiencies, and some kind of education, few of these interventions seem to make much difference.

Here people usually throw in a clarification about the difference between “shared” and “non-shared” environment. Shared environment is stuff you share with other members of your population, like the house your family lives in or the school you and your classmates attend. Non-shared is basically “random stuff,” like the time you caught meningitis but your twin didn’t.

Like anything controversial, people of course argue about the methodology and mathematics of these studies. They also argue about proximate and ultimate causes, and get caught up in matters of cultural variation. For example, is wearing glasses heritable? Some would say that it can’t be, because how can you inherit a gene that somehow codes for possessing a newly invented (on the scale of human evolutionary history) object?

But this is basically a fallacy that stems from mixing up proximate and ultimate causes. Obviously there is no gene that makes a pair of glasses grow out of your head, nor one that makes you feel compelled to go and buy them. It is also obvious that not all human populations throughout history have had glasses. But within a population that does have glasses, your chance of wearing glasses is strongly predicted by whether or not you are nearsighted, and nearsightedness is a remarkable 91% heritable.

Of course, some nearsighted people opt to wear contact lenses, which lowers the heritability estimate for glasses, but the signal is still pretty darn strong, since almost no one who doesn’t have vision difficulties wears glasses.

If we expand our sample population to include people who lived before the invention of eyeglasses, or who live in countries where most people are too poor to afford glasses, then our heritability estimate will drop quite a bit. You can’t buy glasses if they don’t exist, after all, no matter how bad your eyesight is. But the fact that glasses themselves are a recent artifact of particular human cultures does not change the fact that, within those populations, wearing glasses is heritable.
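Here is a toy version of that point, in the same hedged spirit as above (the threshold, the 30% access rate, and the crude variance-explained estimate are all invented for illustration): the underlying nearsightedness is the same in both runs; only access to glasses changes, and the measured heritability of glasses-wearing changes with it.

```python
# Toy illustration: heritability of "wears glasses" depends on the population measured.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

genetic_liability = rng.normal(0, 1, n)                        # genetic tendency toward nearsightedness
nearsighted = genetic_liability + rng.normal(0, 0.3, n) > 1.0  # mostly-genetic threshold trait

def variance_explained_by_genes(has_access):
    wears_glasses = nearsighted & has_access   # you only wear glasses if you can get them
    # crude stand-in for heritability: squared correlation between genes and the observed trait
    return np.corrcoef(genetic_liability, wears_glasses)[0, 1] ** 2

print(variance_explained_by_genes(np.ones(n, dtype=bool)))  # everyone can buy glasses: higher
print(variance_explained_by_genes(rng.random(n) < 0.3))     # only ~30% can: much lower
```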

“Heritability” does not depend on whether there is (or we know of) any direct mechanism for a gene to code for the thing under study. It is only a statistical measure of genetic variation that correlates with the visible variation we’re looking at in a population.

I hope this helps.

Everyone’s using “social construct” wrong

Well.

Dr. Seers is close.

A “social construct”–in the context of groups of people–is just a stereotype. We’ll call it an “idealized version.” We learn this idealized version by interacting with many individual instances of a particular type of thing and learning to predict its typical behaviors and characteristics.

Suppose I asked you to draw a picture of a man and woman. Go ahead, if you want; then you can compare it to the draw-a-man test.

Out in reality, there are about 7 billion men and women; there is no way you drew someone who looks like all of them. Chances are you drew the man somewhat taller than the woman, even though in reality, there are millions of men and women who are the same height. You might have even drawn hair on the figures–long hair for the woman, short for the man–and some typical clothing, even though you know there are many men with long hair and women with short.

In other words, you drew an idealized version of the pair in order to make it clear to someone else what, exactly, you were drawing.

Our idealized pictures work because they are true on average. The average woman is shorter than the average man, so we draw the woman shorter than the man–even though we know perfectly well that short men exist.

Once an ideal exists, people (it seems) start using artificial means to try to achieve it (like wearing makeup,) which shifts the average, which in turn prompts people to take more extreme measures to meet that ideal.

This may lead to run-away beauty or masculinity trends that look completely absurd from the outside, like foot binding, adult circumcision rituals, or peacocks’ tails. Or breasts–goodness knows why we have them while not nursing.

Our idealized images work less well for people far from the average, or who don’t want to do the activities society has determined are necessary to meet the ideal.

Here’s an interesting survey of whether people (in this case, whites) consider themselves masculine or feminine, broken down by political orientation.

“In General, would you describe yourself as…”

The same trend holds for women–conservative women are much more likely to consider themselves to be very feminine than liberal women. Of course, ideology has an effect on people’s views, but the opposite is probably also true–people who don’t feel like they meet gender ideals are more likely to think those ideals are problematic, while people who do meet them are more likely to think they are perfectly sensible.

And this sort of thinking applies to all sorts of groups–not just men and women. Conservatives probably see themselves as better encapsulating the ideal of their race, religion, nationality (not just American conservatives, but conservatives of all stripes,) while liberals are probably more likely to see themselves as further from these ideals. The chief exceptions are groups where membership is already pre-determined as liberal, like vegetarians.

This may also account for the tendency people have, especially of late, to fight over certain representations. An idealized representation of “Americans” may default to white, since whites are still the majority in this country, but our growing population of non-whites would also like to be represented. This leads to pushback against what would be otherwise uncontroversial depictions (and the people who fit the ideal are not likely to appreciate someone else trying to change it on them.)

Why do People believe in Conspiracies?

What happens when one’s beliefs come in conflict with reality? Not a small conflict, like the shops closing earlier than expected, but a massive conflict, such as believing that a non-existent conspiracy is out to get you.

Both leftists and rightists have their pet conspiracies. I have conspiracy theories. Every now and then, a conspiracy theory turns out to be true, but usually they aren’t.

Here’s an interesting example of a non-political conspiracy theory: Obsessed Benedict Cumberbatch Fans Tried to Have Me Fired:

It started, as so many online flaps do, with a thoughtless tweet. A starstruck friend and I had bumped into the popular actor Benedict Cumberbatch and his pregnant wife, and I made a faintly ironic tweet about it. …

Then the replies started. “How do you know it was his wife?” “What’s his wife like?”

Then, “SHE’S NOT PREGNANT.“ …

Members of the self-named “Skeptics” (a group of exclusively female Cumberbatch fans who believe that his wife is, variously: a prostitute, a hired PR girlfriend, a blackmailer, a con artist, a domestic abuser, mentally ill, and apparently the most brilliant criminal mastermind of all time, and that the marriage, his wife’s pregnancy, and very existence of their child have all been faked in a wide-ranging international conspiracy orchestrated by a 30-something British opera director in an attempt to force a naïve and helpless movie star to pretend to be married to her) had discovered me, and they were not impressed.

These sorts of fans are probably either 14 years old or actually low-level mentally ill.

In a way, I suspect that mental illness is far more common than we generally acknowledge.

If we define mental illness in evolutionary terms as something that interferes with survival and reproduction, then it is relatively rare. For example, depression–one of the most common mental illnesses–doesn’t interfere with female fertility, and at least in some studies, neuroticism is positively associated with having more children.

By contrast, if we define mental illness as including any significant disconnect from reality, then large swaths of people may be ill. People who are convinced that movie stars’ wives are fake, for example, may be perfectly adept at getting pregnant, but they are still delusional.

Here is another conspiracy theory: The Fetid, Right-Wing Origins of “Learn to Code”:

Last Thursday, I received the news that the HuffPost Opinion section—where I’d been opining on a weekly basis for a few months—had been axed in its entirety. … Dozens of jobs were slashed at HuffPost that day, following a round of layoffs at Gannett Media; further jobs were about to be disappeared at BuzzFeed. …

Then the responses started rolling in—some sympathy from fellow journalists and readers, then an irritating gush of near-identical responses: “Learn to code.” “Maybe learn to code?” “BETTER LEARN TO CODE THEN.” …

On its own, telling a laid-off journalist to “learn to code” is a profoundly annoying bit of “advice,” a nugget of condescension and antipathy. … the timing and ubiquity of the same phrase made me immediately suspect a brigade attack. My suspicions were confirmed when conservative figures like Tucker Carlson and Donald Trump Jr. joined the pile-on, revealing the ways in which right-wing hordes have harnessed social media to discredit and harass their opponents.

So the journalist does some deep sleuthing, discovers that people on 4Chan are talking about telling journalists they should learn to code, and decides that the entire thing is some coordinated troll attack for no other reason than trolls are gonna troll. Just like some movie stars inexplicably have fake girlfriends, so people on 4Chan inexplicably hate journalists.

Related: The Death of a Dreamer:

The day before the conference, Heinz had apparently been told he would be on for ten minutes rather than the three he’d been planning. To fill some of the time at the end, he decided to speak briefly about some of companies he’d partnered with who’d be using Cambrian Genomics technology. Welcoming one of these partners onstage, Gilad Gome of Petomics, he talked about the idea of changing the smell of faeces and gastric wind and using it as an alert that a person was unwell. “When your farts change from wintergreen to banana maybe that means you have an infection in your gut,” he said. He introduced Sweet Peach as a similar project. “The idea is to get rid of UTIs and yeast infections and change the smell of the vagina through probiotics,” he said. …

“These Startup Dudes Want to Make Women’s Private Parts Smell Like Ripe Fruit” ran the headline at Inc.com later that day. … Soon, the Huffington Post picked it up: “Two Science Startup Dudes Introduced a New Product Idea this Week: A Probiotic Supplement that Will Make Women’s Vaginas Smell Like Peaches.” Gawker called it a “waste of science” and said Sweet Peach “sounds like a C-list rom-com with a similarly retrograde view on the priorities of the contemporary human female.” Then, Inc.com weighed in again: “Its mission, apparently hatched by a couple of 11-year-old boys still in the ‘ew, girl cooties’ stage, is to make sure women’s vaginas smell ‘pleasant.’” Similarly negative stories began appearing in major news sources such as Salon, Buzzfeed, the Daily Mail and Business Insider.

Long story short, all of the negative publicity resulted in public ostracism in his real life; funding for his company dried up; the company crashed; and he committed suicide.

Shit like this is why so many people hate journalists at magazines like HuffPo.

HuffPo journalists apparently think it’s fine to lie about a guy’s company and drive him to suicide, but think it is very concerning that some assholes told them to “learn to code.” (That said, a bullying campaign targeted at a bunch of people who just lost their jobs might also push someone over the edge to suicide.)

Over in reality land, the learn-to-code meme is far bigger than 4Chan and stems from society’s generalized attempt to replace outsourced manufacturing and other blue-collar labor with white collar jobs like coding. Earning a degree in computer science is, however, outside both the cognitive and physical resources of most laid-off factory workers. Indeed, as the information revolution progresses and society grows more complex, it is not unreasonable to expect that many people will simply not be smart enough to keep up. These are the losers, and there is nothing to be done for them but eternal bread and circuses, welfare and soma.

They commit suicide a lot.

It’s tempting to claim that being so out of touch with mainstream culture that you believe the “learn to code” meme sprang up ex nihilo is part of why these journalists got fired, but it’s far more likely they were just the latest victims of the contraction of print media that’s been going on for two decades.

People believe many other things that defy logic. The QAnoners fall more into the non-functional loony category, but the also-fanciful Russia Conspiracy is widely believed by otherwise levelheaded and normal liberals. The usually not too insane NY Times just ran an article claiming that, “As soon as black women could afford to buy mink coats, white society and white women said fur was all wrong.” Whew. There’s a lot implied in that statement.

(While I can’t tell you what people in New York think of black women wearing fur, I can tell you that around here, the only concern is for the fur.)

And there are many conservatives who believe an equal number of silly things about vast conspiracies–be they run by the Jews or the Gays or whomever–but in general, conservative conspiracy theories don’t get as much attention from reasonable people. Conservative conspiracies are low-class.

Take, for example, the way Alex Jones was deplatformed for getting the families of the Sandy Hook victims harassed. Infowars is considered low-class and disreputable. But The New York Times did the exact same thing to the Covington students and their families, resulting in harassment and death threats for them, yet the NY Times has not been deplatformed.

What makes a conspiracy low or high status, published in the NY Times or on Infowars, believed by people who are otherwise kind of crazy or otherwise fairly sane?

Centrists and moderates tend not to champion political conspiracies, probably because they basically like society the way it is. “There is great big conspiracy to make society a nice place!” is not an argument most people will bother with. People who are further toward the political extremes, however, are dissatisfied with much of the way society is run. These people need an explanation for why society is so awful.

“Satan” is the archetypal explanation. The Evil One leads people into evil, and thus there is sin in the world and we are fallen from our original state of utopian grace. Satan has the rhetorical advantage of generally not being associated with a real person, so people of even moderate persuasions can be convinced to rally against the abstraction of evil, but sometimes people get a bit too worked up and actual people are put in prison for witchcraft or devil worship. Our last serious witch-hunt was in the 1980s, when people became convinced that Satanists were operating an international daycare conspiracy to kidnap, rape, and torture people’s children.

Today’s Pizzagaters are disreputable, but the Satanic Daycare Conspiracy was pushed by completely respectable mainstream media outlets and supported by the actions of actual police, judges, prosecutors, etc. If you lived through the 80s, you’ve probably repressed your memory of this, but it was a totally real conspiracy that actually sent real people to prison.

Today’s atheists have had to invent less demonic adversaries. The far left believes that the world is run by a cabal of evil heterosexual patriarchal cis-gendered white male Christians. The alt-right believes the world is run by a cabal of scheming Jews. Both of these are conspiracy theories. (Moderates occasionally delve into non-political conspiracies, like the ones surrounding famous movie stars or vaccinations.)

These theories provide all-encompassing ways of understanding the world. People are inexplicably mean to you? It must be part of a conspiracy by “them” to “get” you. As people encounter new information, the ideology they already have shapes how they react, either incorporating it as corroborating evidence or discarding it as worthless propaganda put out by their enemies.

Unfortunately, this makes conspiracies difficult to disprove.

A conspiracy will be considered reputable and believed by otherwise sane and level-headed people if it comes from an already trusted source, like the New York Times or 60 Minutes. It is normal to trust a source you already trust. After all, humans, even intelligent ones, are incapable of knowing everything society needs to know to keep functioning. We therefore have systems of trust and verification set up–such as medical degrees–that let us know what other people know so we can draw on their knowledge. If a plumber says that my plumbing is busted, it is probably in my interest to believe them. So it goes all the way up society–so if trusted people on CNN or in the government think Trump colluded with the Russians, then a reasonable person concludes that Trump colluded with the Russians.

A conspiracy will be considered disreputable and will appeal more to mentally unstable people if it requires first rejecting an established, trusted source. It is easy to believe a false thing by accident if someone you trust states it first; it requires much more work to first justify why all of the trusted sources are saying an untrue thing. This is therefore much easier if you are already paranoid, and distrusting everyone around you is usually a bad idea. (But not always.)

Of course this does not tell us how a source becomes trusted in the first place, but it does suggest that a false idea, once spread by a trusted source, can become very pernicious. (Conversely, a true idea, spread by a false source, will struggle.) The dominance of Cultural Marxism in universities may simply be a side effect of leftist conspiracies being spread by people whom society (or universities) see as more trustworthy in the first place.

(I suppose the fact that I usually don’t believe in conspiracy theories and instead believe in the power of evolution–of species, ideas, cities, civilizations, the sexes, families, etc–to explain the world as it is, might be why I generally see myself as a moderate. However, this leaves me with the task of coming up with a conspiracy theory to explain why evolutionary theories are not more widely accepted. “Meta-conspiracy theorist” sounds about right.)

(My apologies if this post is disorganized; it’s late.)

What they Want you to See…

(H/T Matthew Montoya)

I have not seen Captain Marvel, and thus cannot judge it, but I have seen the articles claiming that the only reason people don’t like Captain Marvel is because they’re evil patriarchs who hate female empowerment.

My sans-spoilers review for Alita is here and my massive spoilers reflections on cyborgs and Alita is here. I liked the movie quite a bit.

Ironically, these are both movies featuring female superheroes; if you are an evil patriarch who hates female superheroes, you will presumably hate both movies.

There is nothing deep here, only an observation that the “culture war” is but sound and fury, signifying nothing. 58,000 men did not go on Rotten Tomatoes to review a movie just because they “hate women,” though some of them might have been motivated by humorless scolds lecturing them about how much they hate women, nor did 23,000 people show up to rate Alita just because they something something love empowering female teenagers.

If the only thing you want out of a movie is a chance to show off your politics, then you will miss out on the entire range of human experience that narrative reveals.

And if we could only see the official reviews, we might miss out on a great deal, because it seems that official reviewers are bad at their job.

So much of what passes for modern “politics” is this mere sound and fury; tempests in teapots over great big nothings. I don’t even want to comment on much of it, because it is so pathetic (I just happen to have a strong emotional attachment to Alita, which even I find a bit curious.) The things that pass for “politics” in our modern world are so detached from reality I can’t help but wonder if we are all just being fed bread and circuses to keep us distracted from the things we ought to be doing. If you care about women, go help at your local battered women’s shelter. If you want to help trans people, volunteer with one of the charities that sends letters to incarcerated trans people. If you want to help the poor, volunteer at a soup kitchen or donate clothes and toys to foster kids–or better yet, adopt one. If you want to help people, go outside and HELP someone, but for goodness’ sakes, don’t think that you’re advancing any social cause by watching a movie.

Meanwhile, Asia Review reports that half of the world’s self-made female billionaires are now Chinese. The world is changing and we Americans are off squabbling about our genitals instead of getting out there and DOING SOMETHING.

So get out there.

Trump can’t fire anyone and neither could Tsar Nicholas II

The late reign of the Russian Tsars was marked by their near total inability to exert their will over anything.

At Tsar Nicholas II’s coronation festival:

Before the food and drink was handed out, rumours spread that there would not be enough for everyone. As a result, the crowd rushed to get their share and individuals were tripped and trampled upon, suffocating in the dirt of the field.[39] Of the approximate 100,000 in attendance, it is estimated that 1,389 individuals died[37] and roughly 1,300 were injured.[38] The Khodynka Tragedy was seen as an ill omen and Nicholas found gaining popular trust difficult from the beginning of his reign. The French ambassador’s gala was planned for that night. The Tsar wanted to stay in his chambers and pray for the lives lost, but his uncles believed that his absence at the ball would strain relations with France, particularly the 1894 Franco-Russian Alliance. Thus Nicholas attended the party; as a result the mourning populace saw Nicholas as frivolous and uncaring.

The guy can’t even get out of sports with his uncle:

From there, they made a journey to Scotland to spend some time with Queen Victoria at Balmoral Castle. While Alexandra enjoyed her reunion with her grandmother, Nicholas complained in a letter to his mother about being forced to go shooting with his uncle, the Prince of Wales, in bad weather, and was suffering from a bad toothache.[41]

Russo-Japanese War:

Nicholas’s stance on the war was something that baffled many. He approached the war with confidence and saw it as an opportunity to raise Russian morale and patriotism, paying little attention to the financial repercussions of a long-distance war.[45] Shortly before the Japanese attack on Port Arthur, Nicholas held firm to the belief that there would be no war. Despite the onset of the war and the many defeats Russia suffered, Nicholas still believed in, and expected, a final victory, maintaining an image of the racial inferiority and military weakness of the Japanese.[44]

As Russia faced imminent defeat by the Japanese, the call for peace grew. Nicholas’s mother, as well as his cousin Emperor Wilhelm II, urged Nicholas to negotiate for peace. Despite the efforts, Nicholas remained evasive, sending a telegram to the Kaiser on 10 October that it was his intent to keep on fighting until the Japanese were driven from Manchuria.[44] It was not until 27–28 May 1905 and the annihilation of the Russian fleet by the Japanese, that Nicholas finally decided to sue for peace.[citation needed]

The Duma:

A second Duma met for the first time in February 1907. The leftist parties—including the Social Democrats and the Social Revolutionaries, who had boycotted the First Duma—had won 200 seats in the Second, more than a third of the membership. Again Nicholas waited impatiently to rid himself of the Duma. In two letters to his mother he let his bitterness flow:

A grotesque deputation is coming from England to see liberal members of the Duma. Uncle Bertie informed us that they were very sorry but were unable to take action to stop their coming. Their famous “liberty”, of course. How angry they would be if a deputation went from us to the Irish to wish them success in their struggle against their government.[67]

He can’t even stop people from coming into his country!

Then, of course, there was that little matter with WWI.

The Tsarina, Alexandra, complained that she couldn’t so much as change the scones they were served at tea time. Each detail of the tea service was set, determined by a system of rules and patronage already put into place and now immutable.

I wish I could find now the book that discussed this, but my search skills are failing me. But in short, despite being the ostensible autocratic monarchs of a massive empire, the Tsar and Tsarina were remarkably incapable of altering even the most minor aspects of their lives. Despite titles like autocrat, emperor, tsar, etc., few men rule alone–most monarchs are enmeshed in multiple overlapping systems of authority, from their relatives–the rest of the royalty–to the military, bureaucracy, the local upper class, feudal obligations, rights and privileges, etc.

Even Henry VIII had to resort to inventing his own religion just to get a simple divorce–something we peasants effect with far more ease. Henry’s difficulties stemmed from the fact that his wife, Catherine of Aragon, was daughter of the king and queen of Spain, and the Pope (whose dispensation was needed for a royal divorce) was at the time being held prisoner by Catherine’s nephew, Emperor Charles V.

But Henry did eventually manage.

We might criticize Henry for murdering two of his wives, but England had just emerged from decades of civil war and he knew the importance of producing a clear heir so succession could not be contested and the country would not descend again into war. He was descended from the guys who were ruthless enough to come out on top and he was willing to chop off a few heads if that’s what it took to keep his country safe.

And the product of Henry’s reign was peace; his daughter, Queen Elizabeth I, oversaw England’s golden age.

By contrast, Nicholas II couldn’t produce a viable male heir (hemophiliacs are right out). Alexandra’s failure resulted in neither divorce, nor a rupture with the Orthodox Church, nor execution (had any of Henry’s wives associated with the likes of Rasputin, their heads would have been off.) He couldn’t even get out of frivolous amusements with his uncle.

It’s not that lopping off Alexandra’s head would have saved the Russian Tsars, but that having a system with enough flexibility that the Tsar could actually make important decisions–and leaders capable of using said system–might have.

Meanwhile in America, it amazes me that Trump is not capable of simply firing anyone in the executive branch he so desires–including the entire executive branch. After all, Trump is the head of the executive branch; they answer to him. If Trump cannot fire them, who can? How can bad actors be removed from the executive branch?

Take the incredible recent 60 Minutes interview with McCabe, a former FBI official who was fired for conspiring to overthrow President Trump during the election:

Tonight you will hear for the first time from the man who ordered the FBI investigations of the President. Former acting FBI director Andrew McCabe is about to describe behind the scenes chaos in 2017, after Trump fired FBI director James Comey. In the days that followed, McCabe says that law enforcement officials discussed whether to secretly record a conversation with the president, and whether Mr. Trump could be removed from office by invoking the 25th amendment.

Who the fuck does this McCabe asshole think he is? The power to impeach lies with Congress, not the FBI. The FBI is part of the executive branch. It doesn’t even make sense for the executive branch to investigate its own head, much less try to oust a sitting president for firing someone.

That’s how the entire CHAIN OF COMMAND works.

After Comey was fired, McCabe says he ordered two investigations of the president himself. They asked two questions. One, did Mr. Trump fire Comey to impede the investigation into whether Russia interfered with the election. And two, if so, was Mr. Trump acting on behalf of the Russian government.

The media keeps trotting out a line–they’ve been trotting this out since before the election–that Trump needs to believe the intelligence on Russia. But nobody–outside of a few folks inside the intelligence service itself and perhaps Trump–gets to see the actual evidence on the matter, because it’s all “classified.” And frankly, I don’t think they have any evidence. Because it’s not real.

Remember Iraq?

If you can’t prove any of this, there’s no reason to believe (or not believe) any of it.

Imagine if during the ’08 election, the Republicans had become convinced that Obama was an Islamic foreign agent working together with Muslim countries to subvert America, and the FBI under Bush started an investigation into Obama. (There are Republicans who thought this, but it has always been fringe.) Now imagine that two years later, the media is still insisting that Obama needs to “believe the intelligence agencies” about Saudi interference in the election and that the FBI is trying to secretly wiretap him because he fired the guy who was pushing the “investigation” of his supposed links to Osama bin Laden.

Would you not think that the FBI had gone a bit insane?

Whether you like Trump or not is beside the point.

There is simply no accountability here for the FBI’s behavior. The FBI is pushing whatever harebrained conspiracy it wants, and if Trump tries to do anything to rein them in, they threaten him with “obstruction of justice” and threaten to team up with Congress to get him impeached.

Even if you don’t believe in democracy, you may still be concerned that random guys in the FBI are trying to run the country.

Remember, in the midst of the destruction of the Russian regime, the best the royalty could manage was murdering an annoying monk. They couldn’t save themselves–or their country–from disaster.

What do Terrorists Read and Are Tech Companies Suppressing Wages?

First, an interesting article claiming that tech companies invoke artificial labor shortages to justify importing more H1-Bs and keep wages low:

That study was a key link in a chain of evidence leading to an entirely different view of the real origins of the Immigration Act of 1990s and the H1-B visa classification. … Their aims instead were to keep American scientific employers from having to pay the full US market price of high skilled labor. They hoped to keep the US research system staffed with employees classified as “trainees,” “students,” and “post-docs” for the benefit of employers. The result would be to render the US scientific workforce more docile and pliable to authority and senior researchers by attempting to ensure this labor market sector is always flooded largely by employer-friendly visa holders who lack full rights to respond to wage signals in the US labor market.

I rate this credible.

Second, an article by Donald Holbrook, “What Types of Media do Terrorists Collect?” [PDF] Unfortunately, the article only looks at religious/historical/political media, and so does not answer the eternal question of whether terrorists prefer Asuka or Rei, or whether their media consumption differs in other ways from other people’s.

The author looked at media collected by ten Islamic terrorists in, I believe, Britain. It would be interesting to compare these collections to those of IRA terrorists and people of similar backgrounds who didn’t commit terrorism–maybe someone can do a follow-up study on the matter.

So what media do they consume?

Holbrook found, first of all, that most of their media is pretty innocuous–things like 17-part audiobook series on some historical topic. (Audio–rather than written or video–media predominated, but that may not hold in the future with YouTube videos now quite easy to produce.) Only a small percent of the media was coded as “extreme” (that is, advocating violence)–even terrorists don’t spend all of their time reading about how to build bombs.

A few items were consumed by multiple people (this was generally more extreme media, which probably just exists in much lower quantities,) but most of the media was of sufficient variety that different people read different things.

Most of it was in English, since the terrorists speak English. The author expressed some concern that translations of much older religious material were not entirely accurate, but also noted that the terrorists possessed a fair amount of religious/historical commentaries that expressed counter-extremist messages.

So what can we conclude from this?

  1. It seems unlikely to me that radicalization is simply due to exposure to extreme material, since most of what they consumed was mild. It seems more likely that people who are prone to radicalization seek out more extreme material.
  2. However, it is possible that a strong sense of historical or religious identity is an important part of radicalization–most people don’t listen to 17-part series on obscure religious history topics.
  3. People who live in Britain but have a strong identity as something other than British are probably more likely to engage in anti-British terrorism.
  4. The internet/modern technology have increased the availability of historical/foreign documents, especially in translation, allowing for people to communicate across nations and through time in ways that were much more difficult and limited before.

#4 is, I think, quite important–across a range of different human activities, not just radicalization. I think the increased availability of printed material in the early modern period allowed for the spread of the European witchcraft hysteria, for example, as the gullible public eagerly consumed pamphlets purporting to report on heinous crimes of witchcraft occurring in neighboring towns.

Increased literacy probably also went hand-in-hand with the Protestant Revolution, which emphasized the importance of people reading the Bible for themselves in order to have a personal relationship with God–something that was impossible before the era of relatively cheap Bibles.

This, of course, launched years of religious warfare that scourged the European continent and led to a lot of people being burned at the stake, at least until people mellowed and decided religious differences weren’t that big a deal.

Today, changes in media availability and ease of communication are changing how Westerners think about morality. They may also be changing how non-Westerners approach the world–but not necessarily in the same ways.

Unsurprisingly, this study contradicts the common claim that terrorists aren’t religiously motivated or aren’t practicing “true Islam.” Of course, I have yet to see anyone, ever, admit to practicing a false version of a religion. Everyone believes that they are practicing the true version (or the true lack of a version, in the case of atheists,) and that everyone else is practicing a false version. Of course I also think terrorists have got religion wrong, but that doesn’t mean they aren’t practicing it to the best of their abilities–and of course, they think I’m doing it wrong.

But the fact that these folks are religiously motivated is undeniable–they definitely consume far more religious media than the average person.