Do Chilblains Affect Blacks More than Whites?

toes afflicted with chilblains

While tromping through a blizzard, seeking insight into circum-polar peoples, I discovered a condition called chilblains. The relevant Wikipedia page is rather short:

Chilblains … is a medical condition that occurs when a predisposed individual is exposed to cold and humidity, causing tissue damage. It is often confused with frostbite and trench foot. Damage to capillary beds in the skin causes redness, itching, inflammation, and sometimes blisters. Chilblains can be reduced by keeping the feet and hands warm in cold weather, and avoiding extreme temperature changes. Chilblains can be idiopathic (spontaneous and unrelated to another disease), but may also be a manifestation of another serious medical condition that needs to be investigated.

The part they don’t mention is that it can really hurt.

The first HBD-related question I became interested in–after visiting a black friend’s house and observing that she was comfortable without the AC on, even though it was summer–is whether people from different latitudes prefer different temperatures. It seems pretty obvious: surely people from Yakutsk prefer different temperatures than people from Pakistan. It also seems easy to test: just put people in a room and give them complete control over the thermostat. And yet, I’d never heard anyone discuss the idea.

Anyway, the perfunctory Wikipedia page on chilblains mentioned nothing about racial or ethnic predisposition to the condition–even though the Eskimo (Inuit) surely seem relevant, given their genetic admixture from ice-age Neanderthals and Denisovans:

“Using this method, they found two regions with a strong signal of selection: (i) one region contains the cluster of FADS genes, involved in the metabolism of unsaturated fatty acids; (ii) the other region contains WARS2 and TBX15, located on chromosome 1.” …

“TBX15 plays a role in the differentiation of brown and brite adipocytes. Brown and brite adipocytes produce heat via lipid oxidation when stimulated by cold temperatures, making TBX15 a strong candidate gene for adaptation to life in the Arctic.” …

“The Inuit DNA sequence in this region matches very well with the Denisovan genome, and it is highly differentiated from other present-day human sequences, though we can’t discard the possibility that the variant was introduced from another archaic group whose genomes we haven’t sampled yet,” Dr. Racimo said.

The scientists found that the variant is present at low-to-intermediate frequencies throughout Eurasia, and at especially high frequencies in the Inuits and Native American populations, but almost absent in Africa.

Sub-Saharan Africans have their own archaic admixture, but they have little to no admixture from ice-age hominins–which is probably good for them, except for those who’ve moved further north.

Imagine my surprise upon searching and discovering very little research on whether chilblains disproportionately affect people of different races or ethnicities. If you were a dermatologist–or a genetically prone person–wouldn’t you want to know?

So here’s what I did find:

The National Athletic Trainers’ Association Position Statement on Cold Injuries notes:

Black individuals have been shown to be 2 to 4 times more likely than individuals from other racial groups to sustain cold injuries. These differences may be due to cold weather experience, but are likely due to anthropometric and body composition differences, including less-pronounced CIVD, increased sympathetic response to cold exposure, and thinner, longer digits.3,6

I think CIVD = cold-induced vasodilation.

The Military Surgeon: Journal of the Association of Military Surgeons of the United States, Volumes 36-37, states:

[scanned pages not reproduced]

The text continues with descriptions of amputating rotting feet.

A PDF from the UK, titled “Cold Injury,” notes:

[chart: incidence of cold injuries by climate]

Notice that the incidence of chilblains is actually lower in extremely cold places than in moderately cold places–attributed here to people in those places being well-equipped for the cold.


Finally I found a PDF of a study performed, I believe, by the US Military, Epidemiology of US Army Cold Weather Injuries, 1980-1999:

[table: US Army cold-weather injury rates by race and sex]

While I would really prefer to have more ethnic groups included in the study, two will have to suffice. It looks like trench foot may be an equal-opportunity offender, but chilblains, frostbite, and other cold-related injuries attack black men (at least in the army) at about 4x the rate of white men, and black women 2x as often as white women (but women in the army may not endure the same conditions as men in the army.)

On a related note, while researching this post, I came across this historic reference to infectious scurvy and diabetes, in the Journal of Tropical Medicine and Hygiene, Volumes 4-5 (published in 1902):

[scanned excerpt from the journal]

Note: this is why it is important to discard bad theories after they’ve been disproven. Otherwise, you kill your scurvy victims by quarantining them instead of giving them oranges.

Tesla vs. Edison

... and fight!

It has become popular of late, especially on the left, to love Tesla and hate Edison. (Warning: that is a link to the Oatmeal, which is very funny and will suck up large quantities of your time if you let it, but if you aren’t familiar with the leftist hatred of Edison and valorization of Tesla, it’s a necessary read.)

Edison (1847–1931) was an American-born (son of a Canadian war refugee of Dutch descent) autodidact, inventor, and businessman who was awarded over a thousand patents. His most important inventions (or inventions produced by his lab) include the first actually useful lightbulb, the phonograph, the first movie camera and a device to view the movies on, the electrical grid necessary to power the lightbulb, the movie studio necessary to make the films for people to watch, and the scientific research lab.

He was friends with Henry Ford; he was also a community volunteer, deaf, and a general humanitarian who abhorred violence and prided himself on having never invented an offensive weapon.

His worst mistake appears to have been not realizing what business he was in during the “War of the Currents”: Edison thought he was in the lightbulb-selling business, and since he had invented a lightbulb that ran on DC, he wanted everyone to use DC. He also seems to have been genuinely concerned about the high voltages used by AC, but DC drops off too quickly to be used in non-urban areas; electrifying the whole country required AC. Edison not only lost the Currents War, but also got kicked out of the company he’d founded by his stockholders. The company’s name was later changed to General Electric.

His political views were fairly common for his day–he advocated the populist position on abolishing the gold standard, tax reform, and making loans interest free to help farmers. Religiously, he was basically a GNON-believing deist. He preferred silent films over “talkies” due to being deaf, and had six children, three of whom went into science/inventing, one with a degree from Yale and one from MIT.

The idea that Edison was “merely” a businessman or CEO is complete bollocks. He was not only a brilliant inventor, but also understood how his inventions would be used and created the systems–both human and mechanical–necessary to bring them to full fruition.

Edison’s lab in Menlo Park


Tesla (1856-1943) was a Serb born in Croatia back when Croatia was part of the Austrian empire. By all accounts, he was exceedingly brilliant. His father was a priest and his mother was the daughter of a priest, but he received a scholarship to the Austrian Polytechnic University, where he burned like a meteor for his first year, earning the highest grades possible in 9 subjects (almost twice the required course load.) In his second year, he became addicted to gambling, then gambled away his tuition money in year three and forgot to study for his finals. He flunked out and ran away.

A couple of years later, his family raised money to send him to university again, which was another fiasco, since Tesla didn’t have training in two of the required subjects and so couldn’t actually attend.

Nevertheless, Tesla managed to get work at a telegraph company and was eventually invited to the US to work under Edison. Here he did excellent work, but quit over a rather stupid-sounding misunderstanding about pay, wherein Tesla expected to be paid far more for an invention than Edison had the funds to pay anyone. Edison offered a raise instead, but Tesla decided to strike out on his own.

Tesla attempted to start a business, which ended badly (it sounds like it went south because he wasn’t focusing on the stated goals of the company,) and left him a penniless ditch-digger.

He then hit on a series of successes, including the polyphase induction motor, which ended with him quite handsomely employed by one of Edison’s competitors, Westinghouse, but even here he had difficulties getting along with his co-workers. Eventually it seems he established his own lab and convinced investors to give him $100,000, which he promptly spent on more lab equipment instead of the new lighting system he’d promised. His lab was later sold and torn down to pay off debts.

Tesla received yet another major investment, $150,000 to build a wireless telegraph facility, but appears to have blown the money on stock market speculation. He did manage to finish the project, though without any more funds from his now very jaded investors, but eventually he had to sell the building, and it was demolished.

Many of Tesla’s inventions were clearly brilliant and far ahead of their time. Others were delusions, like his mechanical oscillator: Tesla claimed it nearly brought down the building; Mythbusters built one themselves, and it did no such thing.

There is a kind of brilliance that slides easily into madness, and Tesla’s was clearly of this sort. He was too adept at pattern matching (he could do calculus in his head) to sort out real patterns from ones he’d dreamed up. He never married, but once fell in love with a pigeon at the park, feeding it daily and spending over $2,000 on it when its wing was injured.

In his personal life, he was extremely rigid–working and eating at the exact same times every day, eating a very restricted diet, and wearing a fastidiously neat and regimented wardrobe. He was extremely thin and slept very little–perhaps only 2 hours a day. (There are vanishingly few people in the world who actually do function like this.) He was critical and harsh toward people who didn’t meet his standards, like fat people or secretaries whose clothes he thought were insufficiently attractive. Despite not having any children of his own, he believed the unfit should be sterilized and the rest of the population coerced into a selective breeding program. He also said some unflattering things about Edison upon the man’s death, which is kind of rude.

To prevent him from sinking further into poverty, his former employer, Westinghouse, took pity on him and started paying his hotel bills (Tesla seems never to have thought of living in a house). Tesla spent much of his final years claiming to have built a “Death Ray” and claiming that various thieves had broken into his hotel room to steal it.

Upon his death in 1943, the government seized all of his belongings just in case there were actual Death Rays or other such inventions in there that the Nazis might try to steal. The box with Tesla’s Death Ray turned out to have nothing more than an old battery inside. The investigator concluded:

“[Tesla’s] thoughts and efforts during at least the past 15 years were primarily of a speculative, philosophical, and somewhat promotional character often concerned with the production and wireless transmission of power; but did not include new, sound, workable principles or methods for realizing such results.”

To be frank, I’ve talked to homeless schizophrenics who sound a lot like Tesla; the line between correct pattern matching and incorrect pattern matching is, at times, easily crossed.


The modern habit of shitting on Edison and glorifying Tesla stems from the tendency to see Edison as a stereotypically American businessman who wickedly and cunningly stole ideas from smarter people to build up his own wealth and reputation. It feeds into the notion that Americans (white Americans, especially) have built nothing of their own, but stolen all of their wealth and a great many of their ideas from others. Here Tesla–attractive, urbane, brilliant, and most of all, not tainted by the blight of having been born in America–gets to stand in for the usual victimized classes.

Ironically, Edison’s political beliefs line up with the Progressives of his day–that is, socialists/liberals like Teddy Roosevelt and Woodrow Wilson. Tesla, at least as far as the Wikipedia describes any of his beliefs, favored Nazi-style forced sterilization and eugenics. In daily life, Tesla may have been a nicer person than Edison (it is rather difficult to tell from Wikipedia articles what people were like personally,) but I question a left that denigrates one of their own Progressives while upholding a man whose political beliefs are, at best, anathema to their own.

Regardless, Tesla’s failures were not Edison’s fault. Edison may have screwed him on pay, but he didn’t gamble away Tesla’s tuition money, make him fail his classes, or convince him not to marry. Edison didn’t make him blow his investment money on the stock market or wander around NYC at all hours of the night, feeding pigeons.

Edison, deaf since childhood, didn’t have half the advantages handed to him that Tesla had. He had all of three months of schooling; no one ever sent him to university or gave him a scholarship to waste. He may not have been as smart as Tesla, but he was still an intensely intelligent man, and adept at the business side of the operation, without which no research could get done. Without funding, you don’t have a lab; no lab, no research. Humans do not live in isolation; someone has to do the inglorious work of coordinating things so that other people can reap the benefits of a system set up for them to work in.

Ultimately, Tesla was a brilliant man who should not have been allowed to run his affairs. He needed the structure of a boss, a wife, parents, family, etc., to keep him on track and stop him from doing idiotic things like gambling away his tuition money.

Familial supervision during college could have ensured that he graduated and set him on the path toward a tenured position. Perhaps he would have rubbed shoulders with the likes of Einstein and Curie at the Solvay Conference. A boss would have ensured that the strategic, business ends of things–the ends Tesla had no great talent for–got done, leaving Tesla to do the things he did best, and to reach far more of his full potential. (In this regard, Edison had advantages Tesla lacked–a wife, family, and a country he had grown up in.) But Tesla was too rigid to submit to someone of inferior intellect (real or perceived), and his family back in Europe was too far away to help him. Loneliness is madness, for humans are social animals, and so brilliant Tesla died alone, poor, and in love with a pigeon.

Tesla’s wireless telegraph tower, 1904

Just imagine what Edison and Tesla could have created had they put their animosity aside and worked together.

Part 2 coming soon.


Women in Science–the Bad Old Days

Once we were driving down the highway when my husband said, “Hey, a Federal Reserve Note just flew across the road.”

Me: I think you have been reading too many finance blogs.


Oh look, Silver Certificates:

[images: the 1896 $5 and $1 Silver Certificates] (Hey, did you know we still have two-dollar bills?)


These bills, from the so-called “Education Series,” were printed in 1896 and feature, rather prominently, women. The $1 bill has Martha (and George) Washington. The other bills feature women as allegories of science, history, electricity, commerce, manufacturing, and you know, I can’t really tell if the steam and electricity children are supposed to be male or female.

If someone wants to put women on money, I totally support bringing back these bills, because they’re gorgeous.

There’s a certain sadness in looking at these and thinking, “Gosh, maybe people in the 1800s really were smarter than us.” Today, the five dollar bill would offend too many people (it has a breast on it!) and couldn’t get printed. We’ve become Philistines.

There’s also a sense of, “Wait, are you sure this is the bad old days of women’s oppression, when people thought women were dumb and couldn’t handle higher education and shit?” Why would people who think women are dumb use women to illustrate the very concept of “science”?

Here’s a painting of MIT’s Alma Mater (Latin for “Nourishing Mother,”) finished in 1923:

[photo of the mural]

(Sorry it’s a crappy photo. I couldn’t find any good photos.)

“Alma Mater,” of course, is used synonymously with “university.” That is, the university itself (all universities,) is female. From the description:

“The central panel is rigidly symmetrical, with the centrally enthroned Alma Mater approached by two groups of acolytes extending laurel wreaths. The composition deliberately recalls the tradition in Christian art of the ascending Madonna attended by saints and apostles. Alma Mater is surrounded by personifications of learning through the printed page, learning through experiment, and learning through the various branches of knowledge. They hover above the Charles River Basin, with a spectral hint of the MIT buildings in the background.”

Here’s a detail:

[detail of the mural]

Unfortunately, I haven’t found good photos of the side paintings, but they sound dramatic:

“The two side panels … bring the elevated scene down to earth with trees that appear to grow straight up from the floor. Unexplained spectral figures glide through this grove. … The right panel, which has been identified as Humanity Led by Knowledge and Invention depicts a mother and children of varying ages progressing from Chaos to Light, accompanied by cherubs bearing the scales of Justice. On the left, the most dark and dramatic mural squarely faces the ethical challenge that has confronted science from the outset. The Latin inscription (from Genesis) in the roundel spells out: “Ye Shall Be As Gods Knowing Good and Evil.” The lab-coated scientist is crowned by a figure said to be Hygeia (goddess of Health). He stands between two giant jars containing beneficent and malevolent gasses, symbolizing the constructive and destructive possibilities unleashed with every new discovery. With the horrors of the First World War still fresh, soldiers and diplomats gather at the Council table of the World. Dogs of war lurk near evil gasses, while Famine threatens the background. The strangely out-of-scale, dark colossal head within the shadow of the Tree of Knowledge is said to represent Nature; her relation to the rest of the drama is (perhaps deliberately) unclear.”

If you squint, you might be able to make them out:

[photo of the full mural]

Before art went to shit, the world was full of lovely paintings of things like “Liberty leading the People” or “The Arts and Sciences,” using allegorical human forms that relied upon people’s understanding and knowledge of ancient Greek mythology–not so ancient when people were actually reading it. I suspect there are so few good photos of this painting because people forget, when surrounded by splendor, that splendor is no longer normal.

This habit of using women as allegorical figures to represent science and learning goes back hundreds, if not thousands of years:

These guys thought women were dumb?
12th century illustration of the Seven Liberal Arts: Grammar, Logic, Rhetoric, Arithmetic, Geometry, Music Theory, and Astronomy

The “Liberal Arts” did not originally refer to silly university classes, but to the knowledge thought essential to the education of all free (liber) people, in order to participate properly in civic life. These essential studies were Grammar, Logic, Rhetoric, Arithmetic, Geometry, Music Theory, and Astronomy (one may assume that the functional ability to read is considered a basic prerequisite for learning, not an endpoint in itself as it is in our modern system.) These studies all culminate in their purest expression in Philosophy, the very love of wisdom.

Notice that all of these allegorical figures are women. Did the depiction of women as the purest ideal of mathematical knowledge make male students doubt their own self-worth and drive them away from serious study?

Then why do people think the inverse?

The trend can be traced back further:

Botticelli, Primavera

Botticelli depicts Spring accompanied by the Greek Graces.

Raphael’s Parnassus

The Greek Muses were goddesses of inspiration for literature, science, and the arts. Different people list them differently, (I doubt there was ever any widespread agreement on exactly what the muses represented,) but the lists generally look like, “epic poetry, history, music, poetry, tragedy, hymns, dance, comedy, and astronomy,” or “music, science, geography, mathematics, philosophy, art, drama, and inspiration.”

[Sarcophagus of the Muses, Louvre]

And who can forget Athena herself, goddess of wisdom and warfare?

[three statues of Athena]

(Take your artistic pick.)

Who needs Nobel Prize Winners, anyway?

You may have noticed that I like science. I also like scientists–heck, let’s expand this to most of STEM. Good folks.

Scientists tend to be quiet, unassuming folks who get on with the business of making the world a better place by curing cancer, inventing airplanes, and developing the germ theory of disease.

I don’t like it when political ideas try to dictate science. It was bad enough when the Soviet Union tried it. (And Maoist China–remember that exciting time when Mao declared that the concept of diminishing returns was bourgeois capitalist lies, and that just planting more seeds in your fields would result in more crops, and then millions of people died? Fun times!)

Sometimes scientists say or think unpopular things, like that humans evolved from apes or that some human populations have lower IQs than others. Or that women cry easily or that Global Warming is real.

The mature reaction to someone saying something you find offensive is to make a logical counter-argument. (Or, you know, ignore them.) Indeed, as I’ve said before, one of the beauties of science is that the whole point of it is to disprove incorrect ideas. If there’s an idea floating around in science that you don’t like, well, disprove it with science!

If you can’t, then maybe you’re the one who’s wrong.

Republicans have traditionally been the anti-science side: 49% don’t believe in evolution, versus 37% who do. Throwing Democrats and independents into the equation doesn’t help much–overall, 42% of Americans don’t believe in evolution, versus 50% who believe in some form of evolution (including god-directed evolution):

At least evolution is getting a tiny bit more popular
From Gallup

Unfortunately, a lot of those people who claim to believe in evolution don’t.

For example, according to Gallup (2005), the majority of Americans–68%–believe that men and women are equally good at math and science. Only 10% believe that men have an innate advantage in math and science, and 8% believe that women are superior.

Do you know how depressing this is? I mean, for starters, the question itself is badly worded. Men and women are about equal on average, but men are disproportionately represented at the high end of mathematical ability and at the low end. As I noted yesterday, this is a natural side effect of Y chromosome variation. But for the purposes of doing math and science as a career, which takes rather more than average talent, men do have an innate advantage.
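
The tail effect is easy to see with a little normal-distribution arithmetic. Here’s a minimal sketch in Python–the 10% standard-deviation gap is an illustrative assumption chosen for demonstration, not a measured value:

```python
from scipy.stats import norm

# Two hypothetical groups with the SAME mean but different spreads.
# The 10% SD gap is an illustrative assumption, not a measured value.
MEAN = 100
SD_NARROW, SD_WIDE = 15.0, 16.5

for cutoff in (130, 145, 160):
    narrow = norm.sf(cutoff, loc=MEAN, scale=SD_NARROW)  # fraction above cutoff
    wide = norm.sf(cutoff, loc=MEAN, scale=SD_WIDE)
    print(f"above {cutoff}: wider group overrepresented {wide / narrow:.1f}x")
```

Same average, modestly different spread–and the overrepresentation grows the further out the cutoff sits, which is exactly what matters for careers drawing from the far tail.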

But instead of getting intelligent discussions about these sorts of things, we get people shouting insults and trying to ruin each other’s careers.

This popped up on FB today:

*winces*
Does this count as a microaggression?

“Sexist,” of course, is an insult, akin to saying that you hate women or believe that they are inherently inferior. So according to these people, anyone who thinks that, IDK, men are more aggressive on average because their brains produce more testosterone is a bad person. Never mind that science supports this notion pretty soundly.

(BTW, it’s pretty hard to argue that society’s anti-woman views are nefariously keeping women out of STEM when the majority of people think men and women are equally talented. For that matter, if there’s any group of people that I’ve found to be extremely accepting of and decent toward women, it’s the folks in STEM. Seriously, these guys are super awesome.)

So you may remember that whole kerfuffle in which Tim Hunt–some nobody who’s contributed nothing of worth to humanity except maybe Nobel Prize-winning work in Medicine/Physiology, small stuff on the scale of human achievement–made some comments about women in science, and the entire world spent about 5 minutes losing their collective shit, and then a lot of pictures of female scientists got posted on the internet. (Actually, the pictures are kind of nice.)

Oh, and Tim was forced to resign from some honorary professorship.

“The days that followed saw him unceremoniously hounded out of honorary positions at University College London (UCL), the Royal Society and the European Research Council (ERC).

“Under siege at his Hertfordshire home, he sank into despair.

“‘Tim sat on the sofa and started crying. Then I started crying,’ his wife, Professor Mary Collins (herself a prominent scientist) later recalled. ‘We just held on to each other.’”

When it came to light that Tim Hunt may have just been trying to make a joke–a bad one–the provost at his erstwhile University indicated that, (in The Guardian’s words) “Professor Hunt would not be reinstated, it was impossible for an institution to tolerate someone to whom they had awarded an honorary post, even a 71-year-old Nobel prize winner, expressing views even in jest that so comprehensively undermined its own reputation as a leading supporter of female scientists.”

I am just thrilled, oh so thrilled, that university science departments now see their primary purpose as public works programs for women, rather than, IDK, the pursuit of actual fucking science.

Do you know what happens to your science department when you stop focusing on science and turn it into a pity-festival for women? You end up with a bunch of women who can’t hack it in science. Accept men and women on their merits, and you end up with quality scientists. Accept people based on their qualities other than merit, and you end up with hacks.

BTW, I’m female.

You might think Hunt’s comments were totally silly (in which case, go ahead and ignore them,) but I’ve known couples that started in labs. I don’t think it’s any big secret that people sometimes fall in love with co-workers. Is this a problem? I don’t know. Do women cry more than men? Anecdotal experience says yes.

The intelligent response to Hunt’s comments (if you want to do anything at all,) would have been to document whether or not women cry at a higher rate than men when you criticize their lab work and whether lab romances are a problem–and if gender segregated labs would actually work any better, or end up with their own issues. The unintelligent response is to make a big deal out of how offended you are and try to get someone fired.

So what does Connie St Louis, the female scientist journalist who’s actually not a scientist (The Daily Mail claims that St Louis made up/faked a large chunk of her CV, if you believe anything the Daily Mail prints,) and so probably has less experience running a lab than Hunt does, but never mind, we’re all experts now, have to say about starting the whole firestorm that made Hunt lose his probably not very important honorary position?

“The likes of Richard Dawkins and Brian Cox should focus on taking up the real issue of sexism in science. It is absurd to say that scientists can do and say what they like in the name of academic freedom.”

Let’s read that again. “It is absurd to say that scientists can do and say what they like in the name of academic freedom.”

What else does St Louis have to say?

“…eight Nobel laureates, plus the ubiquitous Richard Dawkins, have come out in support of Hunt. There are over 2,000 signatures on an online petition to reinstate him to his honorary post at UCL. Contrast this with 200+ signatures on a petition that I started calling on the Royal Society to elect its first female president. The Nobel eight made an idiotic attempt to equate the upset caused by Hunt’s ill advised and sexist comments with some kind of “chilling effect” on academics.”

Of course it has a chilling effect. No one wants to get fired. How does a journalist even presume to claim to know what does and doesn’t have a chilling effect on someone else’s profession, when rather respected people in that profession are claiming that chilling effects exist?

Hell, there’s a reason this blog is anonymous, and it’s people like Connie St Louis. But she continues:

“This is an absurd idea and deserves to be outed for what it is, a deeply cynical attempt to say that scientists can do and say what they like. In the name of academic freedom? Is science so special that any old sexist (or for that matter racist) words that they utter are allowed? The answer is and must be a resounding no.”

Free inquiry is dead.

Remember whom to thank when we all die of cancer plague.

Epigenetics

I remember when I first heard about epigenetics–the concept sounded awesome.

Now I cringe at the word.

To oversimplify, “epigenetics” refers to biological processes that help turn on and off specific parts of DNA. For example, while nearly every cell in your body (the main exceptions being sperm and eggs, which carry only half your DNA, and mature red blood cells, which have none) has identical DNA, they obviously do different stuff. Eyeball cells and brain cells and muscle cells are all coded from the exact same DNA, but epigenetic factors make sure you don’t end up with muscles wiggling around in your eye sockets–or as an undifferentiated mass of slime.
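
If it helps, here’s the same idea as a toy data structure: one fixed “genome,” with a per-cell-type layer of switches on top. (A cartoon sketch in Python, not real biology–the gene names are real, but the on/off table is invented for illustration.)

```python
# One genome, identical in every cell.
GENOME = ("crystallin", "rhodopsin", "myosin")  # lens, retina, and muscle genes

# The epigenetic layer: per-cell-type on/off switches over the SAME genome.
# This table is invented for illustration.
EPIGENETIC_SWITCHES = {
    "eye cell":    {"crystallin": True,  "rhodopsin": True,  "myosin": False},
    "muscle cell": {"crystallin": False, "rhodopsin": False, "myosin": True},
}

def expressed_genes(cell_type: str) -> list[str]:
    """Return the genes actually active in a given cell type."""
    switches = EPIGENETIC_SWITCHES[cell_type]
    return [gene for gene in GENOME if switches[gene]]

print(expressed_genes("eye cell"))     # ['crystallin', 'rhodopsin']
print(expressed_genes("muscle cell"))  # ['myosin']
```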

If external environmental things can have epigenetic effects, I’d expect cancer to be a biggie, due to cell division and differentiation being epigenetic.

What epigenetics probably doesn’t do is everything people want it to do.

There’s a history, here, of people really wanting genetics to do things it doesn’t–to impose free will onto it.* Lamarck can be forgiven–we didn’t know about DNA back then. His theory was that an organism can pass on characteristics that it acquired during its lifetime to its offspring, thus driving evolution. The classic example given is that if a giraffe stretches its neck to reach leaves high up in the trees, its descendants will be born with long necks. It’s not a bad theory for a guy born in the mid 1700s, but science has advanced a bit since then.

The USSR put substantial resources into trying to make environmental effects show up in one’s descendants–including shooting anyone who disagreed.

Trofim Lysenko, a Soviet agronomist, claimed to be able to make wheat that would grow in winter–and pass on the trait to its offspring–by exposing the wheat seeds to cold. Of course, if that actually worked, Europeans would have developed cold-weather wheat thousands of years ago.

Lysenko was essentially the USSR’s version of an Affirmative Action hire:

“By the late 1920s, the Soviet political leaders had given their support to Lysenko. This support was a consequence, in part, of policies put in place by the Communist Party to rapidly promote members of the proletariat into leadership positions in agriculture, science and industry. Party officials were looking for promising candidates with backgrounds similar to Lysenko’s: born of a peasant family, without formal academic training or affiliations to the academic community.” (From the Wikipedia page on Lysenko)

In 1940, Lysenko became director of the USSR Academy of Sciences’ Institute of Genetics–a position he would hold until 1964. In 1948, scientific dissent from Lysenkoism was formally outlawed.

“From 1934 to 1940, under Lysenko’s admonitions and with Stalin’s approval, many geneticists were executed (including Isaak Agol, Solomon Levit, Grigorii Levitskii, Georgii Karpechenko and Georgii Nadson) or sent to labor camps. The famous Soviet geneticist Nikolai Vavilov was arrested in 1940 and died in prison in 1943. Hermann Joseph Muller (and his teachings about genetics) was criticized as a bourgeois, capitalist, imperialist, and promoting fascism so he left the USSR, to return to the USA via Republican Spain.

In 1948, genetics was officially declared “a bourgeois pseudoscience”; all geneticists were fired from their jobs (some were also arrested), and all genetic research was discontinued.”  (From the Wikipedia page on Lysenkoism.)

Alas, the Wikipedia does not tell me if anyone died from Lysenkoism itself, say, after their crops failed, but I hear the USSR doesn’t have a great agricultural record.

Lysenko got kicked out in the 60s, but his theories have returned in the form of SJW-inspired claims that the magic of epigenetics explains how any differences in average group performance or behavior are actually the fault of long-dead white people. Eg:

Trauma May be Woven into DNA of Native Americans, by Mary Pember

” The science of epigenetics, literally “above the gene,” proposes that we pass along more than DNA in our genes; it suggests that our genes can carry memories of trauma experienced by our ancestors and can influence how we react to trauma and stress.”

That’s a bold statement. At least Pember is making Walker’s argument for him.

Of course, that’s not actually what epigenetics says, but I’ll get to that in a bit.

“The Academy of Pediatrics reports that the way genes work in our bodies determines neuroendocrine structure and is strongly influenced by experience.”

That’s an interesting source. While I am sure the A of P knows its stuff, their specialty is medical care for small children, not genetics. Why did Pember not use an authority on genetics?

Note: when thinking about whether or not to trust an article’s science claims, consider the sources they use. If they don’t cite a source or cite an unusual, obscure, or less-than-authoritative source, then there’s a good chance they are lying or cherry-picking data to make a claim that is not actually backed up by the bulk of findings in the field. Notice that Pember does not provide a link to the A of P’s report on the subject, nor provide any other information so that an interested reader can go read the full report.

Wikipedia is actually a decent source on most subjects. Not perfect, of course, but it is usually decent. If I were writing science articles for pay, I would have subscriptions to major science journals and devote part of my day to reading them, as that would be my job. Since I’m just a dude with a blog who doesn’t get paid and so can’t afford a lot of journal memberships and has to do a real job for most of the day, I use a lot of Wikipedia. Sorry.

Also, I just want to note that the structure of this sentence is really wonky. “The way genes work in our bodies”? As opposed to how they work outside of our bodies? Do I have a bunch of DNA running around building neurotransmitters in the carpet or something? Written properly, this sentence would read, “According to the A of P, genes determine neuroendocrine structures, in a process strongly influenced by experience.”

Pember continues:

“Trauma experienced by earlier generations can influence the structure of our genes, making them more likely to “switch on” negative responses to stress and trauma.”

Pember does not clarify whether she is continuing to cite from the A of P, or just giving her own opinions. The structure of the paragraph implies that this statement comes from the A of P, but again, no link to the original source is given, so I am hard pressed to figure out which it is.

At any rate, this doesn’t sound like something the A of P would say, because it is obviously and blatantly incorrect. Trauma *may* affect the structure of one’s epigenetics, but not the structure of one’s genes. The difference is rather large. Viruses and ionizing radiation can change the structure of your DNA, but “trauma” won’t.

” The now famous 1998 ACES study conducted by the Centers for Disease Control (CDC) and Kaiser Permanente showed that such adverse experiences could contribute to mental and physical illness.”

Um, no shit? Is this one of those cases of paying smart people tons of money to tell us grass is green and sky is blue? Also, that’s a really funny definition of “famous.” Looks like the author is trying to claim her sources have more authority than they actually do.

“Folks in Indian country wonder what took science so long to catch up with traditional Native knowledge.”

I’m pretty sure practically everyone already knew this.

“According to Bitsoi, epigenetics is beginning to uncover scientific proof that intergenerational trauma is real. Historical trauma, therefore, can be seen as a contributing cause in the development of illnesses such as PTSD, depression and type 2 diabetes.”

Okay, do you know what epigenetics actually shows?

The experiment Wikipedia cites involved male mice trained to fear a certain smell by receiving small electric shocks whenever they smelled it. The offspring of these mice, conceived after the foot-shocking was finished, startled in response to the smell–they had inherited their father’s epigenetic markers that enhanced their response to that specific smell.

It’s a big jump from “mice startle at smells” to “causes PTSD.” This is a big jump in particular because of two things:

1. Your epigenetics change all the time. It’s like learning. You don’t just learn one thing and then have this one thing you’ve learned stuck in your head for the entire rest of your life, unable to learn anything new. Your epigenetics change in response to life circumstances throughout your entire life.

Eg, (from the Wikipedia):

“One of the first high-throughput studies of epigenetic differences between monozygotic twins focused in comparing global and locus-specific changes in DNA methylation and histone modifications in a sample of 40 monozygotic twin pairs. In this case, only healthy twin pairs were studied, but a wide range of ages was represented, between 3 and 74 years. One of the major conclusions from this study was that there is an age-dependent accumulation of epigenetic differences between the two siblings of twin pairs. This accumulation suggests the existence of epigenetic “drift”.”

In other words, when identical twins are babies, they have very similar epigenetics. As they get older, their epigenetics get more and more different because they have had different experiences out in the world, and their experiences have changed their epigenetics. Your epigenetics change as you age.

Which means that the chances of the exact same epigenetics being passed down from father to child over many generations are essentially zilch.

2. Tons of populations have experienced trauma. If you go back far enough in anyone’s family tree, you can probably find someone who has experienced trauma. My grandparents went through trauma during the Great Depression and WWII. My biological parents were both traumatized as children. So have millions, perhaps billions of other people on this earth. If trauma gets encoded in people’s DNA (or their epigenetics,) then it’s encoded in virtually every person on the face of this planet.

Type 2 Diabetes, Depression, and PTSD are not evenly distributed across the planet. Hell, they aren’t even common in all peoples who have had recent, large oppression events. African Americans have low levels of depression and commit suicide at much lower rates than whites–have white Americans suffered more oppression than black Americans? Whites commit suicide at a higher rate than Indians–have the whites suffered more historical trauma? On a global scale, Israel has a relatively low suicide rate–lower than India’s. Did India recently experience some tragedy worse than the Holocaust? (See yesterday’s post for all stats.)

Type 2 Diabetes reaches its global maximum in Saudi Arabia, Oman, and the UAE, which as far as I know have not been particularly traumatized lately, and is much lower among Holocaust descendants in nearby Israel:

From a BBC article on obesity

It’s also very low in Sub-Saharan Africa, even though all of the stuff that causes “intergenerational trauma” probably happened there in spades. Have Americans been traumatized more than the Congolese?

This map doesn’t make any sense from the POV of historical trauma. It makes perfect sense if you know who’s eating fatty Western diets they aren’t adapted to. Saudi Arabia and the UAE are fucking rich (I bet Oman is, too,) and their population of nomadic goat herders has settled down to eat all the cake they want. The former nomadic lifestyle did not equip them to digest lots of refined grains, which are hard to grow in the desert. Most of Africa (and Yemen) is too poor to gorge on enough food to get Type-2 Diabetes; China and Mongolia have stuck to their traditional diets, to which they are well adapted. Mexicans are probably not adapted to wheat. The former Soviet countries have probably adopted Western diets. Etc., etc.

Why bring up Type-2 Diabetes at all? Well, it appears Indians get Type-2 Diabetes at about the same rate as Mexicans, [Note: PDF] probably for the exact same reasons: their ancestors didn’t eat a lot of wheat, refined sugar, and refined fats, and so they aren’t adapted to the Western diet. (FWIW, White Americans aren’t all that well adapted to the Western Diet, either.)

Everybody who isn’t adapted to the Western Diet gets high rates of diabetes and obesity if they start eating it, whether they had historical trauma or not. We don’t need epigenetic trauma to explain this.

“The researchers found that Native peoples have high rates of ACE’s and health problems such as posttraumatic stress, depression and substance abuse, diabetes all linked with methylation of genes regulating the body’s response to stress. “The persistence of stress associated with discrimination and historical trauma converges to add immeasurably to these challenges,” the researchers wrote.

Since there is a dearth of studies examining these findings, the researchers stated they were unable to conclude a direct cause between epigenetics and high rates of certain diseases among Native Americans.”

There’s a dearth of studies due to it being really immoral to purposefully traumatize humans and then breed them to see if their kids come out fucked up. Luckily for us, (or not luckily, depending on how you look at it,) however, humans have been traumatizing each other for ages, so we can just look at actually traumatized populations. There does seem to be an effect down the road for people whose parents or grandparents went through famines, but, “the effects could last for two generations.”

As horrible as the treatment of the Indians has been, I am pretty sure they didn’t go through a famine two generations ago on the order of what happened when the Nazis occupied the Netherlands and 18-22,000 people starved.

In other words, there’s no evidence of any long-term epigenetic effects large enough to create the effects they’re claiming. As I’ve said, if epigenetics actually acted like that, virtually everyone on earth would show the effects.

The reason they don’t is because epigenetic effects are relatively short-lived. Your epigenetics get re-written throughout your lifetime.
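
To put toy numbers on “relatively short-lived”: if a mark survives each generational reprogramming with probability p, its odds of persisting for g generations are p^g. The retention probabilities below are made-up illustrative values, not measurements:

```python
# Toy model: a mark survives each generation's epigenetic reprogramming
# with probability p, so it persists g generations with probability p**g.
# These p values are made up for illustration, not measured.
for p in (0.9, 0.5, 0.1):
    survival = ", ".join(f"{g} gen: {p ** g:.4f}" for g in (1, 2, 5, 10))
    print(f"p = {p}: {survival}")
```

Even with generous retention, the signal washes out within a handful of generations–consistent with the famine studies’ “two generations” finding.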

” Researchers such as Shannon Sullivan, professor of philosophy at UNC Charlotte, suggests in her article “Inheriting Racist Disparities in Health: Epigenetics and the Transgenerational Effects of White Racism,” that the science has faint echoes of eugenics, the social movement claiming to improve genetic features of humans through selective breeding and sterilization.”

I’m glad the philosophers are weighing in on science. I am sure philosophers know all about genetics. Hey, remember what I said about citing sources that are actual authorities on the subject at hand? My cousin Bob has all sorts of things to say about epigenetics, but that doesn’t mean his opinions are worth sharing.

The article ends:

“Isolating and nurturing a resilience gene may well be on the horizon.”

How do you nurture a gene?


There are things that epigenetics does. Just not the things people want it to do.

Scientific Nostalgia

So, I hear the Brontosaurus might return to the rolls of official dinosaurs, rather than oopsies. From Yale mag’s “The Brontosaurus is Back”:

“Originally discovered and named by Yale paleontologist O. C. Marsh, Class of 1860, the “thunder lizard” was later determined to be the same as the Apatosaurus. But European researchers recently reexamined existing fossils and decided that Brontosaurus is in fact a separate species.”

Well, these things happen. I’m glad scientists are willing to revisit their data and revise their assumptions. Of course, I have no idea how much morphological difference is necessary between two skeletons before we start calling them different species, (by any sane metric, would a wolf hound and a chihuahua be considered the same species?) but I’m willing to trust the paleontologists on this one.

The interesting thing isn’t the reclassification itself, which gets down to somewhat dry and technical details about bone sizes and whatnot, but the fact that people–myself included!–have some sort of reaction to this news, eg:

Dinosaur lovers of a certain age are gratified. “I’m delighted,” says geology professor Jacques Gauthier, the Peabody’s curator of vertebrate paleontology and vertebrate zoology. “It’s what I learned as a kid.”

I’ve seen other people saying the same thing. Those of us who grew up with picture books with brontosauruses in them are happy at the news the brontosaurus is back–like finding an old friend again, or episodes of your favorite childhood show on YouTube. Perhaps you think, “Yes, now I can get my kids a book of dinosaurs and share the animals I loved!”

Meanwhile some of us still cling to the notion that Pluto, despite its tiny size and eccentric orbit, really ought to be a planet. Even I feel a touch of anthropomorphizing pity for Pluto, even though I think from an objective POV that the current classification scheme is perfectly sensible.

Pluto is not the first round, rocky body to get named a planet and then demoted: in 1801, Giuseppe Piazzi discovered Ceres, a small, round, rocky body orbiting between Jupiter and Mars.

Finding a planet between Mars and Jupiter was intellectually satisfying on a number of levels, not least of which that it really seems like there ought to be one there. For the next 50 years, Ceres made it into the textbooks as our fifth planet–but by the 1860s, it had been demoted. A host of other, smaller bodies–some of them roundish–had also been discovered orbiting between Mars and Jupiter, and it was now clear that these were a special group of space bodies. They all got named asteroids, and Ceres went down the memory hole.

Ceres is smaller than Pluto, but they have much in common. As scientists discovered more small, Pluto-like bodies beyond Neptune’s orbit, the question of what is a planet revived. Should all non-moon, round bodies (those with enough gravity to make themselves round) be planets? That gets us to at least 13 planets, but possibly dozens–or hundreds–more.

There’s an obvious problem with having hundreds of planets, most of which are miniscule: kids would never learn ’em all. When you get right down to it, there are thousands of rocks and balls of ice and other such things zooming around the sun, and there’s a good reason most of them are known by numbers instead of names. You’ve got to prioritize data, and some sort of definition that would cut out the tiniest round ones was needed. Tiny Pluto, alas, ended up on the wrong side of the definition: not a planet.
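
For the curious, the definition the IAU settled on in 2006 has three tests: the body orbits the sun, is massive enough to be round, and has cleared the neighborhood around its orbit. Here’s a toy encoding of that rule–the rule itself is real, but the code is just a sketch:

```python
from dataclasses import dataclass

@dataclass
class Body:
    name: str
    orbits_sun: bool
    is_round: bool       # massive enough for hydrostatic equilibrium
    cleared_orbit: bool  # gravitationally dominates its orbital zone

def classify(body: Body) -> str:
    """Toy version of the 2006 IAU definition of 'planet'."""
    if not body.orbits_sun:
        return "not a planet (a moon, or not in solar orbit)"
    if not body.is_round:
        return "small solar-system body"
    if not body.cleared_orbit:
        return "dwarf planet"
    return "planet"

for body in (
    Body("Earth", True, True, True),
    Body("Pluto", True, True, False),  # shares its zone with Kuiper Belt objects
    Body("Ceres", True, True, False),  # shares its zone with the asteroid belt
):
    print(f"{body.name}: {classify(body)}")
```

Pluto and Ceres fail the same test, which is why they ended up in the same bin.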

Pluto is, of course, completely unaffected by a minor change in human nomenclature. And someday, like Ceres, Pluto may be largely forgotten by the public at large. In the meanwhile, there will still be nostalgia for the friendly science of one’s childhood.

Comets

I’ve long wondered why comets have such eccentric orbits and come from the far outer reaches of the solar system. Why aren’t there more asteroids with eccentric orbits? Why aren’t there comets in round orbits? Why don’t they generally hang out closer to the sun?

Happily, I think I’ve figured it out. Yes, a comet is a “snowball in space.” But a comet isn’t just formed when liquid water freezes, as it often does on Earth. A comet forms when a body gets so cold that even its gases freeze solid–carbon dioxide, carbon monoxide, methane, even traces of nitrogen and oxygen–along with its water. This only happens very far from the sun–so comets can only form far from the sun. If they formed close in, their gases wouldn’t freeze.
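
A back-of-the-envelope way to see the “only far from the sun” part: a fast-rotating blackbody equilibrates at roughly 278 K divided by the square root of its distance in AU (a standard textbook approximation). The freezing thresholds in the comments are rough vacuum values; real comets are messier:

```python
import math

def equilibrium_temp_k(distance_au: float) -> float:
    """Blackbody equilibrium temperature for a fast rotator with zero albedo."""
    return 278.0 / math.sqrt(distance_au)

# Rough thresholds: water ice sublimates noticeably above ~150 K,
# CO2 ice above ~80 K, CO ice above ~25 K (approximate vacuum values).
for d in (1, 5, 30, 100, 10_000):  # Earth, Jupiter, Neptune, scattered disk, Oort cloud
    print(f"{d:>6} AU: ~{equilibrium_temp_k(d):.0f} K")
```

Water ice alone can survive a few AU out, but the really volatile ices only stay put in the deep freeze–which is why comet factories sit so far from the sun.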

So long as a frozen body stays way out there, away from the sun, we’re not going to see it. It’s only when comets come closer to the sun (say, by getting knocked out of their original orbits) that their gases begin to sublimate under the sun’s glare and they appear as bright, fiery comets in the night sky. Then, if it is lucky, the comet swings back to its frigid neighborhood before it sublimates away entirely.

This explains why the comets we see have such eccentric orbits–the eccentricity allows the comet to freeze, sublimate, and freeze again. Without that orbit, no bright comet.

Is Acne an Auto-Immune Disorder?

Like our lack of fur, acne remains an evolutionary mystery to me.

Do other furless mammals get acne? Like elephants or whales? Or even chimps; their faces don’t have fur. If so, everyone’s keeping it a secret–I’ve never even seen an ad for bonobo anti-acne cream, and with bonobos’ social lives, you know they’d want it. 🙂

So far, Google has returned no reports of elephants or whales with acne.

Now, a few skin blemishes here and there are not terribly interesting or mysterious. The weird thing about acne (IMO) is that it pops up at puberty*, and appears to have a genetic component.

Considering that kids with acne tend to feel rather self-conscious about it, I think it reasonable to assume that people with more severe acne have more difficulty with dating than people without. (Remember, some people have acne well into their 30s or beyond.)

Wouldn’t the non-acne people quickly out-compete the acne-people, resulting in less acne among humans? (Okay, now I really want to know if someone has done a study on whether people with more acne have fewer children.) Since acne is extremely common and shows up right as humans reach puberty, this seems like a pretty easy thing to study–and to detect an effect, if there is one.

Anyway, I totally remember a reference to acne in Dr. Price’s Nutrition and Physical Degeneration, (one of my favorite books ever,) but can’t find it now. Perhaps I am confusing it with Nutrition and Western Disease or a book with a similar title. At any rate, I recall a picture of a young woman’s back with a caption to the effect that none of the people in this tropical locale had acne, which the author could tell rather well, since this was one of those tropical locales where people typically walk around with rather little clothing.

The Wikipedia has this to say about the international incidence of acne:

“Rates appear to be lower in rural societies. While some find it affects people of all ethnic groups, it may not occur in the non-Westernized people of Papua New Guinea and Paraguay.

Acne affects 40 to 50 million people in the United States (16%) and approximately 3 to 5 million in Australia (23%). In the United States, acne tends to be more severe in Caucasians than people of African descent.”

I consider these more “hints” than “conclusive proof of anything.”

Back when I was researching hookworms, I ran across these bits:

“The [Hygiene Hypothesis] was first proposed by David P. Strachan who noted that hay fever and eczema were less common in children who belonged to large families. Since then, studies have noted the effect of gastrointestinal worms on the development of allergies in the developing world. For example, a study in Gambia found that eradication of worms in some villages led to increased skin reactions to allergies among children. … [bold mine.]

Moderate hookworm infections have been demonstrated to have beneficial effects on hosts suffering from diseases linked to overactive immune systems. … Research at the University of Nottingham conducted in Ethiopia observed a small subset of people with hookworm infections were half as likely to experience asthma or hay fever. Potential benefits have also been hypothesized in cases of multiple sclerosis, Crohn’s Disease and diabetes.”

So I got to thinking, if allergies and eczema are auto-immune reactions (I know someone in real life, at least, whose skin cracks to the point of bleeding if they eat certain foods, but is otherwise fine if they don’t eat those foods,) why not acne?

Acne is generally considered a minor problem, so people haven’t necessarily spent a ton of time researching it. Googling “acne autoimmune” gets me some Paleo-Dieter folks talking about curing severe cases with a paleo-variant (they’re trying to sell books, so they didn’t let on the details, but I suspect the details have to do with avoiding refined sugar, milk, and wheat.)

While I tend to caution against over-enthusiastic embrace of a diet one’s ancestors most likely haven’t eaten in thousands or tens of thousands of years, if some folks are reporting a result, then I’d love to see scientists actually test it and try to confirm or disprove it.

The problem with dietary science is that it is incredibly complicated, full of confounds, and most of the experiments you might think up in your head are completely illegal and impractical.

For example, scientists figured out that Pellagra is caused by nutritional deficiency–rather than an infectious agent–by feeding prisoners an all-corn diet until they started showing signs of gross malnutrition. (For the record, the prisoners joined the program voluntarily. “All the corn you can eat” sounded pretty good for the first few months.) Likewise, there was a program during WWII to study the effects of starvation–on voluntary subjects–and try to figure out the best way to save starving people, started because the Allies knew they would have a lot of very real starvation victims on their hands very soon.

These sorts of human experiments are no longer allowed. What a scientist can do to a human being is pretty tightly controlled, because no one wants to accidentally kill their test subjects and universities and the like don’t like getting sued. Even things like the Milgram Experiments would have trouble getting authorized today.

So most of the time with scientific studies, you’re left with using human analogs, which means rats. And rats don’t digest food the exact same way we do–Europeans and Chinese don’t digest food the exact same way, so don’t expect rats to do it the same way, either. An obvious oversight as a result of relying on animal models is that most animals can synthesize Vitamin C, but humans can’t. This made figuring out this whole Vitamin C thing a lot trickier.

Primates are probably a little closer, digestively, to humans, but people get really squeamish about monkey research, and besides, they eat a pretty different diet than we do, too. Gorillas are basically vegan (I bet they eat small bugs by accident all the time, of course,) and chimps have almost no body fat–this is quite remarkable, actually. Gorillas and orangutans have quite a bit of body fat, “normal” levels by human standards. Hunter-gatherers, agriculturalists, and sedentary butt-sitters like us have different amounts, but they still all have some. But chimps and bonobos have vanishingly little; male chimps and bonobos have almost zero body fat, even after being raised in zoos and fed as much food as they want.

Which means that if you’re trying to study diet, chimps and bonobos are probably pretty crappy human analogs.

(And I bet they’re really expensive to keep, relative to mice or having humans fill out surveys and promise to eat more carbs.)

So you’re left with trying to figure out what people are already eating and tinkering with it in non-harmful, non-invasive ways. You can’t just take a bunch of orphans and raise them from birth on two different diets to see what happens. Instead, you get people to fill out questionnaires about what they eat and then see whether they happen to drop dead over the next 40 or 50 years.
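To make the confounding problem concrete, here’s a toy simulation–my own illustration, with made-up variables and numbers, not anything from a real study–of how a questionnaire-based diet study can show a “diet effect” that isn’t there. A hidden health-consciousness trait drives both the reported diet and the death rate, while the diet itself does nothing:

```python
import random

random.seed(42)

def simulate_person():
    # Hidden confounder: roughly half the population is "health-conscious".
    health_conscious = random.random() < 0.5
    # Health-conscious people are far more likely to report eating whole grains...
    eats_whole_grains = random.random() < (0.8 if health_conscious else 0.2)
    # ...and they also exercise, don't smoke, etc., so they die less often.
    # Note that diet does not appear in this line at all:
    died = random.random() < (0.10 if health_conscious else 0.30)
    return eats_whole_grains, died

people = [simulate_person() for _ in range(100_000)]

def death_rate(group):
    return sum(died for _, died in group) / len(group)

eaters = [p for p in people if p[0]]
abstainers = [p for p in people if not p[0]]

# Prints roughly 0.14 vs. 0.26: the whole-grain eaters die far less often,
# even though, by construction, diet has zero causal effect here.
print(f"death rate, whole-grain eaters: {death_rate(eaters):.3f}")
print(f"death rate, abstainers:         {death_rate(abstainers):.3f}")
```

Forty years of follow-up won’t fix that; only measuring and adjusting for the confounder can, and you can’t adjust for a confounder you never thought of.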

And that doesn’t even take into account the fact that “corn” can mean a dozen different things to different people. Someone whose ancestors were indigenous to North and South America may digest corn differently than someone from Europe, Africa, or Asia. Different people cook corn differently–we don’t typically use the traditional method of treating it with lime (the mineral)–nixtamalization–which frees up the niacin and traditionally protected people from pellagra. We don’t all eat corn in the same combinations with other foods (look at the interaction between the calcium in milk and Vitamin D for one of the ways in which combining foods can complicate matters). And we aren’t necessarily even cooking the same “corn”: modern hybrid corns may not digest in exactly the same way as the corn people were growing a hundred or two hundred years ago.

Small differences are sometimes quite important, as we discovered when we realized that the artificially-created trans fats we’d stuck in our foods to replace saturated fats were damaging our hearts. Our bodies tried to use these fats like normal fats, but when we stuck them into our cell membranes, their wonky shapes (on a chemical level, the differences between kinds of fat mostly come down to molecular shape, and partial hydrogenation gives trans fats a different shape than the natural form) fucked up the structure of the cells they were in.

In short, this research is really hard, but I still encourage people to go do it and do it well.


Anyway, back on topic, here’s another quote from Wikipedia on the subject of using parasites to treat autoimmune disorders:

“While it is recognized that there is probably a genetic disposition in certain individuals for the development of autoimmune diseases, the rate of increase in incidence of autoimmune diseases is not a result of genetic changes in humans; the increased rate of autoimmune-related diseases in the industrialized world is occurring in too short a time to be explained in this way. There is evidence that one of the primary reasons for the increase in autoimmune diseases in industrialized nations is the significant change in environmental factors over the last century. …

Genetic research on the interleukin genes (IL genes) shows that helminths [certain kinds of parasites] have been a major selective force on a subset of these human genes. In other words, helminths have shaped the evolution of at least parts of the human immune system, especially the genes responsible for Crohn’s disease, ulcerative colitis, and celiac disease — and provides further evidence that it is the absence of parasites, and in particular helminths, that has likely caused a substantial portion of the increase in incidence of diseases of immune dysregulation and inflammation in industrialized countries in the last century. …

“Studies conducted on mice and rat models of colitis, multiple sclerosis, type 1 diabetes, and asthma have shown helminth-infected subjects to display protection from the disease.”


Right, so I’m curious if acne falls into this category, too.

The Marxist Meme-Plex as Cargo Cult of the Industrial Revolution

So I was thinking about Marxism, and how strange it is that it only ever really caught on in precisely the countries where its own theory said it shouldn’t, and never became very important domestically in the countries where it was supposed to emerge.

It’s kind of like if there were a bunch of people going around proclaiming “This is what Mexican culture is like,” only none of them were Mexican, and actual Mexicans wanted very little to do with it–you might suspect that the stuff being called “Mexican culture” wasn’t all that Mexican.

Only we’re talking about overthrowing the state and killing a bunch of people, rather than tacos and Cinco de Mayo.

Marx proclaimed that Communism (by which I mean Marxist-style communism–the kind Marx himself wrote about in his many works on the subject, which became the intellectual basis for the international communist movement that eventually triumphed in the USSR, China, Vietnam, Cuba, N. Korea, etc.) was supposed to be the natural outgrowth of capitalism itself in industrialized nations, but the list I just gave contains only barely-industrialized or practically feudal nations.

Marx was, of course, a mere mortal; one cannot expect anyone to write thousands of pages and get all of them right. Still, this is a pretty big oversight. A great deal of Marx’s theory rests on the belief that the form of the economic system dictates the culture and political system: that is, that capitalism forces people to act and organize in certain ways in order to feed the capitalist machine, and feudalism forces people to act and organize in certain other ways, in order to feed the feudal machine.

So for the capitalist, industrialized countries not to go Communist, while a bunch of non-capitalist, non-industrialized ones did, seems like a pretty big blow to the basics of the theory.

Kind of like if I had a theory that all noble gases were naturally magnetic, and all metals weren’t, and yet metal things kept sticking to my magnets and noble gases seemed relatively uninterested. I might eventually start thinking that maybe I was wrong.

Of course you can pick and choose your Marxism; you might like the idea of the “commodity fetish” while throwing out the rest of the bathwater. Have at it. But we are speaking here of believing both broadly and deeply enough in Marx’s theories to actually advocate overthrowing the state and murdering all the kulaks.

My own theory is that Marxism appealed to the wrong group of people precisely because they were the wrong group of people.

Actual scientists tend to have little interest in pseudoscience. Actual members of a culture don’t get excited by fake versions of their culture. And people with actual experience of industrial capitalism have little interest in Marxism.

In short, Marxism became a kind of myth among unindustrialized or barely-industrialized people about what would happen when the factories came, and so, believing the myth, they made it happen.

Marx had intended to create a “science”: describe patterns in his data and thereby make predictions about the future. When that future didn’t happen, the first reaction of his followers was to double down–the theory must not have worked because evil bad people were sabotaging it.

(If it was supposed to happen naturally, how could saboteurs stop it?)

Many people have accused Communism of being a religion–an atheistic religion, but a religion nonetheless. SSC wisely asks Is Everything a Religion?–since practically everything does get described as a religion, even Cargo Cult Programming.

Every worldview–every meme-plex, as I like to call them–involves certain beliefs about the world that help people make sense of the vast quantities of data we absorb every day and make predictions about the future. My observation of the sun rising leads me to believe there is a consistent pattern of “sun rises in morning” and that, therefore, the sun will rise tomorrow. “Science” itself contains many such beliefs.
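As an aside, this sort of sun-rise induction even has a classic quantitative form, Laplace’s rule of succession–my illustration here, nothing the surrounding argument depends on. If the sun has risen on all n mornings you’ve observed, starting from a uniform prior on its “propensity to rise,” the probability it rises tomorrow works out to (n + 1) / (n + 2):

```python
# Laplace's rule of succession: after observing n successes in n trials,
# with a uniform prior on the success probability, the posterior
# probability that the next trial also succeeds is (n + 1) / (n + 2).

def prob_next_sunrise(n_sunrises_observed: int) -> float:
    return (n_sunrises_observed + 1) / (n_sunrises_observed + 2)

for n in (1, 30, 10_000):
    print(f"{n:>6} sunrises seen -> P(sun rises tomorrow) = {prob_next_sunrise(n):.4f}")
```

The more mornings you’ve watched, the closer the prediction gets to certainty–which is all “the sun will rise tomorrow” really amounts to.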

Religions, like all other world views and meme-plexes, provide a way of organizing and understanding one’s observations about the world, generally through appeal to supernatural agents. (It rains because Zeus is peeing through a sieve; suffering exists because sin.)

The obvious reason belief systems get called religions is to insult them and suggest that they are irrational.

Of course, none of us is entirely rational; the idea that bags of rice that suddenly fell from the sky were the gift of the sky gods makes as much sense as any other if you have no other information on the subject. Scientists believe wrong and irrational things, too.

The critical difference is that science attempts to falsify itself–a theory cannot even be described as “scientific” if it cannot be falsified. All meme-plexes resist change, both because of human biases and because it’s probably a bad idea to try to re-formulate your beliefs about everything every time you happen across a single discordant datum, but science does attempt to disprove and discard bad theories over time–this is fundamentally what science is, and this is why I love science.

A faith, by contrast, is something one just believes, despite evidence to the contrary, or without any ability to disprove it. For the deeply faithful, the reaction to evidence that contradicts the theory is generally not, “Hrm, maybe the theory is wrong,” but, “We aren’t following the theory hard enough!”

The former leads to penicillin and airplanes; the latter leads to dead people.

Note: I feel compelled to add that not all faith leads to dead people. Faith in Communism certainly did, however.

Marxists failed to admit information that contradicted their theories; instead, they killed the people who pointed out the contradictions, as counter-revolutionaries.


Things that Hurt my Soul

Any version of “Scientists say XYZ, but science is changing all the time and they keep coming out with things they thought they knew that turned out to be wrong, so you never know!” just makes me want to scream. In this case, it was interjected into an explanation to a small child that you use your brain to think and control your bodily movements. Yes, yes, tell me some more about how someday we are totally going to discover that this whole “brain” thing was incorrect and we actually think with some other organ! Go on! No wait please don’t; it hurts my soul.

I wish people could tell the difference between “quantum entanglement implies that Einstein’s opinion on the EPR paradox might have been wrong,” and “we know nothing because scientists are dumbshits.”

BTW, every article ever written that starts out “Einstein wrong!” or “Scientists question Einstein’s legacy: legendary scientists might have been wrong!” is fucking shit and should be ashamed of itself. It’s like being all “LoL Newton was dumb and didn’t figure out relativity, so Newtonian dynamics is wrong and we shouldn’t teach it. Also, Evilution is fake!”