Homeschooling Corner: The Things We Played

I’m a really boring person who gets excited about finding math workbooks at the secondhand shop. I got lucky this week and snagged two math workbooks and one science workbook, plus Bedtime Math 2 at the library. Since new workbooks/manipulatives/materials can be pricey,* I’ve been keeping an eye out for good deals for, well, pretty much my kids’ whole lives. For example, a few years ago I found Hooked on Math ($45 on Amazon) at Goodwill for a couple of bucks, and I found some alphabet flashcards at a garage sale for 50 cents.

I’m also lucky to have several retired teachers in the family, so I’ve “inherited” a nice pile of teaching materials, from tangrams to fractions.

*That said, sometimes you need a particular workbook now, not whenever one shows up at the secondhand shop, so thankfully plenty of workbooks are actually pretty cheap.

But full “curriculums” can be pretty expensive–for example, Saxon Math plus manipulatives runs about $200; a Lifepac 4- or 5-subject curriculum is about $320; a Montessori math kit is $250; Horizons is $250. I have no idea whether these are worth the money or not.

So I’m glad I already have most of what I need (for now).

This week we started typing. (I went with the first website that came up when I searched for “typing tutor,” and so far it’s gone well.) We finished Bedtime Math and moved on to Bedtime Math 2. (We’re also working out of some regular old math books, as mentioned above.)

In science we’re still reading Professor Astro Cat’s Frontiers of Space (today we discussed eclipses), and we started Professor Astro Cat’s Intergalactic Workbook, which has been fun so far. It has activities based on space gloves, weightlessness, Russian phrases (used on the International Space Station), Morse code, etc.

(The gloves activity was difficult for the youngest child–in retrospect, one pair of gloves would have been sufficient. Eventually they got frustrated and started using their feet instead of their hands to complete the activities.)

Professor Astro Cat has therefore been the core of our activities this week.

To keep things light, I’ve interspersed some games like Trucky 3, Perplexus, and Fraction Formula. They’re also useful when one kid has finished an activity and another hasn’t, and I have to keep them occupied for a while.

Coding continues apace: we learned about loops this week.

Spelling is one of our weak points, so I want to do at least some spelling each day (today we spelled the planets’ names), but I’m not sure what the best approach is. English spelling is pretty weird.

Homeschooling Corner

Welcome! Highly unscientific polling has revealed an interest in a regular or semi-regular feature focused on homeschooling.

Note that I am NOT some homeschooling guru with years of experience. We are just beginning, so I want some other people to discuss things with. I don’t have a curriculum picked out or a coherent “philosophy,” but I am SO EXCITED about all of the things I have to teach that I couldn’t even list them all.

I was thinking of starting with just a focus on what has been successful this week–which books/websites/projects we liked–and perhaps what was unsuccessful. I invite all of you to come and share your thoughts, ideas, questions, philosophies, recommendations, etc. Parents whose kids are attending regular schools but want to talk about learning materials are also welcome.

One request: Please no knee-jerk bashing of public schools or teachers. (I just find this really annoying.) Thoughtful, well-reasoned critiques of mainstream schooling are fine, but let’s try to focus on the homeschooling.

This week’s successes:

DK Workbooks: Coding with Scratch (workbook) has been an amazing success.

Like many parents, I thought it’d be useful to learn some basic coding, but I had no idea where to start. I once read HTML for Dummies, but I don’t know my CSS from Perl, much less what’s best for kids.

After a bit of searching, I decided to try the DK Coding with Scratch series. (This particular workbook is aimed at kids 6-9 years old, but there are others in the series.)

Scratch is a free, simple, child-friendly coding program available online at https://scratch.mit.edu/. You don’t need the workbook to use Scratch (it’s just a helpful supplement). There are also lots of helpful YouTube videos for the enterprising young coder.
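
(Scratch programs are built by snapping together visual blocks, so they can’t really be quoted as text. As a rough analogy only–this is Python, not Scratch–here is the kind of “repeat” loop the workbook builds up to:)

```python
# A rough text analogy of a Scratch "repeat" block. Scratch itself is
# visual and drag-and-drop, so this Python sketch is only illustrative.
for step in range(10):        # like Scratch's "repeat 10" block
    print("move 10 steps")    # stands in for Scratch's "move 10 steps" block
```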

Note: my kids really want to code because they want to make their own video games.

In general, I have found that toys and games that claim they will teach your kids to code actually won’t (e.g., Robot Turtles). Some of these games are a ton of fun anyway; I just wouldn’t expect anyone to become a great coder that way.

Professor Astro Cat’s Frontiers of Space is as good as it looks. The target market is kids 8-11 years old. There’s a lot of information per page, so we’re reading and discussing a few pages each day.

There are two other books in the series: Professor Astro Cat’s Intergalactic Activity Book, which I’m hoping will make a good companion to this one, and Astro Cat’s Atomic Adventure, which looks like it fills the desperately needed “quantum physics for kids” niche.

I’m still trying to figure out how to do hands-on science activities without spending a bundle. Most of the “little labs” type science kits look fun, but they don’t pack a lot of educational bang for your buck. For example, today we built a compass (it cost $10 at the toy store, not the $205 someone is trying to charge on Amazon). This was fun and I really like the little model, but it also took about 5 minutes to snap the pieces together, and we can’t actually carry it around to use it like a real compass.

Plus, most of these labs are basically single-use items. I like toys with a sciency-theme, but they’re too expensive to run the whole science curriculum off of.

Oh, sure, I hand them a page of math problems and they start squawking at me like chickens. But bedtime rolls around and they’re like, “Where’s our Bedtime Math? Can’t we do one more page? One more problem? Please?”

There are only three math problems every other page (though this does add up to over 100 problems), the presentation is fun, and the kids like the book better than going to sleep.

The book offers easy, medium, and hard problems in each section, so it works for kids between the ages of about 4 and 10.

There’s an inherent tension in education between emphasizing subjects that kids are already good at and working on the ones they’re bad at. The former gives kids a chance to excel, build confidence, and of course actually get good at something, while the latter is often an annoying pain in the butt but nevertheless necessary.


Since we’ve just started and are still getting in the swing of things, I’m trying to focus primarily on the things they’re good at and enjoy and have just a little daily focus on the things they’re weak at.

I’d like to find a good typing tutor (I’ll probably be trying several out soon) because watching the kids hunt-and-peck at the keyboard makes my hair stand on end. I’d also like to find a good way to hold up workbooks next to the computer to make using the DK books easier.

That’s about it, so I’ll open the floor to you guys.

No, Graecopithecus does not prove humans evolved in Europe

Hello! We’re in the midst of a series of posts on recent exciting news in the field of human evolution:

  • Ancient hominins in the US?
  • Homo naledi
  • Homo flores
  • Humans evolved in Europe?
  • In two days, first H Sap was pushed back to 260,000 years,
  • then to 300,000 years!
  • Bell beaker paper

Today we’re discussing the much-publicized claim that scientists have discovered that humans evolved in Europe. (If you haven’t read last week’s post on Homo naledi and flores, I encourage you to do so first.) The way reporters have framed their headlines about the recent Graecopithecus freybergi findings is a story in itself:

The Telegraph proclaimed, “Europe was the birthplace of mankind, not Africa, scientists find,” Newsweek similarly trumpeted, “First Human Ancestor Came from Europe Not Africa,” and CBS News stated, “Controversial study suggests earliest humans lived in Europe – not Africa.”

The Conversation more prudently inquired, “Did humans evolve in Europe rather than Africa?” and New Scientist and the Washington Post, in a burst of knowing what a “human” is, stated, “Our common ancestor with chimps may be from Europe, not Africa” and “Ape that lived in Europe 7 million years ago could be human ancestor,” respectively.

This all occasioned some very annoying conversations along the lines of “White skin tone couldn’t possibly have evolved within the past 20,000 years because humans evolved in Europe! Don’t you know anything about science?”

Ohkay. Let’s step back a moment and take a look at what Graecopithecus is and what it isn’t.

This is Graecopithecus:

I think there is a second jawbone, but that’s basically it–and that’s not six teeth, that’s three teeth, shown from two different perspectives. There’s no skull, no shoulder blades, no pelvis, no legs.

Lucy
Naledi

By contrast, here are Lucy, the famous Australopithecus from Ethiopia, and a sample of the over 1,500 bones and pieces of Homo naledi recently recovered from a cave in South Africa.

Now, given what little scientists had to work with, the fact that they managed to figure out anything about Graecopithecus is quite impressive. The study, reasonably titled “Potential hominin affinities of Graecopithecus from the Late Miocene of Europe,” by Jochen Fuss, Nikolai Spassov, David R. Begun, and Madelaine Böhme, used μCT and 3D reconstructions of the jawbones and teeth to compare Graecopithecus’s teeth to those of other apes. They decided the teeth were different enough to distinguish Graecopithecus from the nearby but older Ouranopithecus, while looking more like hominin teeth:

G. freybergi uniquely shares p4 partial root fusion and a possible canine root reduction with this tribe and therefore, provides intriguing evidence of what could be the oldest known hominin.

My hat’s off to the authors, but not to all of the reporters who dressed up “teeth look kind of like hominin teeth” as “Humans evolved in Europe!”

First of all, you cannot make that kind of jump based off of two jawbones and a handful of teeth. Many of the hominin species we have recovered–such as Homo naledi and Homo floresiensis, as you know if you already read the previous post–possessed a mosaic of “ape-like” and “human-like” traits, e.g.:

The physical characteristics of H. naledi are described as having traits similar to the genus Australopithecus, mixed with traits more characteristic of the genus Homo, and traits not known in other hominin species. The skeletal anatomy displays plesiomorphic (“ancestral”) features found in the australopithecines and more apomorphic (“derived,” or traits arising separately from the ancestral state) features known from later hominins.[2]

Nebraska Man teeth compared to chimps, Homo erectus, and modern humans

If we only had six Homo naledi bones instead of 1,500 of them, we might be looking only at the parts that look like Australopithecus instead of the parts that look like H. erectus or totally novel. You simply cannot make that kind of claim off a couple of jawbones. You’re far too likely to be wrong, and then not only will you end up with egg on your face, but you’ll be giving more fuel to folks who like to proclaim that “Nebraska Man turned out to be a pig!”:

In February 1922, Harold Cook wrote to Dr. Henry Osborn to inform him of the tooth that he had had in his possession for some time. The tooth had been found years prior in the Upper Snake Creek beds of Nebraska along with other fossils typical of North America. … Osborn, along with Dr. William D. Matthew soon came to the conclusion that the tooth had belonged to an anthropoid ape. They then passed the tooth along to William K. Gregory and Dr. Milo Hellman who agreed that the tooth belonged to an anthropoid ape more closely related to humans than to other apes. Only a few months later, an article was published in Science announcing the discovery of a manlike ape in North America.[1] An illustration of H. haroldcookii was done by artist Amédée Forestier, who modeled the drawing on the proportions of “Pithecanthropus” (now Homo erectus), the “Java ape-man,” for the Illustrated London News. …

Examinations of the specimen continued, and the original describers continued to draw comparisons between Hesperopithecus and apes. Further field work on the site in the summers of 1925 and 1926 uncovered other parts of the skeleton. These discoveries revealed that the tooth was incorrectly identified. According to these discovered pieces, the tooth belonged neither to a man nor an ape, but to a fossil of an extinct species of peccary called Prosthennops serus.

That basically sums up everything I learned about human evolution in high school.

Second, “HUMANS” DID NOT EVOLVE 7 MILLION YEARS AGO.

Scientists define “humans” as members of the genus Homo, which emerged around 3 million years ago. These are the guys with funny names like Homo habilis, Homo neanderthalensis, and the embarrassingly named Homo erectus. The genus also includes ourselves, Homo sapiens, who emerged around 200-300,000 years ago.

Homo habilis descended from an Australopithecus, perhaps Lucy herself. Australopithecines are not in the Homo genus; they are not “human,” though they are more like us than modern chimps and bonobos are. They evolved around 4 million years ago.

The Australopithecines evolved, in turn, from even older apes, such as–maybe–Ardipithecus (4-6 million years ago) or Sahelanthropus tchadensis.

Regardless, humans didn’t evolve 7 million years ago. Sahelanthropus and even Lucy do not look like anyone you would call “human.” Humans have only been around for about 3 million years, and our own specific species is only about 300,000 years old. Even if Graecopithecus turns out to be the missing link–the true ancestor of both modern chimps and modern humans–that still does not change where humans evolved, because Graecopithecus narrowly missed being a human by 4 million years.

If you want to challenge the Out of Africa narrative, I think you’d do far better arguing for a multi-regional model of human evolution–one that includes back-migration of H. erectus into Africa and interbreeding with hominins there spurring the emergence of H. sapiens–than arguing about a 7-million-year-old jawbone. (I just made that up, by the way. It has no basis in anything I have read. But it at least has the right characters, in the right time frame, in a reasonable situation.)

Sorry this was a bit of a rant; I am just rather passionate about the subject. Next time we’ll examine very exciting news about Bushmen and Pygmy DNA!


Recent Exciting Developments: 130kya American Hominins?

There has been SO MUCH EXCITING NEWS out of paleoanthropology/genetics lately, it’s been a little tricky keeping up with it all. I’ve been holding off on commenting on some of the recent developments to give myself time to think them over, but here goes:

  1. Ancient hominins in the US?
  2. Homo naledi
  3. Homo flores
  4. Humans evolved in Europe?
  5. In two days, first H Sap was pushed back to 260,000 years,
  6. then to 300,000 years!
  7. Bell beaker paper

1. Back in May (2017), Holen et al. published an article in Nature, “A 130,000-year-old archaeological site in southern California, USA”:

Here we describe the Cerutti Mastodon (CM) site, an archaeological site from the early late Pleistocene epoch, where in situ hammerstones and stone anvils occur in spatio-temporal association with fragmentary remains of a single mastodon (Mammut americanum). The CM site contains spiral-fractured bone and molar fragments, indicating that breakage occurred while fresh. Several of these fragments also preserve evidence of percussion. The occurrence and distribution of bone, molar and stone refits suggest that breakage occurred at the site of burial. Five large cobbles (hammerstones and anvils) in the CM bone bed display use-wear and impact marks, and are hydraulically anomalous relative to the low-energy context of the enclosing sandy silt stratum. 230Th/U radiometric analysis of multiple bone specimens using diffusion–adsorption–decay dating models indicates a burial date of 130.7 ± 9.4 thousand years ago. These findings confirm the presence of an unidentified species of Homo at the CM site during the last interglacial period (MIS 5e; early late Pleistocene), indicating that humans with manual dexterity and the experiential knowledge to use hammerstones and anvils processed mastodon limb bones for marrow extraction and/or raw material for tool production.

Reconstruction of a Homo erectus woman, Smithsonian

Note that “Homo” here is probably not H. sapiens, but a related or ancestral species, like Denisovans or Homo erectus, because as far as we know, H. sapiens was still living in Africa at the time.

This is obviously a highly controversial claim. Heck, “earliest human presence in the Americas” was already controversial, with some folks firmly camped at 15,000 years ago and others camped around 40,000 years ago. 130,000 years ago wasn’t even on the table.

Unfortunately, the article is paywalled, so I can’t read the whole thing and answer simple questions like, “Did they test the thickness of mineral accumulation on the bones to see if the breaks/scratches are the same age as the bones themselves?” That is, minerals build up on the surfaces of old bones over time. If the breaks and scratches were made before the bones were buried, they’ll have the same amount of buildup as the rest of the bone surfaces. If the breaks are more recent–say, the result of a bulldozer accidentally backing over the bones–they won’t.

They did get an actual elephant skeleton and smacked it with rocks to see if it would break in the same ways as the mastodon skeleton. A truck rolling over a rib and a rock striking it at an angle are bound to produce different kinds and patterns of breakage (the truck is likely to do more crushing, the rock to leave percussive impacts). I’d also like to know if they compared the overall butchering pattern to known stone-tool-butchered elephants or mammoths, although I don’t know how easy it would be to find one.

Oldowan tool, about 2 million years old

They also looked at the pattern of impacts and shapes of the “hammerstones.” A rock which has been modified by humans hitting it with another rock will typically have certain shapes and patterns on its surface that can tell you things like which angle the rock was struck from during crafting. I’ve found a few arrowheads, and they are pretty distinct from other rocks.

Here’s a picture of an Oldowan stone chopper, about 2 million years old, which is therefore far older than these potential 130,000-year-old tools. Homo sapiens didn’t exist 2 million years ago; this pointy rock was probably wielded by a species such as Australopithecus garhi, H. habilis, or H. ergaster. Note that one side of this chopper is rounded, intended for holding comfortably in your hand, while the other side has had several chunks of rock smacked off, leaving concave surfaces. Often you can tell exactly where the stone tool was struck to remove a flake, based on the shape and angle of the surface and the pattern of concentric, circular lines radiating out from the impact spot.

Homo erectus, who lived after the Oldowan toolmakers and had a fancier, more complicated lithic technology, did make it out of Africa and spread across southeast Asia, up into China. This is, as far as I know, the first case of a hominin species using tools to significantly expand its range, but we have no evidence of erectus ever expanding into places that get significantly cold in the winter, and boat-building is a pretty advanced skill. We don’t even think erectus made it to Madagascar, which makes the idea of it sailing to the Americas rather doubtful.

I dislike passing judgment on the paper without reading it, but my basic instinct is skepticism. While I think the peopling of the Americas will ultimately turn out to be a longer, more complex, and more interesting process than the 15,000-years camp allows, 130,000 years is just too interesting a claim to believe without further evidence (like the bones of said hominins).

Still, I keep an open mind and await new findings.

(We’ll continue with part 2 next week.)

Do Chilblains Affect Blacks More than Whites?

toes afflicted with chilblains

While tromping through a blizzard, seeking insight into circum-polar peoples, I discovered a condition called chilblains. The relevant Wikipedia page is rather short:

Chilblains … is a medical condition that occurs when a predisposed individual is exposed to cold and humidity, causing tissue damage. It is often confused with frostbite and trench foot. Damage to capillary beds in the skin causes redness, itching, inflammation, and sometimes blisters. Chilblains can be reduced by keeping the feet and hands warm in cold weather, and avoiding extreme temperature changes. Chilblains can be idiopathic (spontaneous and unrelated to another disease), but may also be a manifestation of another serious medical condition that needs to be investigated.

The part they don’t mention is that it can really hurt.

The first HBD-related question I became interested in–after visiting a black friend’s house and observing that she was comfortable without the AC on, even though it was summer–is whether people from different latitudes prefer different temperatures. It seems pretty obvious: surely people from Yakutsk prefer different temperatures than people from Pakistan. It also seems easy to test: just put people in a room and give them complete control over the thermostat. And yet, I’d never heard anyone discuss the idea.

Anyway, the perfunctory Wikipedia page on chilblains mentioned nothing about racial or ethnic predisposition to the condition–even though the Eskimo (Inuit), who have genetic admixture from both ice-age Neanderthals and Denisovans, would seem an obvious group to check:

“Using this method, they found two regions with a strong signal of selection: (i) one region contains the cluster of FADS genes, involved in the metabolism of unsaturated fatty acids; (ii) the other region contains WARS2 and TBX15, located on chromosome 1.” …

“TBX15 plays a role in the differentiation of brown and brite adipocytes. Brown and brite adipocytes produce heat via lipid oxidation when stimulated by cold temperatures, making TBX15 a strong candidate gene for adaptation to life in the Arctic.” …

“The Inuit DNA sequence in this region matches very well with the Denisovan genome, and it is highly differentiated from other present-day human sequences, though we can’t discard the possibility that the variant was introduced from another archaic group whose genomes we haven’t sampled yet,” Dr. Racimo said.

The scientists found that the variant is present at low-to-intermediate frequencies throughout Eurasia, and at especially high frequencies in the Inuits and Native American populations, but almost absent in Africa.

Sub-Saharan Africans have their own archaic admixture, but they have very little to no ice-age hominin admixture–which is probably good for them, except for those who’ve moved further north.

Imagine my surprise upon searching and discovering very little research on whether chilblains disproportionately affect people of different races or ethnicities. If you were a dermatologist–or a genetically prone person–wouldn’t you want to know?

So here’s what I did find:

The National Athletic Trainers’ Association Position Statement on Cold Injuries notes:

Black individuals have been shown to be 2 to 4 times more likely than individuals from other racial groups to sustain cold injuries. These differences may be due to cold weather experience, but are likely due to anthropometric and body composition differences, including less-pronounced CIVD, increased sympathetic response to cold exposure, and thinner, longer digits.3,6

(I think CIVD = cold-induced vasodilation.)

The Military Surgeon: Journal of the Association of Military Surgeons of the United States, Volumes 36-37, states:

[scanned pages from the journal]

The text continues with descriptions of amputating rotting feet.

A PDF from the UK, titled “Cold Injury,” notes:

[chart of chilblain incidence from the PDF]

Notice that the incidence of chilblains is actually lower in extremely cold places than in moderately cold ones–attributed here to people in the coldest places being well-equipped for the cold.


Finally I found a PDF of a study performed, I believe, by the US Military, Epidemiology of US Army Cold Weather Injuries, 1980-1999:

[table of cold weather injury rates by race and sex]

While I would really prefer to have more ethnic groups included in the study, two will have to suffice. It looks like trench foot may be an equal-opportunity offender, but chilblains, frostbite, and other cold-related injuries strike black men (at least in the Army) at about 4x the rate of white men, and black women at about 2x the rate of white women (though women in the Army may not endure the same conditions as men).

On a related note, while researching this post, I came across this historic reference to infectious scurvy and diabetes, in the Journal of Tropical Medicine and Hygiene, Volumes 4-5 (published in 1902):

[excerpt from the journal]

Note: this is why it is important to discard bad theories after they’ve been disproven. Otherwise, you kill your scurvy victims by quarantining them instead of giving them oranges.

Tesla vs. Edison

... and fight!

It has become popular of late, especially on the left, to love Tesla and hate Edison. (Warning: that is a link to The Oatmeal, which is very funny and will suck up large quantities of your time if you let it, but if you aren’t familiar with the left’s hatred of Edison and valorization of Tesla, it’s a necessary read.)

Edison (1847-1931) was an American-born (son of a Canadian war refugee of Dutch descent) autodidact, inventor, and businessman who was awarded over a thousand patents. His most important inventions (or inventions produced by his lab) include the first actually useful lightbulb, the phonograph, the first movie camera and a device to view the movies on, the electrical grid necessary to power the lightbulb, the movie studio necessary to make the films for people to watch, and the scientific research lab.

He was friends with Henry Ford. He was also a community volunteer, deaf, and a general humanitarian who abhorred violence and prided himself on having never invented an offensive weapon.

His worst mistake appears to have been not realizing what business he was in during the “War of the Currents”: Edison thought he was in the lightbulb-selling business, and since he had invented a lightbulb that ran on DC, he wanted everyone to use DC. He also seems to have been genuinely concerned about the high voltages used by AC, but DC just drops off too quickly to be used in non-urban areas; to get the country electrified required AC. Edison not only lost the War of the Currents, but also got kicked out of the company he’d founded by his stockholders. The company’s name was later changed to General Electric.

His political views were fairly common for his day–he advocated the populist position on abolishing the gold standard, tax reform, and making loans interest-free to help farmers. Religiously, he was basically a GNON-believing deist. He preferred silent films over “talkies” due to being deaf, and had six children, three of whom went into science/inventing, one with a degree from Yale and one from MIT.

The idea that Edison was “merely” a businessman or CEO is complete bollocks. He was not only a brilliant inventor, but he also understood how his inventions would be used and created the systems–both human and mechanical–necessary to bring them to full fruition.

Edison’s lab in Menlo Park


Tesla (1856-1943) was a Serb born in Croatia back when Croatia was part of the Austrian Empire. By all accounts, he was exceedingly brilliant. His father was a priest and his mother was the daughter of a priest, but he received a scholarship to the Austrian Polytechnic University, where he burned like a meteor for his first year, earning the highest grades possible in 9 subjects (almost twice the required course load). In his second year, he became addicted to gambling; in year three, he gambled away his tuition money and forgot to study for his finals. He flunked out and ran away.

A couple of years later, his family raised money to send him to university again, which was another fiasco, since Tesla didn’t have training in two of the required subjects and so couldn’t actually attend.

Nevertheless, Tesla managed to get work at a telegraph company and was eventually invited to the US to work under Edison. Here he did excellent work, but he quit over a rather stupid-sounding misunderstanding about pay, wherein Tesla expected to be paid far more for an invention than Edison had in funds to pay anyone. Edison offered a raise instead, but Tesla decided to strike out on his own.

Tesla attempted to start a business, which ended badly (it sounds like it went south because he wasn’t focusing on the stated goals of the company) and left him a penniless ditch-digger.

He then hit on a series of successes, including the polyphase induction motor, which ended with him quite handsomely employed by one of Edison’s competitors, Westinghouse, but even here he had difficulties getting along with his co-workers. Eventually it seems he established his own lab and convinced investors to give him $100,000, which he promptly spent on more lab equipment instead of the new lighting system he’d promised. His lab was later sold and torn down to pay off debts.

Tesla received yet another major investment, $150,000 to build a wireless telegraph facility, but appears to have blown the money on stock market speculation. He did manage to finish the project, though without any more funds from his now very jaded investors, but eventually he had to sell the building, and it was demolished.

Many of Tesla’s inventions were clearly brilliant and far ahead of their time. Others were delusions, like his mechanical oscillator: Tesla claimed it nearly brought down the building; the Mythbusters built one themselves, and it did no such thing.

There is a kind of brilliance that slides easily into madness, and Tesla’s was clearly of this sort. He was too adept at pattern matching (he could do calculus in his head) to sort out real patterns from ones he’d dreamed up. He never married, but he once fell in love with a pigeon at the park, feeding it daily and spending over $2,000 on it when its wing was injured.

In his personal life, he was extremely rigid–working and eating at the exact same times every day, eating a very restricted diet, and wearing a fastidiously neat and regimented wardrobe. He was extremely thin and slept very little–perhaps only 2 hours a day. (There are vanishingly few people in the world who actually do function like this.) He was critical and harsh toward people who didn’t meet his standards, like fat people or secretaries whose clothes he thought were insufficiently attractive. Despite not having any children of his own, he believed the unfit should be sterilized and the rest of the population coerced into a selective breeding program. He also said some unflattering things about Edison upon the man’s death, which is kind of rude.

To prevent him from sinking further into poverty, his former employer, Westinghouse, took pity on him and started paying his hotel bills (Tesla seems never to have thought of living in a house). Tesla spent much of his final years claiming to have built a “Death Ray” and claiming that various thieves had broken into his hotel room to steal it.

Upon his death in 1943, the government seized all of his belongings just in case there were actual Death Rays or other such inventions in there that the Nazis might try to steal. The box with Tesla’s Death Ray turned out to have nothing more than an old battery inside. The investigator concluded:

“[Tesla’s] thoughts and efforts during at least the past 15 years were primarily of a speculative, philosophical, and somewhat promotional character often concerned with the production and wireless transmission of power; but did not include new, sound, workable principles or methods for realizing such results.”

To be frank, I’ve talked to homeless schizophrenics who sound a lot like Tesla; the line between correct pattern matching and incorrect pattern matching is, at times, easily crossed.


The modern habit of shitting on Edison and glorifying Tesla stems from the tendency to see Edison as a stereotypically American businessman who wickedly and cunningly stole ideas from smarter people to build up his own wealth and reputation. It feeds into the notion that Americans (white Americans, especially) have built nothing of their own, but stolen all of their wealth and a great many of their ideas from others. Here Tesla–attractive, urbane, brilliant, and most of all, not tainted by the blight of having been born in America–gets to stand in for the usual victimized classes.

Ironically, Edison’s political beliefs line up with the Progressives of his day–that is, socialists/liberals like Teddy Roosevelt and Woodrow Wilson. Tesla, at least as far as Wikipedia describes any of his beliefs, favored Nazi-style forced sterilization and eugenics. In daily life, Tesla may have been a nicer person than Edison (it is rather difficult to tell from Wikipedia articles what people were like personally), but I question a left that denigrates one of its own Progressives while upholding a man whose political beliefs are, at best, anathema to its own.

Regardless, Tesla’s failures were not Edison’s fault. Edison may have screwed him on pay, but he didn’t gamble away Tesla’s tuition money, make him fail his classes, or convince him not to marry. Edison didn’t make him blow his investment money on the stock market or wander around NYC at all hours of the night, feeding pigeons.

Edison, deaf since childhood, didn’t have half the advantages Tesla had handed to him. He had all of three months of schooling; no one ever sent him to university or gave him a scholarship to waste. He may not have been as smart as Tesla, but he was still an intensely intelligent man, fully capable of carrying out the business side of the operation, without which no research could get done. Without funding, you don’t have a lab; no lab, no research. Humans do not live in isolation; someone has to do the inglorious work of coordinating things so that other people can reap the benefits of a system set up for them to work in.

Ultimately, Tesla was a brilliant man who should not have been allowed to run his affairs. He needed the structure of a boss, a wife, parents, family, etc., to keep him on track and stop him from doing idiotic things like gambling away his tuition money.

Familial supervision during college could have ensured that he graduated and gotten him on the path toward a tenured position. Perhaps he would have rubbed shoulders with the likes of Einstein and Curie at the Solvay Conference. A boss would have ensured that the strategic, business ends of things–the ends Tesla had no great talent for–got done, leaving Tesla to do the things he did best, to reach far more of his full potential. (In this regard, Edison had advantages Tesla lacked–a wife, family, and a country he had grown up in.) But Tesla was too rigid to submit to someone of inferior intellect (real or perceived), and his family back in Europe was too far away to help him. Loneliness is madness, for humans are social animals, and so brilliant Tesla died alone, poor, and in love with a pigeon.

Tesla’s wireless telegraph tower, 1904

Just imagine what Edison and Tesla could have created had they put their animosity aside and worked together.

Part 2 coming soon.


Women in Science–the Bad Old Days

Once we were driving down the highway when my husband said, “Hey, a Federal Reserve Note just flew across the road.”

Me: I think you have been reading too many finance blogs.


Oh look, Silver Certificates:

Hey, did you know we still have two dollar bills?


These bills, from the so-called “Educational Series,” were printed in 1896 and feature, rather prominently, women. The $1 bill has Martha (and George) Washington. The other bills feature women as allegories of science, history, electricity, commerce, and manufacturing, and you know, I can’t really tell if the steam and electricity children are supposed to be male or female.

If someone wants to put women on money, I totally support bringing back these bills, because they’re gorgeous.

There’s a certain sadness in looking at these and thinking, “Gosh, maybe people in the 1800s really were smarter than us.” Today, the five-dollar bill would offend too many people (it has a breast on it!) and couldn’t get printed. We’ve become Philistines.

There’s also a sense of, “Wait, are you sure this is the bad old days of women’s oppression, when people thought women were dumb and couldn’t handle higher education and shit?” Why would people who think women are dumb use women to illustrate the very concept of “science”?

Here’s a painting of MIT’s Alma Mater (Latin for “nourishing mother”), finished in 1923:

[photo of the mural]

(Sorry it’s a crappy photo. I couldn’t find any good photos.)

“Alma Mater,” of course, is used synonymously with “university.” That is, the university itself (all universities) is female. From the description:

“The central panel is rigidly symmetrical, with the centrally enthroned Alma Mater approached by two groups of acolytes extending laurel wreaths. The composition deliberately recalls the tradition in Christian art of the ascending Madonna attended by saints and apostles. Alma Mater is surrounded by personifications of learning through the printed page, learning through experiment, and learning through the various branches of knowledge. They hover above the Charles River Basin, with a spectral hint of the MIT buildings in the background.”

Here’s a detail:

[detail of the mural]

Unfortunately, I haven’t found good photos of the side paintings, but they sound dramatic:

“The two side panels … bring the elevated scene down to earth with trees that appear to grow straight up from the floor. Unexplained spectral figures glide through this grove. … The right panel, which has been identified as Humanity Led by Knowledge and Invention, depicts a mother and children of varying ages progressing from Chaos to Light, accompanied by cherubs bearing the scales of Justice. On the left, the most dark and dramatic mural squarely faces the ethical challenge that has confronted science from the outset. The Latin inscription (from Genesis) in the roundel spells out: “Ye Shall Be As Gods, Knowing Good and Evil.” The lab-coated scientist is crowned by a figure said to be Hygieia (goddess of Health). He stands between two giant jars containing beneficent and malevolent gasses, symbolizing the constructive and destructive possibilities unleashed with every new discovery. With the horrors of the First World War still fresh, soldiers and diplomats gather at the Council table of the World. Dogs of war lurk near evil gasses, while Famine threatens the background. The strangely out-of-scale, dark colossal head within the shadow of the Tree of Knowledge is said to represent Nature; her relation to the rest of the drama is (perhaps deliberately) unclear.”

If you squint, you might be able to make them out:

[photo of the side murals]

Before art went to shit, the world was full of lovely paintings of things like “Liberty Leading the People” or “The Arts and Sciences,” using allegorical human forms that relied upon people’s understanding and knowledge of ancient Greek mythology–not so ancient when people were actually reading it. I suspect there are so few good photos of this painting because people forget, when surrounded by splendor, that splendor is no longer normal.

This habit of using women as allegorical figures to represent science and learning goes back hundreds, if not thousands of years:

These guys thought women were dumb?
12th century illustration of the Seven Liberal Arts: Grammar, Logic, Rhetoric, Arithmetic, Geometry, Music Theory, and Astronomy

The “Liberal Arts” did not originally refer to silly university classes, but to the knowledge thought essential to the education of all free (liber) people, in order to participate properly in civic life. These essential studies were Grammar, Logic, Rhetoric, Arithmetic, Geometry, Music Theory, and Astronomy (one may assume that the functional ability to read was considered a basic prerequisite for learning, not an endpoint in itself, as it is in our modern system). These studies all culminate in their purest expression in Philosophy, the very love of wisdom.

Notice that all of these allegorical figures are women. Did the depiction of women as the purest ideal of mathematical knowledge make male students doubt their own self-worth and drive them away from serious study?

Then why do people think the inverse?

The trend can be traced back further:

Botticelli, Primavera

Botticelli depicts Spring accompanied by the Greek Graces.

Raphael’s Parnassus

The Greek Muses were goddesses of inspiration for literature, science, and the arts. Different people list them differently (I doubt there was ever any widespread agreement on exactly what the muses represented), but the lists generally look like “epic poetry, history, music, poetry, tragedy, hymns, dance, comedy, and astronomy,” or “music, science, geography, mathematics, philosophy, art, drama, and inspiration.”

Sarcophagus of the Muses, Louvre

And who can forget Athena herself, goddess of wisdom and warfare?

[three statues of Athena]

(Take your artistic pick.)

Who needs Nobel Prize Winners, anyway?

You may have noticed that I like science. I also like scientists–heck, let’s expand this to most of STEM. Good folks.

Scientists tend to be quiet, unassuming folks who get on with the business of making the world a better place by curing cancer, inventing airplanes, and developing the germ theory of disease.

I don’t like it when political ideas try to dictate science. It was bad enough when the Soviet Union tried it. (And Maoist China–remember that exciting time when Mao declared that the concept of diminishing returns was a bourgeois capitalist lie and that just planting more seeds in your fields would result in more crops, and then millions of people died? Fun times!)

Sometimes scientists say or think unpopular things, like that humans evolved from apes or that some human populations have lower IQs than others. Or that women cry easily or that Global Warming is real.

The mature reaction to someone saying something you find offensive is to make a logical counter-argument. (Or, you know, ignore them.) Indeed, as I’ve said before, one of the beauties of science is that the whole point of it is to disprove incorrect ideas. If there’s an idea floating around in science that you don’t like, well, disprove it with science!

If you can’t, then maybe you’re the one who’s wrong.

Republicans have traditionally been the anti-science side: 49% don’t believe in evolution, versus 37% who do. Throwing Democrats and independents into the equation doesn’t help much–overall, 42% of Americans don’t believe in evolution, versus 50% who believe in some form of evolution (including god-directed evolution):

At least evolution is getting a tiny bit more popular
From Gallup

Unfortunately, a lot of those people who claim to believe in evolution don’t.

For example, according to Gallup (2005), the majority of Americans–68%–believe that men and women are equally good at math and science. Only 10% believe that men have an innate advantage in math and science, and 8% believe that women are superior.

Do you know how depressing this is? I mean, for starters, the question itself is badly worded. Men and women are about equal on average, but men are disproportionately represented at both the high end and the low end of mathematical ability. As I noted yesterday, this is a natural side effect of Y-chromosome variation. But for the purposes of doing math and science as a career, which takes rather more than average talent, men do have an innate advantage.
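
If you want to see the arithmetic behind that, here’s a minimal sketch in Python. Every number in it–the mean, the two spreads, the cutoff–is a made-up illustrative assumption, not a measurement; the point is only that identical averages plus slightly unequal variance produce very lopsided far tails:

```python
# Toy demonstration: equal means, unequal spreads -> lopsided tails.
# All numbers below are illustrative assumptions, not real data.
from math import erfc, sqrt

def fraction_above(cutoff, mean, sd):
    """Fraction of a normal(mean, sd) population above `cutoff`."""
    return 0.5 * erfc((cutoff - mean) / (sd * sqrt(2)))

MEAN = 100
SD_WIDE, SD_NARROW = 15, 13   # hypothetical spreads for the two groups
CUTOFF = 145                  # hypothetical "far tail" threshold

wide = fraction_above(CUTOFF, MEAN, SD_WIDE)
narrow = fraction_above(CUTOFF, MEAN, SD_NARROW)
print(f"wider-spread group above {CUTOFF}:    {wide:.4%}")   # ~0.13%
print(f"narrower-spread group above {CUTOFF}: {narrow:.4%}") # ~0.03%
print(f"ratio: {wide / narrow:.1f}x")                        # ~5x
```

The same arithmetic applies at the bottom tail, which is the other half of the claim.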

But instead of getting intelligent discussions about these sorts of things, we get people shouting insults and trying to ruin each other’s careers.

This popped up on FB today:

*winces*
Does this count as a microaggression?

“Sexist,” of course, is an insult, akin to saying that you hate women or believe that they are inherently inferior. So according to these people, anyone who thinks that, IDK, men are more aggressive on average because their brains produce more testosterone is a bad person. Never mind that science supports this notion pretty soundly.

(BTW, it’s pretty hard to argue that society’s anti-woman views are nefariously keeping women out of STEM when the majority of people think men and women are equally talented. For that matter, if there’s any group of people that I’ve found to be extremely accepting of and decent toward women, it’s the folks in STEM. Seriously, these guys are super awesome.)

So you may remember that whole kerfuffle in which Tim Hunt–some nobody who’s contributed nothing of worth to humanity except maybe Nobel Prize-winning work in Medicine/Physiology, small stuff on the scale of human achievement–made some comments about women in science, and the entire world spent about 5 minutes losing its collective shit, and then a lot of pictures of female scientists got posted on the internet. (Actually, the pictures are kind of nice.)

Oh, and Tim was forced to resign from some honorary professorship.

“The days that followed saw him unceremoniously hounded out of honorary positions at University College London (UCL), the Royal Society and the European Research Council (ERC).

“Under siege at his Hertfordshire home, he sank into despair.

“‘Tim sat on the sofa and started crying. Then I started crying,’ his wife, Professor Mary Collins (herself a prominent scientist) later recalled. ‘We just held on to each other.’”

When it came to light that Tim Hunt may have just been trying to make a joke–a bad one–the provost at his erstwhile university indicated that (in The Guardian’s words) “Professor Hunt would not be reinstated, it was impossible for an institution to tolerate someone to whom they had awarded an honorary post, even a 71-year-old Nobel prize winner, expressing views even in jest that so comprehensively undermined its own reputation as a leading supporter of female scientists.”

I am just thrilled, oh so thrilled, that university science departments now see their primary purpose as public works programs for women, rather than, IDK, the pursuit of actual fucking science.

Do you know what happens to your science department when you stop focusing on science and turn it into a pity-festival for women? You end up with a bunch of women who can’t hack it in science. Accept men and women on their merits, and you end up with quality scientists. Accept people based on their qualities other than merit, and you end up with hacks.

BTW, I’m female.

You might think Hunt’s comments were totally silly (in which case, go ahead and ignore them), but I’ve known couples that started in labs. I don’t think it’s any big secret that people sometimes fall in love with co-workers. Is this a problem? I don’t know. Do women cry more than men? Anecdotal experience says yes.

The intelligent response to Hunt’s comments (if you want to do anything at all) would have been to document whether or not women cry at a higher rate than men when you criticize their lab work and whether lab romances are a problem–and whether gender-segregated labs would actually work any better, or just end up with their own issues. The unintelligent response is to make a big deal out of how offended you are and try to get someone fired.

So what does Connie St Louis–the female science journalist who’s actually not a scientist (The Daily Mail claims that St Louis made up/faked a large chunk of her CV, if you believe anything the Daily Mail prints) and so probably has less experience running a lab than Hunt does, but never mind, we’re all experts now–have to say about starting the whole firestorm that made Hunt lose his probably not very important honorary position?

“The likes of Richard Dawkins and Brian Cox should focus on taking up the real issue of sexism in science. It is absurd to say that scientists can do and say what they like in the name of academic freedom.”

Let’s read that again. “It is absurd to say that scientists can do and say what they like in the name of academic freedom.”

What else does St Louis have to say?

“…eight Nobel laureates, plus the ubiquitous Richard Dawkins, have come out in support of Hunt. There are over 2,000 signatures on an online petition to reinstate him to his honorary post at UCL. Contrast this with 200+ signatures on a petition that I started calling on the Royal Society to elect its first female president. The Nobel eight made an idiotic attempt to equate the upset caused by Hunt’s ill advised and sexist comments with some kind of “chilling effect” on academics.”

Of course it has a chilling effect. No one wants to get fired. How does a journalist even presume to claim to know what does and doesn’t have a chilling effect on someone else’s profession, when rather respected people in that profession are claiming that chilling effects exist?

Hell, there’s a reason this blog is anonymous, and it’s people like Connie St Louis. But she continues:

“This is an absurd idea and deserves to be outed for what it is, a deeply cynical attempt to say that scientists can do and say what they like. In the name of academic freedom? Is science so special that any old sexist (or for that matter racist) words that they utter are allowed? The answer is and must be a resounding no.”

Free inquiry is dead.

Remember whom to thank when we all die of cancer plague.

Epigenetics

I remember when I first heard about epigenetics–the concept sounded awesome.

Now I cringe at the word.

To oversimplify, “epigenetics” refers to biological processes that help turn on and off specific parts of DNA. For example, while every cell in your body (except sperm and eggs, and I think red blood cells?) has identical DNA, they obviously do different stuff. Eyeball cells and brain cells and muscle cells are all coded from the exact same DNA, but epigenetic factors make sure you don’t end up with muscles wiggling around in your eye sockets–or as an undifferentiated mass of slime.

If external environmental things can have epigenetic effects, I’d expect cancer to be a biggie, due to cell division and differentiation being epigenetic.

What epigenetics probably doesn’t do is everything people want it to do.

There’s a history, here, of people really wanting genetics to do things it doesn’t–to impose free will onto it.* Lamarck can be forgiven–we didn’t know about DNA back then. His theory was that an organism can pass on characteristics that it acquired during its lifetime to its offspring, thus driving evolution. The classic example given is that if a giraffe stretches its neck to reach leaves high up in the trees, its descendants will be born with long necks. It’s not a bad theory for a guy born in the mid-1700s, but science has advanced a bit since then.

The USSR put substantial resources into trying to make environmental effects show up in one’s descendants–including shooting anyone who disagreed.

Trofim Lysenko, a Soviet agronomist, claimed to be able to make wheat that would grow in winter–and pass on the trait to its offspring–by exposing the wheat seeds to cold. Of course, if that actually worked, Europeans would have developed cold-weather wheat thousands of years ago.

Lysenko was essentially the USSR’s version of an Affirmative Action hire:

“By the late 1920s, the Soviet political leaders had given their support to Lysenko. This support was a consequence, in part, of policies put in place by the Communist Party to rapidly promote members of the proletariat into leadership positions in agriculture, science and industry. Party officials were looking for promising candidates with backgrounds similar to Lysenko’s: born of a peasant family, without formal academic training or affiliations to the academic community.” (From the Wikipedia page on Lysenko)

In 1940, Lysenko became director of the USSR Academy of Sciences’ Institute of Genetics–a position he would hold until 1964. In 1948, scientific dissent from Lysenkoism was formally outlawed.

“From 1934 to 1940, under Lysenko’s admonitions and with Stalin’s approval, many geneticists were executed (including Isaak Agol, Solomon Levit, Grigorii Levitskii, Georgii Karpechenko and Georgii Nadson) or sent to labor camps. The famous Soviet geneticist Nikolai Vavilov was arrested in 1940 and died in prison in 1943. Hermann Joseph Muller (and his teachings about genetics) was criticized as a bourgeois, capitalist, imperialist, and promoting fascism so he left the USSR, to return to the USA via Republican Spain.

In 1948, genetics was officially declared “a bourgeois pseudoscience”; all geneticists were fired from their jobs (some were also arrested), and all genetic research was discontinued.”  (From the Wikipedia page on Lysenkoism.)

Alas, the Wikipedia does not tell me if anyone died from Lysenkoism itself–say, after their crops failed–but I hear the USSR doesn’t have a great agricultural record.

Lysenko got kicked out in the ’60s, but his theories have returned in the form of SJW-inspired claims that the magic of epigenetics explains how any differences in average group performance or behavior are actually the fault of long-dead white people. E.g.:

Trauma May be Woven into DNA of Native Americans, by Mary Pember

“The science of epigenetics, literally “above the gene,” proposes that we pass along more than DNA in our genes; it suggests that our genes can carry memories of trauma experienced by our ancestors and can influence how we react to trauma and stress.”

That’s a bold statement. At least Pember is making Walker’s argument for him.

Of course, that’s not actually what epigenetics says, but I’ll get to that in a bit.

“The Academy of Pediatrics reports that the way genes work in our bodies determines neuroendocrine structure and is strongly influenced by experience.”

That’s an interesting source. While I am sure the A of P knows its stuff, their specialty is medical care for small children, not genetics. Why did Pember not use an authority on genetics?

Note: when thinking about whether or not to trust an article’s science claims, consider the sources they use. If they don’t cite a source or cite an unusual, obscure, or less-than-authoritative source, then there’s a good chance they are lying or cherry-picking data to make a claim that is not actually backed up by the bulk of findings in the field. Notice that Pember does not provide a link to the A of P’s report on the subject, nor provide any other information so that an interested reader can go read the full report.

Wikipedia is actually a decent source on most subjects. Not perfect, of course, but it is usually decent. If I were writing science articles for pay, I would have subscriptions to major science journals and devote part of my day to reading them, as that would be my job. Since I’m just a dude with a blog who doesn’t get paid and so can’t afford a lot of journal memberships and has to do a real job for most of the day, I use a lot of Wikipedia. Sorry.

Also, I just want to note that the structure of this sentence is really wonky. “The way genes work in our bodies”? As opposed to how they work outside of our bodies? Do I have a bunch of DNA running around building neurotransmitters in the carpet or something? Written properly, this sentence would read, “According to the A of P, genes determine neuroendocrine structures, in a process strongly influenced by experience.”

Pember continues:

“Trauma experienced by earlier generations can influence the structure of our genes, making them more likely to “switch on” negative responses to stress and trauma.”

Pember does not clarify whether she is continuing to cite from the A of P, or just giving her own opinions. The structure of the paragraph implies that this statement comes from the A of P, but again, no link to the original source is given, so I am hard pressed to figure out which it is.

At any rate, this doesn’t sound like something the A of P would say, because it is obviously and blatantly incorrect. Trauma *may* affect one’s epigenetic markers, but not the structure of one’s genes. The difference is rather large. Viruses and ionizing radiation can change the structure of your DNA, but “trauma” won’t.

“The now famous 1998 ACES study conducted by the Centers for Disease Control (CDC) and Kaiser Permanente showed that such adverse experiences could contribute to mental and physical illness.”

Um, no shit? Is this one of those cases of paying smart people tons of money to tell us that grass is green and the sky is blue? Also, that’s a really funny definition of “famous.” It looks like the author is trying to claim her sources have more authority than they actually do.

“Folks in Indian country wonder what took science so long to catch up with traditional Native knowledge.”

I’m pretty sure practically everyone already knew this.

“According to Bitsoi, epigenetics is beginning to uncover scientific proof that intergenerational trauma is real. Historical trauma, therefore, can be seen as a contributing cause in the development of illnesses such as PTSD, depression and type 2 diabetes.”

Okay, do you know what epigenetics actually shows?

The experiment Wikipedia cites involved male mice trained to fear a certain smell via small electric shocks delivered whenever they smelled it. The offspring of these mice, conceived after the foot-shocking was finished, startled in response to the same smell–they had inherited their fathers’ epigenetic markers, which enhanced their response to that specific odor.

It’s a big jump from “mice startle at smells” to “causes PTSD”–a particularly big jump for two reasons:

1. Your epigenetics change all the time. It’s like learning. You don’t just learn one thing and then have this one thing you’ve learned stuck in your head for the entire rest of your life, unable to learn anything new. Your epigenetics change in response to life circumstances throughout your entire life.

Eg, (from the Wikipedia):

“One of the first high-throughput studies of epigenetic differences between monozygotic twins focused in comparing global and locus-specific changes in DNA methylation and histone modifications in a sample of 40 monozygotic twin pairs. In this case, only healthy twin pairs were studied, but a wide range of ages was represented, between 3 and 74 years. One of the major conclusions from this study was that there is an age-dependent accumulation of epigenetic differences between the two siblings of twin pairs. This accumulation suggests the existence of epigenetic “drift.””

In other words, when identical twins are babies, they have very similar epigenetics. As they get older, their epigenetics get more and more different because they have had different experiences out in the world, and their experiences have changed their epigenetics. Your epigenetics change as you age.

Which means that the chances of the exact same epigenetics being passed down from father to child over many generations are essentially zilch.
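
If it helps to see the idea in code, here’s a toy simulation of that drift. This is a minimal sketch, not real biology: the number of methylation sites and the per-year flip rate are numbers I made up purely for illustration.

```python
import random

# Toy model of epigenetic "drift." Purely illustrative:
# N_SITES and FLIP_RATE are made-up numbers, not measured biology.
N_SITES = 10_000   # hypothetical methylation sites tracked per person
FLIP_RATE = 0.002  # hypothetical chance a given site flips in a given year

def drift(sites, years):
    """Return a copy of `sites` after `years` of random, independent flips."""
    sites = list(sites)
    for _ in range(years):
        for i in range(len(sites)):
            if random.random() < FLIP_RATE:
                sites[i] = 1 - sites[i]
    return sites

# Identical twins start with identical marks...
at_birth = [0] * N_SITES
twin_a = drift(at_birth, 74)
twin_b = drift(at_birth, 74)

# ...but by age 74, independent life experiences have driven them apart.
diverged = sum(a != b for a, b in zip(twin_a, twin_b))
print(f"Sites differing at age 74: {diverged} of {N_SITES}")
```

Run it a few times: the twins always start identical and always end up different. Marks that churn like that are a lousy channel for faithfully transmitting anything across many generations–which is the point.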

2. Tons of populations have experienced trauma. If you go back far enough in anyone’s family tree, you can probably find someone who has experienced trauma. My grandparents went through trauma during the Great Depression and WWII. My biological parents were both traumatized as children. So have millions, perhaps billions of other people on this earth. If trauma gets encoded in people’s DNA (or their epigenetics,) then it’s encoded in virtually every person on the face of this planet.

Type 2 Diabetes, Depression, and PTSD are not evenly distributed across the planet. Hell, they aren’t even consistently elevated in peoples who have suffered recent, large-scale oppression. African Americans have low levels of depression and commit suicide at much lower rates than whites–have white Americans suffered more oppression than black Americans? Whites commit suicide at a higher rate than American Indians–have whites suffered more historical trauma? On a global scale, Israel has a relatively low suicide rate–lower than India’s. Did India recently experience some tragedy worse than the Holocaust? (See yesterday’s post for all stats.)

Type 2 Diabetes reaches its global maximum in Saudi Arabia, Oman, and the UAE, which as far as I know have not been particularly traumatized lately, and is much lower among Holocaust descendants in nearby Israel:

[Map of global diabetes rates, from a BBC article on obesity.]

It’s also very low in Sub-Saharan Africa, even though all of the stuff that causes “intergenerational trauma” probably happened there in spades. Have Americans been traumatized more than the Congolese?

This map doesn’t make any sense from the POV of historical trauma. It makes perfect sense if you know who’s eating fatty Western diets they aren’t adapted to. Saudi Arabia and the UAE are fucking rich (I bet Oman is, too,) and their population of formerly nomadic goat herders has settled down to eat all the cake they want. The nomadic lifestyle did not equip them to digest lots of refined grains, which are hard to grow in the desert. Most of Africa (and Yemen) is too poor to gorge on enough food to get Type-2 Diabetes; China and Mongolia have stuck to their traditional diets, to which they are well adapted. Mexicans are probably not adapted to wheat. The former Soviet countries have probably adopted Western diets. Etc., etc.

Why bring up Type-2 Diabetes at all? Well, it appears Indians get Type-2 Diabetes at about the same rate as Mexicans, [Note: PDF] probably for the exact same reasons: their ancestors didn’t eat a lot of wheat, refined sugar, and refined fats, and so they aren’t adapted to the Western diet. (FWIW, White Americans aren’t all that well adapted to the Western Diet, either.)

Everybody who isn’t adapted to the Western Diet gets high rates of diabetes and obesity if they start eating it, whether they had historical trauma or not. We don’t need epigenetic trauma to explain this.

“The researchers found that Native peoples have high rates of ACE’s and health problems such as posttraumatic stress, depression and substance abuse, diabetes all linked with methylation of genes regulating the body’s response to stress. “The persistence of stress associated with discrimination and historical trauma converges to add immeasurably to these challenges,” the researchers wrote.

Since there is a dearth of studies examining these findings, the researchers stated they were unable to conclude a direct cause between epigenetics and high rates of certain diseases among Native Americans.”

There’s a dearth of studies because it’s really immoral to purposefully traumatize humans and then breed them to see if their kids come out fucked up. Luckily for us (or not so luckily, depending on how you look at it,) humans have been traumatizing each other for ages, so we can just look at actually traumatized populations. There does seem to be an effect down the road for people whose parents or grandparents went through famines, but “the effects could last for two generations.”

As horrible as the treatment of the Indians has been, I am pretty sure they didn’t go through a famine two generations ago on the order of what happened when the Nazis occupied the Netherlands and 18,000-22,000 people starved.

In other words, there’s no evidence of any long-term epigenetic effects large enough to create the effects they’re claiming. As I’ve said, if epigenetics actually acted like that, virtually everyone on earth would show the effects.

The reason they don’t is that epigenetic effects are relatively short-lived. Your epigenetics get re-written throughout your lifetime.

“Researchers such as Shannon Sullivan, professor of philosophy at UNC Charlotte, suggests in her article “Inheriting Racist Disparities in Health: Epigenetics and the Transgenerational Effects of White Racism,” that the science has faint echoes of eugenics, the social movement claiming to improve genetic features of humans through selective breeding and sterilization.”

I’m glad the philosophers are weighing in on science. I am sure philosophers know all about genetics. Hey, remember what I said about citing sources that are actual authorities on the subject at hand? My cousin Bob has all sorts of things to say about epigenetics, but that doesn’t mean his opinions are worth sharing.

The article ends:

“Isolating and nurturing a resilience gene may well be on the horizon.”

How do you nurture a gene?

 

There are things that epigenetics do. Just not the things people want them to do.

Scientific Nostalgia

So, I hear the Brontosaurus might return to the rolls of official dinosaurs, rather than oopsies. From Yale mag’s “The Brontosaurus is Back“:

“Originally discovered and named by Yale paleontologist O. C. Marsh, Class of 1860, the “thunder lizard” was later determined to be the same as the Apatosaurus. But European researchers recently reexamined existing fossils and decided that Brontosaurus is in fact a separate species.”

Well, these things happen. I’m glad scientists are willing to revisit their data and revise their assumptions. Of course, I have no idea how much morphological difference is necessary between two skeletons before we start calling them different species, (by any sane metric, would a wolfhound and a chihuahua be considered the same species?) but I’m willing to trust the paleontologists on this one.

The interesting thing isn’t the reclassification itself, which gets down to somewhat dry and technical details about bone sizes and whatnot, but the fact that people–myself included!–have some sort of reaction to this news, eg:

Dinosaur lovers of a certain age are gratified. “I’m delighted,” says geology professor Jacques Gauthier, the Peabody’s curator of vertebrate paleontology and vertebrate zoology. “It’s what I learned as a kid.”

I’ve seen other people saying the same thing. Those of us who grew up with picture books with brontosauruses in them are happy at the news that the brontosaurus is back–like finding an old friend again, or episodes of your favorite childhood show on YouTube. Perhaps you think, “Yes, now I can get a book of dinosaurs for my kids and share the animals I loved with them!”

Meanwhile, some of us still cling to the notion that Pluto, despite its tiny size and eccentric orbit, really ought to be a planet. Even I feel a touch of anthropomorphizing pity for Pluto, though from an objective POV I think the current classification scheme is perfectly sensible.

Pluto is not the first round, rocky body to get named a planet and then demoted: in 1801, Giuseppe Piazzi discovered Ceres, a small, round, rocky body orbiting between Mars and Jupiter.

Finding a planet between Mars and Jupiter was intellectually satisfying on a number of levels, not least of which that it really seems like there ought to be one there. For the next 50 years, Ceres made it into the textbooks as our fifth planet–but by the 1860s, it had been demoted. A host of other, smaller bodies–some of them roundish–had also been discovered orbiting between Mars and Jupiter, and it was now clear that these were a special group of space bodies. They all got named asteroids, and Ceres went down the memory hole.

Ceres is smaller than Pluto, but the two have much in common. As scientists discovered more small, Pluto-like bodies beyond Neptune’s orbit, the question of what counts as a planet revived. Should all non-moon, round bodies (those with enough gravity to pull themselves round) be planets? That gets us to at least 13 planets, but possibly dozens–or hundreds–more.

There’s an obvious problem with having hundreds of planets, most of which are minuscule: kids would never learn ’em all. When you get right down to it, there are thousands of rocks and balls of ice and other such things zooming around the sun, and there’s a good reason most of them are known by numbers instead of names. You’ve got to prioritize somehow, and some sort of definition that would cut out the tiniest round ones was needed. Tiny Pluto, alas, ended up on the wrong side of the definition: not a planet.
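
For the curious, the definition astronomers settled on in 2006 boils down to a three-part checklist, something like this sketch. (One hedge: I’ve simplified “cleared its orbit” to a yes/no flag here; real astronomers use quantitative measures to decide that part.)

```python
# Sketch of the 2006 IAU planet definition as a checklist.
# Simplification: "cleared its orbit" is a boolean here; astronomers
# actually compute quantitative discriminants to decide it.

def classify(orbits_sun: bool, is_round: bool, cleared_orbit: bool) -> str:
    if not orbits_sun:
        return "not a planet (a moon, or not in our solar system)"
    if not is_round:
        return "small solar-system body"
    if not cleared_orbit:
        return "dwarf planet"
    return "planet"

print(classify(True, True, True))    # Earth: planet
print(classify(True, True, False))   # Pluto, Ceres: dwarf planet
print(classify(True, False, False))  # most asteroids: small solar-system body
```

Earth passes all three tests; Pluto and Ceres fail the orbit-clearing one, which is how they both ended up in the “dwarf planet” bin.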

Pluto is, of course, completely unaffected by a minor change in human nomenclature. And someday, like Ceres, Pluto may be largely forgotten by the public at large. In the meantime, there will still be nostalgia for the friendly science of one’s childhood.