Anthropology Friday: Crackers pt 2

[Map from JayMan's post on the American Nations]

I am frequently frustrated by our culture’s lack of good ethnonyms. Take “Hispanic.” It just means “someone who speaks Spanish or whose ancestors spoke Spanish.” It includes everyone from Lebanese-Mexican billionaire Carlos Slim to Japanese-Peruvian Alberto Fujimori, from Sephardi Jews to native Bolivians, from white Argentinians to black Cubans, but doesn’t include Brazilians because speaking Portuguese instead of Spanish is a really critical ethnic difference.*

*In conversation, most people use “Hispanic” to mean “Mexican or Central American who’s at least partially Native American,” but the legal definition is what colleges and government agencies are using when determining who gets affirmative action. People think “Oh, those programs are to help poor, brown people,” when in reality the beneficiaries are mostly well-off and light-skinned–people who were well-off back in their home countries.

This is the danger of using euphemisms instead of saying what you actually mean.

Our ethnonyms for other groups are equally terrible. All non-whites are often lumped together under a single “POC” label, as though Nigerian Igbo and Han Chinese were totally equivalent and fungible peoples. Whites are similarly lumped, as if a poor white from the backwoods of Georgia and a wealthy Boston Puritan had anything in common. There are technical names for these groups, used in historical or academic contexts, but if you tell the average person you hail from a mix of “Cavalier-Yeoman and Cracker ancestors,” they’re just going to be confused.

[Map of the American Nations]

With the exception of Cajuns and recent immigrants who retain an old-world ethnic identity (e.g., Irish, Jewish), we simply lack common vernacular ethnonyms for the different white groups that settled the US–even though they are actually different.

The map at left comes from Colin Woodard’s American Nations: A History of the 11 Rival Regional Cultures of North America. 

As Woodard himself has noted, DNA studies have confirmed his map to an amazing degree.

American ethnic groups are not just Old World ethnic groups that happen to live in America. They’re real ethnicities that have developed over here during the past 500 years, but we have failed to adopt common names for them.

Woodard’s map implies a level of ethnic separation that is probably not entirely accurate, as these groups settled the American frontier in waves, creating layers of ethnicity that are thicker or thinner in different places. Today, we call these social classes, which is not entirely inaccurate.

Take the South. The area is dominated by two main ethnic blocks, Appalachians (in the mountains) and Cavalier-Plantation owners in the flatter areas. But the Cavalier area was never majority wealthy, elite plantation owners; it has always had a large contingent of middling-class whites, poor whites, and of course poor blacks. In areas of the "Deep South" where soils were poor or otherwise unsuited to cultivation, elite planters never penetrated, leaving the heartier backwoods whites–the Crackers–to their own devices.

If their ancestors spoke French, we recognize them as different, but if not, they’re just “poor”–or worse, “trash.”

Southern identity is a curious thing. Though I was born in the South (and my ancestors have lived there for over 400 years,) I have no meaningful “Southern identity” to speak of–nor do, I think, most southerners. It’s just a place; the core historical event of going to war to protect the interests of rich elites in perpetuating slavery doesn’t seem to resonate with most people I’ve met.

My interest in the region and its peoples stems not from Southern Pride, but the conventional curiosity adoptees tend to feel about their birth families: Where did I come from? What were they like? Were they good people? and Can I find a place where I feel comfortable and fit in? (No.)

My immediate biological family hails from parts of the South that never had any plantations (I had ancestors in Georgia in the 1800s, and ancestors in Virginia in the 1700s, but they’ve been dead for a while; my father lives within walking distance of his great-grandparent’s homestead.)

[Dust Storm, Tulsa, Oklahoma, 1935. "This was a bad idea." –Grandma]

As previously discussed, I don’t exactly feel at home in cities;  perhaps this is because calling my ancestors “farmers” is a rather generous description for folks who thought it was a good idea to move to Oklahoma during the Dust Bowl.

(By the way, the only reason the prairies are consistently farmed today is due to irrigation, drawing water up from the Ogallala and other aquifers, and we are drawing water from those aquifers much faster than it is being replenished. If we keep using water at this rate–or faster, due to population growth–WE WILL RUN OUT. The prairies will go dry and dust storms will rage again.)

To be fair, some of my kin were successful farmers when it actually rained, but some were never so sedentary. Pastoralists, ranchers, hoe-farmers–they were the sorts of people who settled frontiers and moved on when places got too crowded, who drank hard and didn’t always raise their children. They match pretty closely Richard Sapp’s description of the Florida Crackers.


From a genetic standpoint, the Crackers are either descended from borderlanders and Scotch-Irish (the pink region on the map at the top of the post,) or from folks who got along well with borderlanders and decided to move alongside them. I find it amazing that a relatively small place like Britain could produce such temperamentally different peoples as Puritans and Crackers–the former hard working, domesticated, stiff, and proper; the latter loud, liberty-loving, and more violent.

Peter Frost (evo and proud) has a theory that “core” Europe managed to decrease its homicide rates by executing criminals, thus removing them from the gene pool; the borderlands of Scotland and Ireland were perhaps beyond the reach of the hangman’s noose, or hopping the border allowed criminals to escape the police.

[Map from HBD Chick's big summary post on the Hajnal Line]

HBD Chick’s work focuses primarily on the effects of manorialism and outbreeding within the Hajnal line. Of the Crackers, she writes:

“The third American Revolution reached its climax in the years from 1779 to 1781. This was a rising of British borderers in the southern backcountry against American loyalists and British regulars who invaded the region. The result was a savage struggle which resembled many earlier conflicts in North Britain, with much family feuding and terrible atrocities committed on both sides. Prisoners were slaughtered, homes were burned, women were raped and even small children were put to the sword.” …

i’ve got a couple of posts related to those rambunctious folks from the backcountry whose ancestors came from the borderlands between england and scotland. libertarian crackers takes a quick look at why this group tends to love being independent and is distrustful of big gubmint — to make a long story short, the border folks married closely for much longer than the southern english — and they didn’t experience much manorialism, either (the lowland scots did, but not so much the border groups). did i mention that they’re a bit hot-headed? (not that there’s anything wrong with that! (~_^) ) see also: hatfields and mccoys. not surprising that this group’s war of independence involved “much family feuding.”

Less manorialism, less government control, less executing criminals, more cousin-marriage, more clannishness.

And the differences here aren’t merely cultural. As Nisbett and Cohen found (PDF; h/t HBD Chick):

During the experiment, a confederate bumped some subjects and muttered ‘asshole’ at them. Cortisol (a stress hormone) and testosterone (rises in preparation for violence) were measured before and after the insult. Insulted Southerners showed big jumps in both cortisol and testosterone compared to uninsulted Southerners and insulted Northerners. The difference in psychological and physiological responses to insults was manifest in behavior. Nisbett and Cohen recruited a 6’3” 250 lb (190 cm, 115 kg) American style football player whose task was to walk down the middle of a narrow hall as subjects came the other direction. The experimenters measured how close subjects came to the football player before stepping aside. Northerners stepped aside at around 6 feet regardless of whether they had been insulted. Un-insulted Southerners stepped aside at an average distance of 9 feet, whereas insulted Southerners approached to an average of about 3 feet. Polite but prepared to be violent, un-insulted Southerners take more care, presumably because they attribute a sense of honor to the football player and are normally respectful of others’ honor. When their honor is challenged, they are prepared and willing to challenge someone at considerable risk to their own safety.”

It’s genetic.

(The bit about honor is… not right. I witnessed a lot of football games as a child, and no one ever referred to the players as “honorable.” Southerners just don’t like to get close to each other, which is very sensible if people in your area get aggressive and angry easily. The South also has a lower population density than the North, so people are used to more space.)

As my grandmother says, “You don’t get to pick your ancestors.” I don’t know what I would think of my relatives had I actually grown up with them. They have their sins, like everyone else. But from a distance, as an adult, they’re fine people and they always have entertaining stories.

“Oh, yes, yet another time I almost died…”

As for racial attitudes, if you’re curious, they vary between “probably marched for Civil Rights back in the 50s” and “has never spoken a word, good or bad, generalizing about any ethnic group.” (I have met vocally anti-black people in the South; just not in my family.) I think my relatives are more interested in various strains of Charismatic Christianity than race.

It seems rather unfortunate that Southern identity is so heavily linked to the historical interests of the Plantation Elites. After all, it did the poor whites no good to die in a war fought to protect the interests of the rich. I think the desire to take pride in your ancestors and group is normal, healthy, and instinctive, but Southerners are in an unfortunate place where that identity is heavily infused with a racial ideology most Southerners don’t even agree with.

> Be white
> Be from the south
> Not into Confederacy
> Want an identity of some sort

> Now what?

In my case, I identify with nerds. This past is not an active source of ethnic identity, nor is the Cracker lifestyle even practical in the modern day. But my ancestors have still contributed (mostly genetically) to who I am.

Well, this was going to just be an introduction to today’s anthropology selection, but it turned out rather longer than expected, so let’s just save the real anthropology for next week.


The Unbearable Whiteness of Elizabeth Warren

I almost feel sad for Senator Warren. One day, a little girl looked in the mirror, saw pale skin, brown hair, and blue eyes looking back at her, and thought, “No. This can’t be right. This isn’t me.”

So she found a new identity, based on a family legend–a legend shared by a suspicious number of white people–that one of her ancestors was an American Indian.

[Elizabeth Warren changed her race at Penn: source]

This new identity conveyed certain advantages: Harvard Law claimed her as a Native American to boost claims of racial diversity among the faculty:

A majority [83%] of Harvard Law School students are unhappy with the level of representation of women and minorities on the Law School faculty, according to a recent survey. …

Law students said they want to learn from a variety of perspectives and approaches to the law. “A black male from a lower socioeconomic background will approach the study of constitutional law in a different way from a white upper-class male,” Reyes said. …

Of 71 current Law School professors and assistant professors, 11 are women, five are black, one is Native American and one is Hispanic, said Mike Chmura, spokesperson for the Law School.

Although the conventional wisdom among students and faculty is that the Law School faculty includes no minority women, Chmura said Professor of Law Elizabeth Warren is Native American.

In response to criticism of the current administration, Chmura pointed to “good progress in recent years.”

As did Penn:

The University of Pennsylvania chose not to tout in the press their newly minted Native American professor. But her minority status was duly noted: The university’s Minority Equity Report, published in April 2005, shows that Warren won a teaching award in 1994. Her name is in bold and italicized to indicate she was a minority. …

The law school was happy to have her count as a diversity statistic, however, and for at least three of the years she taught there — 1991, 1992, and 1994 — an internal publication drawing on statistics from the university’s federal affirmative action report listed one Native American female professor in the university’s law school.

Warren’s Native American identity may have played no role in her hiring (the committees involved appear not to have known or cared about her identity,) but it seems to have been important to Warren herself. As her relatives aged and died, and she moved away from her childhood home in Oklahoma and then Texas, she was faced with that persistent question: Who am I?

The truth, that she was a white woman from a working-class family in Oklahoma, apparently wasn't enough for Elizabeth. (Oklahoma doesn't carry many status points over in East Coast academic institutions.)

Each of us is the sum of many things, including the stories our families tell us and genetic contributions from all of our ancestors–not just the interesting ones (within a limit: after enough generations, each individual ancestor's expected contribution becomes so small that it may not be passed on in reproduction at all.)
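As a rough illustration of how fast that dilution happens (a back-of-the-envelope sketch of my own, not anything from anyone's actual genealogy): the expected autosomal contribution of a single ancestor n generations back is about (1/2)^n, and because DNA is passed on in a limited number of recombined chunks, a sufficiently distant ancestor often contributes nothing detectable at all.

```python
# Expected share of your autosomal DNA from one ancestor n generations back.
# This is just (1/2)**n; actual inheritance is lumpier, because chromosomes
# are passed on in a limited number of recombined segments, so for distant
# ancestors the real figure is frequently zero.
for n in (2, 5, 8, 10):
    print(f"{n} generations back: expected share ~ {0.5 ** n:.4%}")
```

Ten generations back the expected share is already under 0.1 percent, roughly the size of the "smidge" a DNA test may or may not detect.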

I have also done the 23andMe thing, and found that I hail from something like 20 different ethnic groups–including, like Warren, a little smidge of Native American. But none of those groups make up the majority of my DNA. All of them are me; none of them are me. I just am.

Warren’s announcement of her DNA findings vindicated her claim to a Native American ancestor and simultaneously unveiled the absurdity of her claim to be a Native American. What should have been a set of family tales told to friends and passed on to children and grandchildren about a distant ancestor became a matter of national debate that the Cherokee Nation itself felt compelled to weigh in on:

Using a DNA test to lay claim to any connection to the Cherokee Nation or any tribal nation, even vaguely, is inappropriate and wrong. It makes a mockery out of DNA tests and its legitimate uses while also dishonoring legitimate tribal governments and their citizens, whose ancestors are well documented and whose heritage is proven. Senator Warren is undermining tribal interests with her continued claims of tribal heritage.

Like them or not, the Cherokee have rules about who is and isn’t a Cherokee, because being Cherokee conveys certain benefits–for example, the tribe builds houses for members and helps them look for jobs. This is why conflicts arise over matters like whether the Cherokee Freedmen are official members. When membership in a group conveys benefits, the borders of that group will be policed–and claims like Warren’s, no matter how innocently intended, will be perceived as an attempt at stealing something not meant for her.

Note: I am not saying this kind of group border policing is legitimate. Many “official” Cherokee have about as much actual Cherokee blood in them as Elizabeth Warren, but they have a documented ancestor on the Dawes Rolls, so they qualify and she doesn’t. Border policing is just what happens when there are benefits associated with being part of a group.

I don’t have an issue with Warren’s own self-identity. After all, if race is a social construct,* then she’s doing it exactly right. She’s allowed to have an emotional connection to her own ancestors, whether that connection is documented via the Dawes Rolls or not. All of us here in America should have equal access to Harvard’s benefits, not just the ones who play up a story about their ancestors.

The sad thing, though, is that despite being one of the most powerful and respected women in America, she still felt the need to be more than she is, to latch onto an identity she doesn't truly possess.

You know, Elizabeth… it’s fine to just be a white person from Oklahoma. It’s fine to be you.

 

*Note: This blog regards “species” and nouns generally as social constructs, because language is inherently social. That does not erase biology.

Greatest Hits: Can Ice Packs Help Stop a Seizure in Humans?

 

[WHO epilepsy infographic. Source: WHO]

Over the years, a few posts have proven to be surprise hits–Can Ice packs help stop a seizure (in humans)?, Turkey: Not very Turkic, Why do Native Americans Have so much Neanderthal DNA?, and Do Black Babies have Blue Eyes?

It’s been a while since these posts aired, so I thought it was time to revisit the material and see if anything new has turned up.

First, Ice packs and Epilepsy

Ice packs (cold packs) applied to the lower back at the first sign of a seizure may be able to halt or significantly decrease the severity of a seizure in humans.

I consider this one of the most important posts I’ve written, because it is the only one that offers useful, real-life advice: if someone is having a seizure, grab an ice pack or two and press them against the person’s back/neck. There is very little you can do for someone who is already having a seizure besides making sure they don’t accidentally hurt themselves, but using ice packs may help decrease the duration and severity of the seizure.

I have received some very positive responses to the post, including this one, by Tom Coventry:

We have been using an ice pack on our 13 yr old Son’s neck to stop seizures for nearly a year now and it works without fail to bring the seizures to an end within seconds of applying the ice. This is an old technique used before medications were invented, you can read about it at The Meridian Foundation papers on Edgar Case and Abdominal epilepsy.

Here is a relevant quote from Cayce’s paper on abdominal epilepsy:

… Also note that the reflex from the abdomen was mediated through the medulla oblongata, a important nerve center at the upper portion of the spinal cord where it enters the skull.  This is significant because Cayce sometimes recommended that a piece of ice be placed at this area during the aura or at the beginning of the seizure.  This simple technique has proven effective in several contemporary cases where Cayce’s therapeutic model has been utilized. Incidentally, this technique for preventing seizures was also used by osteopathic physicians during the early decades of this century and is included in the therapeutic model developed by the Meridian Institute. …

If the subject is currently experiencing seizures and can sense the beginning of the episode, they are encouraged to use a piece of ice at the base of the brain for one to two minutes.

I encountered the ice packs trick on forums where people were talking about treating seizures in dogs. (Yes, there are dogs with epilepsy.) There are many accounts of people successfully stopping or preventing their dogs from going into a seizure by grabbing a cold pack at the first warning signs and putting it directly onto the dog’s lower back:

We have been using ice packs to help manage our girl’s seizures for over a year now. From what I have heard first hand from others is that it either doesn’t work at all or it works fabulously. With our girl it “works fabulously”. It is not the miracle cure and it does not prevent future seizures but it definitely stops her grand mal right in its tracks. It is the most amazing thing I have ever seen. … If we get the ice pack on her within the first 15 seconds or so, the grand mal just suddenly stops. Like a light switch. All motor movement comes to a halt. She continues to be incoherent for a bit but all movements stop.

Oddly, though, I haven’t found much discussion of the use of ice packs on humans. But if it works on dogs, why wouldn’t it work on people? On the grand evolutionary scale, our nervous systems are pretty similar–we’re both mammals with neocortexes, after all.

[Figure from The Hidden Genetics of Epilepsy]

My epileptic friend has also reported continued good success with the technique; her husband says he can feel an immediate change in the pattern of the seizure.

My original post outlines some of the scientific evidence in favor of the technique; I’ll just quote one bit:

The Journal of the American Holistic Veterinary Medical Association published an article on the use of ice packs to stop seizures in dogs, A Simple, Effective Technique for Arresting Canine Epileptic Seizures, back in 2004. You can read it for a mere $95, or check out the highlights on Dawg Business's blog:

Fifty-one epileptic canine patients were successfully treated during an epileptic seizure with a technique involving the application of ice on the back (T10 to L4). This technique was found to be effective in aborting or shortening the duration of the seizure.

I suspect the “ice trick” was once fairly well-known before there were medications for preventing seizures, but modern doctors are just taught about the medications. And ice packs, to be clear, can’t cure epilepsy. But they can help people who are in the midst of a seizure.

Any doctors out there, please do some research on this. I think a lot of people could benefit.

Anthropology Friday: The Crackers of Apalachee, Florida

[A Cracker Cowboy, by Frederic Remington, 1895]

About two years ago I reviewed Lois Lenski’s Strawberry Girl, a middle grade novel about the conflict between newly arrived, dedicated farmers and long-established families of hoe-farmers/ranchers/hunters in the backwoods of Florida. It was a pleasant book based on solid research among the older residents, but left me with many questions (as surely any children’s book would)–most notably, was the author’s description of the newly arrived farmers as “Crackers” accurate, or should the term be more properly restricted to the older, wilder inhabitants?

I had not known, prior to Lenski's book, that "Cracker" even was an ethnonym; today it is used primarily as a slur, the original Crackers and their lifestyle having all but disappeared. Who were the Crackers? Where did they come from? Do they hail from the same stock that settled Appalachia (the mountains, not to be confused with Apalachee, the county in Florida we'll be discussing in this post), or different stock? Or is there perhaps a common stock that runs throughout America, appearing as more or less of the population in proportion to the favorability of the land for their lifestyles?

Today I happened upon Richard Wayne Sapp’s ethnography of Apalachee County, Florida: Suwannee River Town, Suwannee River Country: political moieties in a southern county community, published in 1976, which directly addresses a great many of my questions. So far it has been a fascinating book, and I am glad I found it.

I must note, though, that there currently is no "Apalachee County" in Florida. (There are an Apalachee Parkway and an Apalachee Park, though.) However, comparing the maps and geographic details in the book with a current map of Florida reveals that Apalachee County is now Suwannee County. Wikipedia should note the change.

So without further ado, here are a few interesting quotes:

Apalachee County, a north Florida county community, nestles in a bend of the Suwanee River. The urban county seat is the center of government and associational life. Scattered over the country-side are farming neighborhoods whose interactional centers are rural churches. County seat and rural neighborhoods are coupled by mutual exchanges of goods and services: neither are, of themselves, cultural wholes. The poor quality of its soils and the relative recency of settlement (post-Civil-War) give the community its distinctiveness; it never had a planting elite.

Apalachee society is structured along moiety lines: town and country.

EvX: “Moiety” means half; Wikipedia defines it in anthropology as:

a [kinship] descent group that coexists with only one other descent group within a society. In such cases, the community usually has unilineal descent, either patri- or matri-lineal so that any individual belongs to one of the two moiety groups by birth, and all marriages take place between members of opposite moieties. It is an exogamous clan system with only two clans.

Here I think Sapp is using moiety more in the sense of “two interacting groups that form a society” without implying that all town people take country spouses and vice versa. But continuing:

These halves rest on an earlier "cracker" horizon of isolated single-family homesteads. True crackers subsisted by living off the land and practicing hoe agriculture; they were fiercely independent and socially isolated. Apalachee moieties are also related to regional traditions: townsmen as town nabobs in the Cavalier tradition and countrymen as yeoman farmers in the Calvinist tradition. Townsmen promote associational interaction, valuing familism (nuclear), hierarchy in organisations, "progress," and paternalistic interaction with countrymen. Countrymen value familism (extended), localism, and personalism, interacting on individually egalitarian rather than ordered associational terms. …

The division of governmental offices falls along moiety lines. Townsmen control municipal government, countrymen control the powerful county bodies. Except for jobs, the governmental institution is not a major source of political prizes. The country moiety is the dominant political force.”

[Map of alcohol control in the United States: wet counties = blue; dry = red; yellow = mixed laws. (Currently.)]

EvX: There follows a fascinating description of the battle over a referendum on whether the county should stay “dry” (no legal sale of alcohol) or go “wet” (alcohol sales allowed.) The Wets, led by business interests, had hoped that an influx of new residents who held more pro-alcohol views than established residents would tip the electoral balance in their favor. I find this an interesting admission of one of democracy’s weak points–the ability of newcomers to move into an area and vote to change the laws in ways the folks who already live there don’t like.

The Drys, led by local Baptist pastors, inflamed local sentiments against the Wets, who were supposedly trying to overturn the law just to make a hotel chain more interested in buying a tract of land owned by the leader of the Wets. The Wets argued the sale would attract more businesses to the area, boosting the economy; the Drys argued that the profits would go entirely to the Wets and the community itself would reap the degradation and degeneration caused by alcohol.

The Drys won, and the leader of the Wets hasn’t set foot in a church in Apalachee county since then.

(Suwannee/Apalachee county finally allowed the sale of alcohol in 2011.)

[Per capita GDP by county (Wikipedia)]

Does a county’s wet or dry status impact the willingness of businesses to move into the area, leading to depressed economies for Drys? I wanted to find out, so I pulled up maps of current dry counties and per capita GDP by county. It’s not a perfect comparison since it doesn’t control for cost of living, but it’ll do.

In general, I don't think the theory holds. Suwannee, dry until 2011, is doing better than neighboring counties that went wet earlier (some of those neighboring counties are very swampy.) Central Mississippi is wet, but doing badly; a string of dry counties runs down the east side of the state, and unless my eyes deceive me, they're doing better than the wet counties. Kentucky's drys are economically depressed, but so are West Virginia's wets. Pennsylvania and Texas's "mixed" counties are doing fine, while Texas's wets are doing badly. Virginia has some pretty poor wet counties; Alaska's dry county is doing great.

However, this is only a comparison of currently dry and wet counties; if I had data that showed for what percent of the 20th century each county allowed the sale of alcohol, that might provide a different picture.

Still, I’m willing to go out on a limb, here: differences in local GDP have more to do with demographics than the sale of one particular beverage.
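If someone wanted to do this comparison more systematically than eyeballing two maps, a minimal sketch might look something like the following. (The file names and column names here are hypothetical, and as noted above, a serious version would need cost-of-living adjustments and historical wet/dry dates, not just current status.)

```python
import pandas as pd

# Hypothetical inputs (file and column names are made up for illustration):
# county_alcohol_status.csv -> fips, status ("wet", "dry", or "mixed")
# county_income.csv         -> fips, per_capita_income
status = pd.read_csv("county_alcohol_status.csv")
income = pd.read_csv("county_income.csv")

# Join the two tables on the county FIPS code.
merged = status.merge(income, on="fips", how="inner")

# Average and median per-capita income by alcohol status.
print(merged.groupby("status")["per_capita_income"].agg(["mean", "median", "count"]))
```

Even then, any gap between wet and dry counties would be confounded by region, rurality, and demographics, which is essentially the point made above.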

But back to Sapp:

A system of human community derivative of Europe and still basic to the southern United States is the county-community. … The symbolic heart of this traditional community, the county courthouse, has been the central point of political and economic assembly for county residents. Its people lived dispersed in neighborhoods clustered about small Protestant churches, points of assembly in socialization and socializing as well as bastions of moral and spiritual rectitude.

He quotes Havard, 1972, on the traits of the Calvinist-Yeoman Farmer–radical individualism, personalism, personal independence, populism, regionalist traditions, etc–vs the Cavalier-Planter/Town Nabob–social conformity, caste, paternalistic dependency, conservatism, nationalist patriotism.

He wrote that this split fathered two mainstream traditions in the South: yeoman farmer and plantation farmer. The yeoman farmers, he said, opposed governmental centralization and exhibited an aversion to urbanism, industrialization, and the entrepreneurial classes; they were libertarian, egalitarian, and populist. The plantation whigs, identified with downtown mercantile interests, supported themselves as planters … bankers, and merchants, sat as the "county seat clique," developed the theme of racial segregation in the post-bellum era, and promoted a cult of "manners" and paternalism. …

However, the Cavalier plantation elite never really settled in Apalachee/Suwannee county, due to its soil being much too poor for serious agriculture.

As a result, not many slaves were ever brought into the county, nor have their descendants migrated to the area. Since the population is mostly white, racial issues appear only rarely in the book, and it is safe to say that the culture never developed in quite the same ways as it did in the plantation-dominated Deep South.

Rather, Apalachee was settled by the Cavalier-Yeomen farmers and the Crackers:

Although the origin of the term cracker is disputed, Stetson Kennedy claims that cracker first applied to an assortment of "bad characters" who gathered in northern Florida before it became a territory of the United States. Deep-South Southerners later applied the epithet to the "poor white folk of Florida, Georgia, and Alabama." (Kennedy, 1942, p. 59). He further relates:

“Crackers are mainly descended from the Irish, Scotch, and English stock which, from 1740 on, was slowly populating the huge Southern wilderness behind the thin strip of coastal civilization. These folk settled the Cumberland Valley, the Shenandoah, and spread through every Southern state east of the Mississippi. That branch of the family which settled in the Deep South was predominantly of Irish ancestry…

“The early crackers were the Okies of their day (as they have been ever since). Cheated of land, not by wind and erosion, but by the plantation and slavery system of the Old South, they were nonessentials in an economic, political and social order dominated by the squirearchy of wealthy planters, and in most respects were worse off than the Negro slaves. “

This contradicts the history told in our prior ethnography of Appalachia, which claims pointedly that the denizens of the Cumberland are not descended from the "poor whites" of the Deep South, but from Pennsylvanians. I offer, however, a synthesis: the whites who settled the Pennsylvania frontier, followed Daniel Boone into the Cumberland, and found it pleasant enough to remain in the mountains, and the whites who adopted an only semi-agricultural lifestyle in the backwoods and swamps of Florida, both hailed from the same original British stock and simply took different routes to get where they were going.

Powell (1969), a white turpentine camp overseer of the late nineteenth century, called the crackers of Apalachee County "wild woodsmen" (p. 30) and mentioned a man who "had lived the usual life of a shiftless Cracker, hunting and fishing, and hard work did not agree with him." …

[Powell writes:]

"When I speak of villages throughout this county, I use the word for lack of a better term, for in nine cases out of ten, they were the smallest imaginable focus of the scattering settlement, and usually one general store embraced the sum total of business enterprise. There the natives came at intervals to trade for coffee, tobacco, and the few other necessities that the woods and waters did not provide them with. Alligators' hides and teeth, bird plumes and various kinds of pelts were the medium of barter. They were a curious people, and there are plenty of them there yet, born and bred to the forest and as ignorant of the affairs of every-day life outside of their domain, as are the bears and deer upon which they mainly subsist. A man who would venture to tell them that the earth moved instead of the sun, or that there was a device by which a message could be flashed for leagues across a wire, would run the risk of being lynched, as too dangerous a liar to be at large."

There is a section on the importance of guns and hunting to the locals, even the children, which will be familiar to anyone with any experience of the rural South. I know from family tales that my grandfather began to hunt when he was 8 years old; he used to sell the pelts of skunks he’d killed to furriers, who de-stinked them, dyed them black, and marketed them as “American sable” over in Europe.

Truth in Advertising laws decimated the “American sable” trade.

The true crackers, Powell’s “wild woodsmen,” were never numerous, and they rarely participated in the social life of the wider Apalachee county-community. Crackers were born, lived, and died in the woods. They buried their own in family plots far from the nearest church. … Cracker families settled the Apalachee area without recourse to legal formalities. Thus, when the yeomen farmers … eventually purchase legal titles to land, true crackers were forced out and deeper into Florida.

This is a common problem (especially for anyone whose ancestors arrived in an area before it was officially part of the US.) Where land is abundant, population density is low, and there aren’t any authorities who can enforce land ownership, anyway, people will be happy to farm where they want, hunt where they want, and defend their claims themselves. This tends to lead to a low-intensity lifestyle:

Cracker subsistence strategy depended on scratch, perhaps slash-and-burn, summer agriculture and year-round food collecting activities: hunting, fishing, and foraging. Because their farming operations were so small, limited to the part-time efforts of an individual family, they had no need of financial credit.

Indeed, their fiercely independent, egalitarian ethos prohibited them from interacting significantly in the rural neighborhoods of the community. …

Few true crackers remain in Apalachee County … A few families still live on the borders of the county. There they exploit the food resources of the rivers and swamps and perhaps scratch-farm a few acres. …

[Florida Cracker cow and calf]

This is not (just) laziness; areas with poor soils or little water simply can’t be intensively farmed, and if the forage is bad, herd animals will be better off if they can wander widely in search of food than if they are confined in one particular place.

Incidentally, there is a landrace of cattle known as the Florida Cracker, descended from the hardy Spanish cattle brought to Florida in the 1600s. Unfortunately, the breed has been on the decline and is now listed as "critical" due to laws passed in 1949 against free-ranging livestock and the introduction of larger breeds more suited to confinement.

Not only does the law fence off the cracker’s land, destroy his livelihood, and drive him out, it also kills the cracker cow by fencing off its land.

The author notes that “cracker” is a slur and that it has been expanded in the past half-century to cover all poor whites, with an interesting footnote:

One speculates that the driving force behind withholding respectability from the true crackers and the extension of the consequently disparaging term to include countrymen of the small farmer class originated with the townspeople. This idea parallels the hypothesis that townsmen perpetuated and revitalized the issue of racial politics in the twentieth century.

On change:

The technological changes of the twentieth century have enabled social institutions to penetrate the isolation of the crackers and enforce town mores. Cracker homicides are no longer unreported and uninvestigated or allowed to result in clannish feuding… No longer may the children escape the public school regimen. No longer may they escape taxation…

[yet] the cracker and his world view persist. While only a handful of true crackers endure in the county… modern-day imitators erect trailers in remote corners, moving to north-central Florida …. to escape the “rat race.”

I think that’s enough for today; I hope you’ve enjoyed the book and urge you to take a look at the whole thing. We’ll discuss the more recent Cavalier-Yeomen farmers next week.

Invasive Memes

 

[Smallpox virus]

Do people eventually grow ideologically resistant to dangerous local memes, but remain susceptible to foreign memes, allowing them to spread like invasive species?

And if so, can we find some way to memetically vaccinate ourselves against deadly ideas?

***

Memetics is the study of how ideas (“memes”) spread and evolve, using evolutionary theory and epidemiology as models. A “viral meme” is one that spreads swiftly through society, “infecting” minds as it goes.

Of course, most memes are fairly innocent (e.g. fashion trends) or even beneficial (“wash your hands before eating to prevent disease transmission”), but some ideas, like communism, kill people.

Ideologies consist of a big set of related ideas rather than a single one, so let’s call them memeplexes.

Almost all ideological memeplexes (and religions) sound great on paper–they have to, because that’s how they spread–but they are much more variable in actual practice.

Any idea that causes its believers to suffer is unlikely to persist–at the very least, because its believers die off.

Over time, in places where people have been exposed to ideological memeplexes, their worst aspects become known and people may learn to avoid them; the memeplexes themselves can evolve to be less harmful.

Over in epidemiology, diseases humans have been exposed to for a long time become less virulent as humans become adapted to them. Chickenpox, for example, is a fairly mild disease that kills few people because the virus has been infecting people for as long as people have been around (the ancestral Varicella-Zoster virus evolved approximately 65 million years ago and has been infecting animals ever since). Rather than kill you, chickenpox prefers to enter your nerves and go dormant for decades, reemerging later as shingles, ready to infect new people.

By contrast, smallpox (Variola major and Variola minor) probably evolved from a rodent-infecting virus about 16,000 to 68,000 years ago. That's a big range, but either way, it's much more recent than chickenpox. Smallpox made its first major impact on the historical record around the third century BC, in Egypt, and thereafter became a recurring plague in Africa and Eurasia. Note that unlike chickenpox, which is old enough to have spread throughout the world with humanity, smallpox emerged long after major population splits occurred–like part of the Asian clade splitting off and heading into the Americas.

By 1400, Europeans had developed some immunity to smallpox (due to those who didn't have any immunity dying), but when Columbus landed in the New World, folks here had never seen the disease before–and thus had no immunity. Diseases like smallpox and measles ripped through native communities, killing approximately 90% of the New World population.

If we extend this metaphor back to ideas–if people have been exposed to an ideology for a long time, they are more likely to have developed immunity to it or the ideology to have adapted to be relatively less harmful than it initially was. For example, the Protestant Reformation and subsequent Catholic counter-reformation triggered a series of European wars that killed 10 million people, but today Catholics and Protestants manage to live in the same countries without killing each other. New religions are much more likely to lead all of their followers in a mass suicide than old, established religions; countries that have just undergone a political revolution are much more likely to kill off large numbers of their citizens than ones that haven’t.

This is not to say that old ideas are perfect and never harmful–chickenpox still kills people and is not a fun disease–but that any bad aspects are likely to become more mild over time as people wise up to bad ideas, (certain caveats applying).

But this process only works for ideas that have been around for a long time. What about new ideas?

You can’t stop new ideas. Technology is always changing. The world is changing, and it requires new ideas to operate. When these new ideas arrive, even terrible ones can spread like wildfire because people have no memetic antibodies to resist them. New memes, in short, are like invasive memetic species.
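To make the epidemiological metaphor concrete, here is a toy sketch of my own (a standard discrete-time SIR-style model, not anything specific from the memetics literature): the same "germ" tears through a population with no prior immunity but fizzles in one where most people have already been exposed.

```python
def spread(pop=10_000, initially_immune=0.0, beta=0.3, gamma=0.1, days=365):
    """Toy discrete-time SIR model: Susceptible -> Infected -> Recovered.

    beta  = transmissions per infected person per day
    gamma = recovery rate per day (recovered people are immune)
    Returns the fraction of the population ever infected.
    """
    s = pop * (1.0 - initially_immune) - 1  # susceptible
    i = 1.0                                 # one initial case
    r = pop * initially_immune              # already immune
    ever_infected = 1.0
    for _ in range(days):
        new_infections = beta * s * i / pop
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        ever_infected += new_infections
    return ever_infected / pop

print(f"No prior immunity:  {spread(initially_immune=0.0):.0%} eventually infected")
print(f"80% prior immunity: {spread(initially_immune=0.8):.0%} eventually infected")
```

Swap "infected" for "convinced" and the analogy is straightforward: an idea that sweeps a naive population barely registers in one with widespread memetic immunity.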

In the late 1960s, 15 million people still caught smallpox every year. In 1980, it was declared officially eradicated–not one case had been seen since 1977, due to a massive, world-wide vaccination campaign.

Humans can acquire immunity to disease in two main ways. The slow way is everyone who isn’t immune dying; everyone left alive happens to have adaptations that let them not die, which they can pass on to their children. As with chickenpox, over generations, the disease becomes less severe because humans become successively more adapted to it.

The fast way is to catch a disease, produce antibodies that recognize and can fight it off, and thereafter enjoy immunity. This, of course, assumes that you survive the disease.

Vaccination works by teaching the body's immune system to recognize a disease without exposing the body to a full-strength germ, using a weakened or harmless version of the germ instead. Early on, weakened germs taken from actual smallpox scabs or lesions were used to inoculate people, a risky method since the germs often weren't that weak. Later, people discovered that cowpox was similar enough to smallpox that its antibodies could also fight smallpox, but cowpox itself was too adapted to cattle hosts to seriously harm humans. (Today I believe the vaccine uses a different weakened virus, but the principle is the same.)

The good part about memes is that you do not actually have to inject a physical substance into your body in order to learn about them.

Ideologies are very difficult to evaluate in the abstract, because, as mentioned, they are all optimized to sound good on paper. It’s their actual effects we are interested in.

So if we want to learn whether an idea is good or not, it’s probably best not to learn about it by merely reading books written by its advocates. Talk to people in places where the ideas have already been tried and learn from their experiences. If those people tell you this ideology causes mass suffering and they hate it, drop it like a hot potato. If those people are practicing an “impure” version of the ideology, it’s probably an improvement over the original.

For example, “communism” as practiced in China today is quite different from “communism” as practiced there 50 years ago–so much so that the modern system really isn’t communism at all. There was never, to my knowledge, an official changeover from one system to another, just a gradual accretion of improvements. This speaks strongly against communism as an ideology, since no country has managed to be successful by moving toward ideological communist purity, only by moving away from it–though they may still find it useful to retain some of communism’s original ideas.

I think there is a similar dynamic occurring in many Islamic countries. Islam is a relatively old religion that has had time to adapt to local conditions in many different parts of the world. For example, in Morocco, where the climate is more favorable to raising pigs than in other parts of the Islamic world, the taboo against pigs isn't as strongly observed. The burka is not an Islamic universal, but characteristic of central Asia (the similar niqab is from Yemen). Islamic head coverings vary by culture–such as this kurhars, traditionally worn by unmarried women in Ingushetia, north of the Caucasus, or this cap, popular in Xinjiang. Turkey has laws officially restricting burkas in some areas, and Syria discourages even hijabs. Women in Iran did not go heavily veiled prior to the Iranian Revolution. So the insistence on extensive veiling in many Islamic communities (like the territory conquered by ISIS) is not a continuation of old traditions, but the imposition of a new, idealized, version of Islam.

Purity is counter to practicality.

Of course, this approach is hampered by the fact that what works in one place, time, and community may not work in a different one. Tilling your fields one way works in Europe, and tilling them a different way works in Papua New Guinea. But extrapolating from what works is at least a good start.

 

 

Neanderthal Skull for 3D Printing

Meet Nandy the Neanderthal. You can download him at Thingiverse.

This is my first creation, Nandy the Neanderthal, based on the Chapelle-aux-Saints 1 skull and this side view. Note that he is based on two different skulls, but still very much a Neanderthal.

Since this is my very first creation and I don’t have a 3D printer yet, (I expect to receive one soon and am planning ahead,) I am still learning all of the ins and outs of this technology and so would appreciate any technical feedback.

Neanderthals evolved around 600,000-800,000 years ago and spread into the Middle East, Europe, and central Asia. They made stone tools, controlled fire, and hunted. They survived in a cold and difficult climate, but likely could make no more than the simplest of clothes. As a result, they may have been, unlike modern humans, hairy.

Cochran and Harpending of West Hunter write in The 10,000 Year Explosion: 

 Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mothers’ fur as infants. Modern humans don’t have these ridges, but Neanderthals do.

Hoffecker, in The Spread of Modern Humans in Europe writes:

Neanderthal sites show no evidence of tools for making tailored clothing. There are only hide scrapers, which might have been used to make blankets or ponchos. This is in contrast to Upper Paleolithic (modern human) sites, which have an abundance of eyed bone needles and bone awls.

Their skulls were, on average, larger than ours, with wide noses, round eyes, and an elongated braincase. Their facial features were robust–that is, strong, thick, and heavy.

The Chapelle-aux-Saints 1 Neanderthal lived to about 40 years old. He had lost most of his teeth years before his death (I gave Nandy some teeth, though), suffered arthritis, and must have been cared for in his old age by the rest of his tribe. At his death he was most likely buried in a pit dug by his family, which preserved his skeleton in nearly complete condition for 60,000 years.

Anatomically modern humans, Homo sapiens, encountered and interbred with Neanderthals around 40,000 years ago. (Neanderthals are also humans–Homo neanderthalensis.) Today, about 1-5% of the DNA in non-Sub-Saharan Africans hails originally from a Neanderthal ancestor. (Melanesians also have DNA from a cousin of the Neanderthals, the Denisovans, and Sub-Saharan Africans may have their own archaic ancestors.)

Unfortunately for Nandy and his relations, the Neanderthals also began to disappear around 40,000 years ago. Perhaps it was the weather, or Homo sapiens outcompeted them, or their enormous skulls just caused too much trouble in childbirth. Whatever happened, the Neanderthals remain a mystery, evidence of the past when we weren't the only human species in town.

The Endless Ratiocination of the Dysphoric Mind


My endless inquiries made it impossible for me to achieve anything. Moreover, I get to think about my own thoughts of the situation in which I find myself. I even think that I think of it, and divide myself into an infinite retrogressive sequence of ‘I’s who consider each other. I do not know at which ‘I’ to stop as the actual, and as soon as I stop, there is indeed again an ‘I’ which stops at it. I become confused and feel giddy as if I were looking down into a bottomless abyss, and my ponderings result finally in a terrible headache. –Møller, Adventures of a Danish Student

Møller's Adventures of a Danish Student was one of Niels Bohr's favorite books; it reflected his own difficulties with cycles of ratiocination, in which the mind protects itself against conclusions by watching itself think.

I have noticed a tendency on the left, especially among the academic-minded, to split the individual into sets of mental twins–one who is and one who feels that it is; one who does and one who observes the doing.

Take the categories of "biological sex" and "gender." Sex is defined as the biological condition of "producing small gametes" (male) or "producing large gametes" (female) for the purpose of sexual reproduction. Thus we can talk about male and female strawberry plants, male and female molluscs, male and female chickens, male and female Homo sapiens.

(Indeed, the male-female binary is remarkably common across sexually reproducing plants and animals–it appears that the mathematics of a third sex simply don't work out, unless you're a mushroom. How exactly sex is determined varies by species, which makes the stability of the sex-binary all the more remarkable.)

And for the first 299,945 years or so of our existence, most people were pretty happy dividing humanity into "men," "women," and the occasional "we're not sure." People didn't understand why or how biology works, but it was a functional enough division for people.

In 1955, John Money decided we needed a new term, “gender,” to describe, as Wikipedia puts it, “the range of characteristics pertaining to, and differentiating between, masculinity and femininity.” Masculinity is further defined as “a set of attributes, behaviors, and roles associated with boys and men;” we can define “femininity” similarly.

So if we put these together, we get a circular definition: gender is a range of characteristics of the attributes of males and females. Note that attributes are already characteristics. They cannot further have characteristics that are not already inherent in themselves.

But really, people invoke "gender" to speak of a sense of self, a self that reflexively looks at itself and perceives itself as possessing traits of maleness or femaleness; the thinker who must think of himself as "male" before he can act as a male. After all, you cannot walk without desiring first to move in a direction; how can you think without first knowing what it is you want to think? It is a cognitive splitting of the behavior of the whole person into two separate, distinct entities–an acting body, possessed of biological sex, and a perceiving mind, that merely perceives and "displays" gender.

But the self that looks at itself looking at itself is not real–it cannot be, for there is only one self. You can look at yourself in the mirror, but you cannot stand outside of yourself and be simultaneously yourself; there is only one you. The alternative, a fractured consciousness, is a symptom of mental disorder and treated with chlorpromazine.

Robert Oppenheimer was once diagnosed with schizophrenia–dementia praecox, as they called it then. Whether he had it or simply confused the therapist by talking about wave/particle dualities is another matter.

Then there are the myriad variants of the claim that men and women "perform femininity" or "display masculinity" or "do gender." They do not claim that people are feminine or act masculine–such conventional phrasing assumes the existence of a unitary self that is, perceives, and acts. Rather, they posit an inner self that possesses no inherent male or female traits, for whom masculinity and femininity are only created via the interaction of their body and external expectations. In this view, women do not buy clothes because they have some inherent desire to go shopping and buy pretty things, but because society has compelled them to do so in order to comply with an external notion of "what it means to be female." The self who produces large gametes is not the self who shops.

The biological view of human behavior states that most humans engage in a variety of behaviors because similar behaviors contributed to the evolutionary success of our ancestors. We eat because ancestors who didn’t think eating was important died. We jump back when we see something that looks like a spider because ancestors who didn’t got bitten and died. We love cute things with big eyes because they look like babies because we are descended mostly from people who loved their babies.

Sometimes we do things that we don’t enjoy but rationalize will benefit us, like work for an overbearing boss or wear a burka, but most “masculine” and “feminine” behaviors fall into the category of things people do voluntarily, like “compete at sports” or “gossip with friends.” The fact that more men than women play baseball and more women than men enjoy gossiping with friends has nothing to do with an internal self attempting to perform gender roles and everything to do with the challenges ancestral humans faced in reproducing.

But whence this tendency toward ratiocination? I can criticize it as a physical mistake, but does it reflect an underlying psychological reality? Do some people really perceive themselves as a self separate from themselves, a meta-self watching the first self acting in particular manners?

Here is a study that found that folks with more cognitive flexibility tended to be more socially liberal, though economic conservatism/liberalism didn’t particularly correlate with cognitive flexibility.

I find that if I work hard, I may achieve a state of zen, an inner tranquility in which the endless narrative of thoughts coalesce for a moment and I can just be. Zen is flying down a straight road at 80 miles an hour on a motorcycle; zen is working on a math problem that consumes all of your attention; zen is dancing until you only feel the music. The opposite of zen is lying in bed at 3 AM, staring at the ceiling, thinking of all of your failures, unable to switch off your brain and fall asleep.

Dysphoria is a state of unease. Some people have gender dysphoria; a few report temporal dysphoria. It might be better described as disconnection, a feeling of being eternally out of place. I feel a certain dysphoria every time I surface from reading some text of anthropology, walk outside, and see cars. What are these metal things? What are these straight, right-angled streets? Everything about modern society strikes me as so artificial and counter to nature that I find it deeply unsettling.

It is curious that dysphoria itself is not discussed more in the psychiatric literature. Certainly a specific form or two receives a great deal of attention, but not the general sense itself.

When things are in place, you feel tranquil and at ease; when things are out of place you feel agitated, always aware of the sense of crawling out of your own skin. People will try any number of things to turn off the dysphoria; a schizophrenic friend reports that enough alcohol will make the voices stop, at least for a while. Drink until your brain shuts up.

But this is only when things are out of place. Healthy people seek a balance between division and unity. Division of the self is necessary for self-criticism and improvement; people can then say, “I did a bad thing, but I am not a bad person, so I will change my behavior and be better.” Metacognition allows people to reflect on their behavior without feeling that their self is fundamentally under threat, but too much metacognition leads to fragmentation and an inability to act.

People ultimately seek a balanced, unified sense of self.

It is said that not everyone has an inner voice, a meta-self commenting on the acting self, and some have more than one:

My previous blogs have observed that some people–women with bulimia nervosa, for example–have frequent multiple simultaneous experiences, but that multiple experience is not frequent in the general population. …

Consider inner speech. Subjects experienced themselves as innerly talking to themselves in 26% of all samples, but there were large individual differences: some subjects never experienced inner speech; other subjects experienced inner speech in as many as 75% of their samples. The median percentage across subjects was 20%.

It’s hard to tell what people really experience, but certainly there is a great deal of variety in people’s internal experiences. Much of thought is not easily describable. Some people hear many voices. Some cannot form mental images:

I think the best way I can describe my aphantasia is to say that I am unaware of anything in my mind except these categories: i) direct sensory input, ii) unheard words that carry thoughts, iii) unheard music, iv) a kind of invisible imagery, which I can best describe as a sensation of pictures that are in a sense too faint to see, v) emotions, and vi) thoughts which seem too fast to exist as words. … I see what is around me, unless my eyes are closed, when all is always black. I hear, taste, smell and so forth, but I don’t have the experience people describe of hearing a tune or a voice in their heads. Curiously, I do frequently have a tune going around in my head; all I am lacking is the direct experience of hearing it.

The quoted author is, despite his lack of internal imagery, quite intelligent, with a PhD in physics.

Some cannot hear themselves think at all.

I would like to know if there is any correlation between metacognition, ratiocination, and political orientations–I have so far found a little on the subject:

We find a relationship between thinking style and political orientation and that these effects are particularly concentrated on social attitudes. We also find it harder to manipulate intuitive and reflective thinking than a number of prominent studies suggest. Priming manipulations used to induce reflection and intuition in published articles repeatedly fail in our studies. We conclude that conservatives—more specifically, social conservatives—tend to be dispositionally less reflective, social liberals tend to be dispositionally more reflective, and that the relationship between reflection and intuition and political attitudes may be more resistant to easy manipulation than existing research would suggest.

And a bit more:

… Berzonsky and Sullivan (1992) cite evidence that individuals higher in reported self-reflection also exhibit more openness to experience, more liberal values, and more general tolerance for exploration. As noted earlier, conservatives tend to be less open to experience, more intolerant of ambiguity, and generally more reliant on self-certainty than liberals. That, coupled with the evidence reported by Berzonsky and Sullivan, strongly suggests conservatives engage in less introspective behaviors.

Following an interesting experiment looking at people’s online dating profiles, the authors conclude:

Results from our data support the hypothesis that individuals identifying themselves as “Ultra Conservative” exhibit less introspection in a written passage with personal content than individuals identifying themselves as “Very Liberal”. Individuals who reported a conservative political orientation often provided more descriptive and explanatory statements in their profile’s “About me and who I’m looking for” section (e.g., “I am 62 years old and live part time in Montana” and “I enjoy hiking, fine restaurants”). In contrast, individuals who reported a liberal political orientation often provided more insightful and introspective statements in their narratives (e.g., “No regrets, that’s what I believe in” and “My philosophy in life is to make complicated things simple”).

The ratiocination of the scientist’s mind can ultimately be stopped by delving into that most blessed of substances, reality (or as close to it as we can get). There is, at base, a fundamentally real thing to delve into, a thing which makes ambiguities disappear. Even a moral dilemma can be resolved with good enough data. We do not need to wander endlessly within our own thoughts; the world is here.

End

 

Denny: the Neanderthal-Denisovan Hybrid

Neanderthal Sites (source: Wikipedia)

Homo sapiens–that is, we modern humans–are about 200,000-300,000 years old as a species. Our ancestor, Homo heidelbergensis, lived in Africa around 700,000-300,000 years ago.

Around 700,000 years ago, another group of humans split off from the main group. By 400,000 years ago, their descendants, Homo neanderthalensis–Neanderthals–had arrived in Europe, and another band of their descendants, the enigmatic Denisovans, arrived in Asia.

While we have found quite a few Neanderthal remains and archaeological sites with tools, hearths, and other artifacts, we’ve uncovered very few Denisovan remains–a couple of teeth, a finger bone, and part of an arm in Denisova Cave, Russia. (Perhaps a few other remains I am unaware of.)

Yet from these paltry remains scientists have extracted enough DNA to ascertain not only that Denisovans were a distinct species, but also that Melanesians, Papuans, and Aborigines derive about 3-6% of their DNA from Denisovan ancestors. (All non-African populations also have a small amount of Neanderthal DNA, derived from Neanderthal ancestors.)

If Neanderthals and Homo sapiens interbred, and Denisovans and Homo sapiens interbred, did Neanderthals and Denisovans ever mate?

The slightly more complicated family tree, not including Denny

Yes.

The girl, affectionately nicknamed Denny, lived and died about 90,000 years ago in Siberia. The remains of an arm, found in Denisova Cave, reveal that her mother was a Neanderthal, her father a Denisovan.

We don’t yet know what Denisovans looked like, because we don’t have any complete skeletons of them, much less good skulls to examine, so we don’t know what a Neanderthal-Denisovan hybrid like Denny looked like.

But the fact that we can extract so much information from a single bone–or fragment of bone–preserved in a Siberian cave for 90,000 years is amazing.

We are still far from truly understanding what sorts of people our evolutionary cousins were, but we are gaining new insights all the time.

Book Club: How to Create a Mind, pt 2/2

Ray Kurzweil, writer, inventor, thinker

Welcome back to EvX’s Book Club. Today we are finishing Ray Kurzweil’s How to Create a Mind: The Secret of Human Thought Revealed.

Spiders are interesting, but Kurzweil’s focus is computers, like Watson, which trounced the competition on Jeopardy!

I’ll let Wikipedia summarize Watson:

Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[2]

The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies. …

Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases.[22] Watson’s main innovation was not in the creation of a new algorithm for this operation but rather its ability to quickly execute hundreds of proven language analysis algorithms simultaneously.[22][24] The more algorithms that find the same answer independently the more likely Watson is to be correct.[22] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense or not.[22]
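
The core idea in that excerpt (run many independent analysis algorithms and treat agreement between them as evidence that an answer is correct) is easy to sketch. The toy Python below is my own illustration, not IBM’s code; the pipeline names, scores, and aggregation rule are all invented for the example:

from collections import defaultdict

# Toy sketch of ensemble question-answering: several independent "analysis
# algorithms" each propose candidate answers with a confidence score, and
# candidates that many algorithms agree on rise to the top. All names and
# numbers here are hypothetical.

def pipeline_keyword_match(question):
    return {"Toronto": 0.4, "Chicago": 0.3}

def pipeline_ngram_statistics(question):
    return {"Chicago": 0.6, "Springfield": 0.2}

def pipeline_taxonomy_lookup(question):
    return {"Chicago": 0.5, "Toronto": 0.1}

def answer(question, pipelines):
    scores = defaultdict(float)   # summed confidence per candidate
    votes = defaultdict(int)      # how many pipelines proposed it
    for pipeline in pipelines:
        for candidate, confidence in pipeline(question).items():
            scores[candidate] += confidence
            votes[candidate] += 1
    # independent agreement boosts the final ranking
    return max(scores, key=lambda c: scores[c] * votes[c])

print(answer("This U.S. city ...", [pipeline_keyword_match,
                                    pipeline_ngram_statistics,
                                    pipeline_taxonomy_lookup]))
# prints: Chicago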

Kurzweil opines:

That is at least one reason why Watson represents such a significant milestone: Jeopardy! is precisely such a challenging language task. … What is perhaps not evident to many observers is that Watson not only had to master the language in the unexpected and convoluted queries, but for the most part its knowledge was not hand-coded. It obtained that knowledge by actually reading 200 million pages of natural-language documents, including all of Wikipedia… If Watson can understand and respond to questions based on 200 million pages–in three seconds!–there is nothing to stop similar systems from reading the other billions of documents on the Web. Indeed, that effort is now under way.

A point about the history of computing that may be petty of me to emphasize:

Babbage’s conception is quite miraculous when you consider the era in which he lived and worked. However, by the mid-twentieth century, his ideas had been lost in the mists of time (although they were subsequently rediscovered.) It was von Neumann who conceptualized and articulated the key principles of the computer as we know it today, and the world recognizes this by continuing to refer to the von Neumann machine as the principal model of computation. Keep in mind, though, that the von Neumann machine continually communicates data between its various units and within those units, so it could not be built without Shannon’s theorems and the methods he devised for transmitting and storing reliable digital information. …

You know what? No, it’s not petty.

Amazon lists 57 books about Ada Lovelace aimed at children, 14 about Alan Turing, and ZERO about John von Neumann.

(Amazon searches always turn up some irrelevant results, but the counts are roughly correct.)

“EvX,” you may be saying, “Why are you counting children’s books?”

Because children are our future, and the books that get published for children show what society deems important for children to learn–and will have an effect on what adults eventually know.

I don’t want to demean Ada Lovelace’s role in the development of software, but surely von Neumann’s contributions to the field are worth a single book!

*Slides soapbox back under the table*

Anyway, back to Kurzweil, now discussing quantum mechanics:

There are two ways to view the questions we have been considering–the converse Western and Eastern perspectives on the nature of consciousness and of reality. In the Western perspective, we start with a physical world that evolves patterns of information. After a few billion years of evolution, the entities in that world have evolved sufficiently to become conscious beings. In the Eastern view, consciousness is the fundamental reality; the physical world only comes into existence through the thoughts of conscious beings. …

The East-West divide on the issue of consciousness has also found expression in opposing schools of thought in the field of subatomic physics. In quantum mechanics, particles exist in what are called probability fields. Any measurement carried out on them by a measuring device causes what is called a collapse of the wave function, meaning that the particle suddenly assumes a particular location. A popular view is that such a measurement constitutes observation by a conscious observer… Thus the particle assumes a particular location … only when it is observed. Basically particles figure that if no one is bothering to look at them, they don’t need to decide where they are. I call this the Buddhist school of quantum mechanics …
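
For reference (my addition, not Kurzweil’s), the standard textbook statement of what “collapse” means here: a particle’s state is a superposition over possible positions, a position measurement returns one of them with probability given by the Born rule, and the state afterward is the corresponding position state. In LaTeX notation:

\[
|\psi\rangle = \sum_i c_i\,|x_i\rangle, \qquad P(x_k) = |c_k|^2, \qquad |\psi\rangle \;\longrightarrow\; |x_k\rangle \ \text{after a measurement yields } x_k.
\]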

Niels Bohr

Or as Niels Bohr put it, “A physicist is just an atom’s way of looking at itself.” He also claimed that we could describe electrons as exercising free will in choosing their positions, a statement I do not think he meant literally; “We must be clear that when it comes to atoms, language can be used only as in poetry,” as he put it.

Kurzweil explains the Western interpretation of quantum mechanics:

There is another interpretation of quantum mechanics… In this analysis, the field representing a particle is not a probability field, but rather just a function that has different values in different locations. The field, therefore, is fundamentally what the particle is. … The so-called collapse of the wave function, this view holds, is not a collapse at all. … It is just that a measurement device is also made up of particles with fields, and the interaction of the particle field being measured and the particle fields of the measuring device result in a reading of the particle being in a particular location. The field, however, is still present. This is the Western interpretation of quantum mechanics, although it is interesting to note that the more popular view among physicists worldwide is what I have called the Eastern interpretation.

Soviet atomic bomb, 1951

For example, Bohr had the yin-yang symbol on his coat of arms, along with the motto contraria sunt complementa, or “contraries are complementary.” Oppenheimer was such a fan of the Bhagavad Gita that he read it in Sanskrit and quoted it upon the successful completion of the Trinity Test: “If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one,” and “Now I am become death, the destroyer of worlds.” He credited the Gita as one of the most important books in his life.

Why the appeal of Eastern philosophy? Is it something about physicists and mathematicians? Leibniz, after all, was fond of the I Ching. As Wikipedia says:

Leibniz was perhaps the first major European intellectual to take a close interest in Chinese civilization, which he knew by corresponding with, and reading other works by, European Christian missionaries posted in China. Having read Confucius Sinarum Philosophus on the first year of its publication,[153] he concluded that Europeans could learn much from the Confucian ethical tradition. He mulled over the possibility that the Chinese characters were an unwitting form of his universal characteristic. He noted with fascination how the I Ching hexagrams correspond to the binary numbers from 000000 to 111111, and concluded that this mapping was evidence of major Chinese accomplishments in the sort of philosophical mathematics he admired.[154] Leibniz communicated his ideas of the binary system representing Christianity to the Emperor of China hoping it would convert him.[84] Leibniz may be the only major Western philosopher who attempted to accommodate Confucian ideas to prevailing European beliefs.[155]

Leibniz’s attraction to Chinese philosophy originates from his perception that Chinese philosophy was similar to his own.[153] The historian E.R. Hughes suggests that Leibniz’s ideas of “simple substance” and “pre-established harmony” were directly influenced by Confucianism, pointing to the fact that they were conceived during the period that he was reading Confucius Sinarum Philosophus.[153]
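
The hexagram correspondence Leibniz noticed is easy to check: read a solid (yang) line as 1 and a broken (yin) line as 0, and the 64 hexagrams enumerate exactly the six-bit binary numbers from 000000 to 111111. Here is a quick sketch of my own (which end of the hexagram counts as the low-order bit is just a convention I picked):

def hexagram(n):
    """Render the integer n (0-63) as six lines, bottom line = lowest bit;
    a solid line stands for yang (1), a broken line for yin (0)."""
    lines = ["------" if (n >> bit) & 1 else "--  --" for bit in range(6)]
    return "\n".join(reversed(lines))  # print the top line first

for n in (0, 1, 63):
    print(f"{n:2d} = {n:06b}")
    print(hexagram(n))
    print()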

Perhaps it is just that physicists and mathematicians are naturally curious people, and Eastern philosophy is novel to a Westerner, or perhaps by adopting Eastern ideas, they were able to purge their minds of earlier theories of how the universe works, creating a blank space in which to evaluate new data without being biased by old conceptions–or perhaps it is just something about the way their minds work.

As for quantum mechanics, I favor the de Broglie-Bohm interpretation, but obviously I am not a physicist and my opinion doesn’t count for much. What do you think?

But back to the book. If you are fond of philosophical ruminations on the nature of consciousness, like “If someone who could only see in black and white read extensively about the color red, could they ever achieve the qualia of actually seeing the color red?” or “If a man were locked in a room with a perfect Chinese rulebook that told him which Chinese characters to write in response to any set of characters written on notes passed under the door, his responses would be in perfect Chinese, but the man himself would understand not a word of Chinese,” then you’ll enjoy the discussion. If you already covered all of this back in Philosophy 101, you might find it a bit redundant.

Kurzweil notes that conditions have improved massively over the past century for almost everyone on earth, but people are increasingly anxious:

A primary reason people believe life is getting worse is because our information about the problems of the world has steadily improved. If there is a battle today somewhere on the planet, we experience it almost as if we were there. During World War II, tens of thousands of people might perish in a battle, and if the public could see it at all, it was in a grainy newsreel in a movie theater weeks later. During World War I, a small elite could read about the progress of the conflict in the newspaper (without pictures). During the nineteenth century there was almost no access to news in a timely fashion for anyone.

As for the future of man, machines, and code, Kurzweil is even more optimistic than Auerswald:

The last invention that biological evolution needed to make–the neocortex–is inevitably leading to the last invention that humanity needs to make–truly intelligent machines–and the design of one is inspiring the other. … by the end of this century we will be able to create computation at the limits of what is possible, based on the laws of physics… We call matter and energy organized in this way “computronium,” which is vastly more powerful pound per pound than the human brain. It will not just be raw computation but will be infused with intelligent algorithms constituting all of human-machine knowledge. Over time we will convert much of the mass and energy in our tiny corner of the galaxy that is suitable for this purpose to computronium. … we will need to spread out to the rest of the galaxy and universe. …

How long will it take for us to spread our intelligence in its nonbiological form throughout the universe? … waking up the universe, and then intelligently deciding its fate by infusing it with our human intelligence in its nonbiological form, is our destiny.

Whew! That is quite the ending–and with that, we will end as well. I hope you enjoyed the book. What did you think of it? Will Humanity 2.0 be good? Bad? Totally different? Or does the Fermi Paradox imply that Kurzweil is wrong? Did you like this shorter Book Club format? And do you have any ideas for our next Book Club pick?

News

  1. The inestimable hbd chick was banned from Twitter, with no word as to why, and has since been reinstated. Her blog is still up. hbd chick has always been a sweet, polite person on Twitter, even to people who are hostile and rude to her, so the banning had nothing to do with misconduct. Someone at Twitter really hates the Hajnal Line.
  2. Since Twitter is an increasingly hostile, unwelcoming place, I have moved to Gab in solidarity, though PMing me on Twitter still works (because communication is useful).
  3. The Ladies of HBD have arranged a group chat on Slack. The Join Code is posted in the comments over on the Female Side. Just to be clear, it’s for females.
  4. Vote for our next Book Club selection:

A. Who We Are and How We Got Here, by David Reich

B. The 10,000 Year Explosion, by Cochran and Harpending

C. The Making of the Atomic Bomb, by Richard Rhodes

D. American Nations, by Colin Woodard

E. Enlightenment Now, by Pinker

F. Something else–leave your suggestion in the comments.