Anthropology Friday: The Crackers of Apalachee, Florida

A cracker cowboy, by Frederic Remington, 1895

About two years ago I reviewed Lois Lenski’s Strawberry Girl, a middle grade novel about the conflict between newly arrived, dedicated farmers and long-established families of hoe-farmers/ranchers/hunters in the backwoods of Florida. It was a pleasant book based on solid research among the older residents, but left me with many questions (as surely any children’s book would)–most notably, was the author’s description of the newly arrived farmers as “Crackers” accurate, or should the term be more properly restricted to the older, wilder inhabitants?

I had not known, prior to Lenski’s book, that “Cracker” even was an ethnonym; today it is used primarily as a slur, the original Crackers and their lifestyle having all but disappeared. Who were the Crackers? Where did they come from? Do they hail from the same stock that settled Appalachia (the mountains, not to be confused with Apalachee, the county in Florida we’ll be discussing in this post,) or different stock? Or is there perhaps a common stock that runs throughout America, appearing as more or less of the population in proportion to the favorability of the land for their lifestyles?

Today I happened upon Richard Wayne Sapp’s ethnography of Apalachee County, Florida: Suwannee River Town, Suwannee River Country: political moieties in a southern county community, published in 1976, which directly addresses a great many of my questions. So far it has been a fascinating book, and I am glad I found it.

I must note, though, that there currently is no “Apalachee County” in Florida. (There are an Apalachee Parkway and an Apalachee Park, though.) However, comparing the maps and geographic details in the book with a current map of Florida reveals that Apalachee County is now Suwannee County. Wikipedia should note the change.

So without further ado, here are a few interesting quotes:

Apalachee County, a north Florida county community, nestles in a bend of the Suwannee River. The urban county seat is the center of government and associational life. Scattered over the countryside are farming neighborhoods whose interactional centers are rural churches. County seat and rural neighborhoods are coupled by mutual exchanges of goods and services: neither are, of themselves, cultural wholes. The poor quality of its soils and the relative recency of settlement (post-Civil-War) give the community its distinctiveness; it never had a planting elite.

Apalachee society is structured along moiety lines: town and country.

EvX: “Moiety” means half; Wikipedia defines it in anthropology as:

a [kinship] descent group that coexists with only one other descent group within a society. In such cases, the community usually has unilineal descent, either patri- or matri-lineal so that any individual belongs to one of the two moiety groups by birth, and all marriages take place between members of opposite moieties. It is an exogamous clan system with only two clans.

Here I think Sapp is using moiety more in the sense of “two interacting groups that form a society” without implying that all town people take country spouses and vice versa. But continuing:

These halves rest on an earlier “cracker” horizon of isolated single-family homesteads. True crackers subsisted by living off the land and practicing hoe agriculture; they were fiercely independent and socially isolated. Apalachee moieties are also related to regional traditions: townsmen as town nabobs in the Cavalier tradition and countrymen as yeoman farmers in the Calvinist tradition. Townsmen promote associational interaction, valuing familism (nuclear), hierarchy in organisations, “progress,” and paternalistic interaction with countrymen. Countrymen value familism (extended), localism, and personalism, interacting on individually egalitarian rather than ordered associational terms. …

The division of governmental offices falls along moiety lines. Townsmen control municipal government, countrymen control the powerful county bodies. Except for jobs, the governmental institution is not a major source of political prizes. The country moiety is the dominant political force.

Wet counties = blue; dry = red; yellow = mixed laws. (Currently.)

EvX: There follows a fascinating description of the battle over a referendum on whether the county should stay “dry” (no legal sale of alcohol) or go “wet” (alcohol sales allowed.) The Wets, led by business interests, had hoped that an influx of new residents who held more pro-alcohol views than established residents would tip the electoral balance in their favor. I find this an interesting admission of one of democracy’s weak points–the ability of newcomers to move into an area and vote to change the laws in ways the folks who already live there don’t like.

The Drys, led by local Baptist pastors, inflamed local sentiments against the Wets, who were supposedly trying to overturn the law just to make a hotel chain more interested in buying a tract of land owned by the leader of the Wets. The Wets argued the sale would attract more businesses to the area, boosting the economy; the Drys argued that the profits would go entirely to the Wets and the community itself would reap the degradation and degeneration caused by alcohol.

The Drys won, and the leader of the Wets hasn’t set foot in a church in Apalachee county since then.

(Suwannee/Apalachee county finally allowed the sale of alcohol in 2011.)

Per capita GDP by county (wikipedia)

Does a county’s wet or dry status impact the willingness of businesses to move into the area, leading to depressed economies for Drys? I wanted to find out, so I pulled up maps of current dry counties and per capita GDP by county. It’s not a perfect comparison since it doesn’t control for cost of living, but it’ll do.

In general, I don’t think the theory holds. Suwannee, dry until 2011, is doing better than neighboring counties that went wet earlier (some of those neighboring counties are very swampy.) Central Mississippi is wet, but doing badly; a string of dry counties runs down the east side of the state, and unless my eyes deceive me, they’re doing better than the wet counties. Kentucky’s drys are economically depressed, but so are West Virginia’s wets. Pennsylvania and Texas’s “mixed” counties are doing fine, while Texas’s wets are doing badly. Virginia has some pretty poor wet counties; Alaska’s dry county is doing great.

However, this is only a comparison of currently dry and wet counties; if I had data that showed for what percent of the 20th century each county allowed the sale of alcohol, that might provide a different picture.

Still, I’m willing to go out on a limb, here: differences in local GDP have more to do with demographics than the sale of one particular beverage.

But back to Sapp:

A system of human community derivative of Europe and still basic to the southern United States is the county-community. … The symbolic heart of this traditional community, the county courthouse, has been the central point of political and economic assembly for county residents. Its people lived dispersed in neighborhoods clustered about small Protestant churches, points of assembly in socialization and socializing as well as bastions of moral and spiritual rectitude.

He quotes Havard, 1972, on the traits of the Calvinist-Yeoman Farmer–radical individualism, personalism, personal independence, populism, regionalist traditions, etc–vs the Cavalier-Planter/Town Nabob–social conformity, caste, paternalistic dependency, conservatism, nationalist patriotism.

He wrote that this split fathered two mainstream traditions in the South: yeoman farmer and plantation farmer. The yeoman farmers, he said, opposed governmental centralization and exhibited an aversion to urbanism, industrialization, and the entrepreneurial classes; they were libertarian, egalitarian, and populist. The plantation whigs, identified with downtown mercantile interests, supported themselves as planters … bankers, and merchants, sat as the “county seat clique,” developed the theme of racial segregation in the post-bellum era, and promoted a cult of “manners” and paternalism. …

However, the Cavalier plantation elite never really settled in Apalachee/Suwannee county, due to its soil being much too poor for serious agriculture.

As a result, not many slaves were ever brought into the county, nor have their descendants migrated to the area. Since the population is mostly white, racial issues appear only rarely in the book, and it is safe to say that the culture never developed in quite the same ways as it did in the plantation-dominated Deep South.

Rather, Apalachee was settled by the Calvinist-Yeoman farmers and the Crackers:

Although the origin of the term cracker is disputed, Stetson Kennedy claims that cracker first applied to an assortment of “bad characters” who gathered in northern Florida before it became a territory of the United States. Deep-South Southerners later applied the epithet to the “poor white folk of Florida, Georgia, and Alabama.” (Kennedy, 1942, p. 59). He further relates:

“Crackers are mainly descended from the Irish, Scotch, and English stock which, from 1740 on, was slowly populating the huge Southern wilderness behind the thin strip of coastal civilization. These folk settled the Cumberland Valley, the Shenandoah, and spread through every Southern state east of the Mississippi. That branch of the family which settled in the Deep South was predominantly of Irish ancestry…

“The early crackers were the Okies of their day (as they have been ever since). Cheated of land, not by wind and erosion, but by the plantation and slavery system of the Old South, they were nonessentials in an economic, political and social order dominated by the squirearchy of wealthy planters, and in most respects were worse off than the Negro slaves. “

This contradicts the history told in our prior ethnography of Appalachia, which claims pointedly that the denizens of the Cumberland are not descended from the “poor whites” of the Deep South, but from Pennsylvanians. I offer, however, a synthesis: the whites who settled the Pennsylvania frontier, followed Daniel Boone into the Cumberland, and found it pleasant enough to remain in the mountains, and the whites who adopted an only semi-agricultural lifestyle in the backwoods and swamps of Florida, both hailed from the same original British stock and simply took different routes to get where they were going.

Powell (1969), a white turpentine camp overseer of the late nineteenth century, called the crackers of Apalachee County “wild woodsmen” (p. 30) and mentioned a man who “had lived the usual life of a shiftless Cracker, hunting and fishing, and hard work did not agree with him.” …

[Powell writes:]

“When I speak of villages throughout this county, I use the word for lack of a better term, for in nine cases out of ten, they were the smallest imaginable focus of the scattering settlement, and usually one general store embraced the sum total of business enterprise. There the natives came at intervals to trade for coffee, tobacco, and the few other necessities that the woods and waters did not provide them with. Alligators’ hides and teeth, bird plumes and various kinds of pelts were the medium of barter. They were a curious people, and there are plenty of them there yet, born and bred to the forest and as ignorant of the affairs of every-day life outside of their domain, as are the bears and deer upon which they mainly subsist. A man who would venture to tell them that the earth moved instead of the sun, or that there was a device by which a message could be flashed for leagues across a wire, would run the risk of being lynched, as too dangerous a liar to be at large.”

There is a section on the importance of guns and hunting to the locals, even the children, which will be familiar to anyone with any experience of the rural South. I know from family tales that my grandfather began to hunt when he was 8 years old; he used to sell the pelts of skunks he’d killed to furriers, who de-stinked them, dyed them black, and marketed them as “American sable” over in Europe.

Truth in Advertising laws decimated the “American sable” trade.

The true crackers, Powell’s “wild woodsmen,” were never numerous, and they rarely participated in the social life of the wider Apalachee county-community. Crackers were born, lived, and died in the woods. They buried their own in family plots far from the nearest church. … Cracker families settled the Apalachee area without recourse to legal formalities. Thus, when the yeomen farmers … eventually purchased legal titles to land, true crackers were forced out and deeper into Florida.

This is a common problem (especially for anyone whose ancestors arrived in an area before it was officially part of the US.) Where land is abundant, population density is low, and there aren’t any authorities who can enforce land ownership, anyway, people will be happy to farm where they want, hunt where they want, and defend their claims themselves. This tends to lead to a low-intensity lifestyle:

Cracker subsistence strategy depended on scratch, perhaps slash-and-burn, summer agriculture and year-round food collecting activities: hunting, fishing, and foraging. Because their farming operations were so small, limited to the part-time efforts of an individual family, they had no need of financial credit.

Indeed, their fiercely independent, egalitarian ethos prohibited them from interacting significantly in the rural neighborhoods of the community. …

Few true crackers remain in Apalachee County … A few families still live on the borders of the county. There they exploit the food resources of the rivers and swamps and perhaps scratch-farm a few acres. …

Florida Cracker cow and calf source

This is not (just) laziness; areas with poor soils or little water simply can’t be intensively farmed, and if the forage is bad, herd animals will be better off if they can wander widely in search of food than if they are confined in one particular place.

Incidentally, there is a landrace of cattle known as the Florida Cracker, descended from the hardy Spanish cattle brought to Florida in the 1600s. Unfortunately, the breed has been on the decline and is now listed as “critical” due to laws passed in 1949 against free-ranging livestock and the introduction of larger breeds more suited to confinement.

Not only does the law fence off the cracker’s land, destroy his livelihood, and drive him out, it also kills the cracker cow by fencing off its land.

The author notes that “cracker” is a slur and that it has been expanded in the past half-century to cover all poor whites, with an interesting footnote:

One speculates that the driving force behind withholding respectability from the true crackers and the extension of the consequently disparaging term to include countrymen of the small farmer class originated with the townspeople. This idea parallels the hypothesis that townsmen perpetuated and revitalized the issue of racial politics in the twentieth century.

On change:

The technological changes of the twentieth century have enabled social institutions to penetrate the isolation of the crackers and enforce town mores. Cracker homicides are no longer unreported and uninvestigated or allowed to result in clannish feuding… No longer may the children escape the public school regimen. No longer may they escape taxation…

[yet] the cracker and his world view persist. While only a handful of true crackers endure in the county… modern-day imitators erect trailers in remote corners, moving to north-central Florida … to escape the “rat race.”

I think that’s enough for today; I hope you’ve enjoyed the book and urge you to take a look at the whole thing. We’ll discuss the more recent Calvinist-Yeoman farmers next week.


Invasive Memes


Smallpox virus

Do people eventually grow ideologically resistant to dangerous local memes, but remain susceptible to foreign memes, allowing them to spread like invasive species?

And if so, can we find some way to memetically vaccinate ourselves against deadly ideas?

***

Memetics is the study of how ideas (“memes”) spread and evolve, using evolutionary theory and epidemiology as models. A “viral meme” is one that spreads swiftly through society, “infecting” minds as it goes.
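The epidemiological framing can be made concrete with a toy SIR-style simulation, treating adoption of a meme as “infection” and disillusionment as “recovery.” This is purely an illustrative sketch: the transmission and recovery rates below are invented, not measured from any real data.

```python
# Toy SIR-style model of meme spread:
#   S = fraction never exposed, I = active believers ("infected"),
#   R = disillusioned ("recovered").
# All parameter values are invented for illustration only.

def simulate_meme(beta=0.3, gamma=0.1, i0=0.01, steps=200):
    """Discrete-time SIR dynamics on population fractions."""
    s, i, r = 1.0 - i0, i0, 0.0
    history = []
    for _ in range(steps):
        new_adoptions = beta * s * i    # contact between believers and the unexposed
        new_dropouts = gamma * i        # believers becoming disillusioned
        s -= new_adoptions
        i += new_adoptions - new_dropouts
        r += new_dropouts
        history.append((s, i, r))
    return history

history = simulate_meme()
peak_believers = max(i for _, i, _ in history)
final_unexposed = history[-1][0]
```

With these made-up rates the “basic reproduction number” beta/gamma is 3, so the meme burns through most of the population before dying back, leaving only a small never-exposed remnant; raising gamma (people wising up faster) shrinks the outbreak.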

Of course, most memes are fairly innocent (e.g. fashion trends) or even beneficial (“wash your hands before eating to prevent disease transmission”), but some ideas, like communism, kill people.

Ideologies consist of a big set of related ideas rather than a single one, so let’s call them memeplexes.

Almost all ideological memeplexes (and religions) sound great on paper–they have to, because that’s how they spread–but they are much more variable in actual practice.

Any idea that causes its believers to suffer is unlikely to persist–at the very least, because its believers die off.

Over time, in places where people have been exposed to ideological memeplexes, their worst aspects become known and people may learn to avoid them; the memeplexes themselves can evolve to be less harmful.

Over in epidemiology, diseases humans have been exposed to for a long time become less virulent as humans become adapted to them. Chickenpox, for example, is a fairly mild disease that kills few people because the virus has been infecting people for as long as people have been around (the ancestral Varicella-Zoster virus evolved approximately 65 million years ago and has been infecting animals ever since). Rather than kill you, chickenpox prefers to enter your nerves and go dormant for decades, reemerging later as shingles, ready to infect new people.

By contrast, smallpox (Variola major and Variola minor) probably evolved from a rodent-infecting virus about 16,000 to 68,000 years ago. That’s a big range, but either way, it’s much more recent than chickenpox. Smallpox made its first major impact on the historical record around the third century BC in Egypt, and thereafter became a recurring plague in Africa and Eurasia. Note that unlike chickenpox, which is old enough to have spread throughout the world with humanity, smallpox emerged long after major population splits occurred–like part of the Asian clade splitting off and heading into the Americas.

By 1400, Europeans had developed some immunity to smallpox (due to those who didn’t have any immunity dying), but when Columbus landed in the New World, folks here had never seen the disease before–and thus had no immunity. Diseases like smallpox and measles ripped through native communities, killing approximately 90% of the New World population.

If we extend this metaphor back to ideas–if people have been exposed to an ideology for a long time, they are more likely to have developed immunity to it or the ideology to have adapted to be relatively less harmful than it initially was. For example, the Protestant Reformation and subsequent Catholic counter-reformation triggered a series of European wars that killed 10 million people, but today Catholics and Protestants manage to live in the same countries without killing each other. New religions are much more likely to lead all of their followers in a mass suicide than old, established religions; countries that have just undergone a political revolution are much more likely to kill off large numbers of their citizens than ones that haven’t.

This is not to say that old ideas are perfect and never harmful–chickenpox still kills people and is not a fun disease–but that any bad aspects are likely to become more mild over time as people wise up to bad ideas, (certain caveats applying).

But this process only works for ideas that have been around for a long time. What about new ideas?

You can’t stop new ideas. Technology is always changing. The world is changing, and it requires new ideas to operate. When these new ideas arrive, even terrible ones can spread like wildfire because people have no memetic antibodies to resist them. New memes, in short, are like invasive memetic species.

In the late 1960s, 15 million people still caught smallpox every year. In 1980, it was declared officially eradicated–not one case had been seen since 1977, due to a massive, world-wide vaccination campaign.

Humans can acquire immunity to disease in two main ways. The slow way is everyone who isn’t immune dying; everyone left alive happens to have adaptations that let them not die, which they can pass on to their children. As with chickenpox, over generations, the disease becomes less severe because humans become successively more adapted to it.
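The “slow way” is ordinary natural selection, and a back-of-the-envelope model (with invented numbers) shows how quickly a heritable resistance trait can take over when the disease kills the non-resistant:

```python
# Toy selection model: a fraction p of the population carries a heritable
# resistance trait. Each generation, the disease kills a fraction s of
# non-carriers before they reproduce. The numbers are invented for illustration.

def resistant_fraction(p0=0.01, s=0.3, generations=50):
    """Track the carrier fraction p across generations under mortality selection."""
    p = p0
    trajectory = [p]
    for _ in range(generations):
        surviving_carriers = p                # carriers all survive
        surviving_others = (1 - p) * (1 - s)  # non-carriers survive with prob 1 - s
        p = surviving_carriers / (surviving_carriers + surviving_others)
        trajectory.append(p)
    return trajectory

traj = resistant_fraction()
```

With a 30% mortality penalty for non-carriers, the carriers’ odds multiply by 1/(1 − 0.3) ≈ 1.43 every generation, so even a trait starting at 1% of the population dominates it within a few dozen generations–fast in evolutionary terms, but agonizingly slow in human ones.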

The fast way is to catch a disease, produce antibodies that recognize and can fight it off, and thereafter enjoy immunity. This, of course, assumes that you survive the disease.

Vaccination works by teaching the body’s immune system to recognize a disease without exposing it to a full-strength germ, using a weakened or harmless version of the germ instead. Early on, weakened germs taken from actual smallpox scabs or lesions were used to inoculate people, a risky method, since the germs often weren’t that weak. Later, people discovered that cowpox was similar enough to smallpox that its antibodies could also fight smallpox, but cowpox itself was too adapted to cattle hosts to seriously harm humans. (Today I believe the vaccine uses a different weakened virus, but the principle is the same.)

The good part about memes is that you do not actually have to inject a physical substance into your body in order to learn about them.

Ideologies are very difficult to evaluate in the abstract, because, as mentioned, they are all optimized to sound good on paper. It’s their actual effects we are interested in.

So if we want to learn whether an idea is good or not, it’s probably best not to learn about it by merely reading books written by its advocates. Talk to people in places where the ideas have already been tried and learn from their experiences. If those people tell you this ideology causes mass suffering and they hate it, drop it like a hot potato. If those people are practicing an “impure” version of the ideology, it’s probably an improvement over the original.

For example, “communism” as practiced in China today is quite different from “communism” as practiced there 50 years ago–so much so that the modern system really isn’t communism at all. There was never, to my knowledge, an official changeover from one system to another, just a gradual accretion of improvements. This speaks strongly against communism as an ideology, since no country has managed to be successful by moving toward ideological communist purity, only by moving away from it–though they may still find it useful to retain some of communism’s original ideas.

I think there is a similar dynamic occurring in many Islamic countries. Islam is a relatively old religion that has had time to adapt to local conditions in many different parts of the world. For example, in Morocco, where the climate is more favorable to raising pigs than in other parts of the Islamic world, the taboo against pigs isn’t as strongly observed. The burka is not an Islamic universal, but characteristic of central Asia (the similar niqab is from Yemen). Islamic head coverings vary by culture–such as the kurhars, traditionally worn by unmarried women in Ingushetia, north of the Caucasus, or this cap, popular in Xinjiang. Turkey has laws officially restricting burkas in some areas, and Syria discourages even hijabs. Women in Iran did not go heavily veiled prior to the Iranian Revolution. So the insistence on extensive veiling in many Islamic communities (like the territory conquered by ISIS) is not a continuation of old traditions, but the imposition of a new, idealized, version of Islam.

Purity is counter to practicality.

Of course, this approach is hampered by the fact that what works in one place, time, and community may not work in a different one. Tilling your fields one way works in Europe, and tilling them a different way works in Papua New Guinea. But extrapolating from what works is at least a good start.


Neanderthal Skull for 3D Printing

Meet Nandy the Neanderthal. You can download him at Thingiverse.

This is my first creation, Nandy the Neanderthal, based on the La Chapelle-aux-Saints 1 skull and this side view. Note that he is based on two different skulls, but still very much a Neanderthal.

Since this is my very first creation and I don’t have a 3D printer yet, (I expect to receive one soon and am planning ahead,) I am still learning all of the ins and outs of this technology and so would appreciate any technical feedback.

Neanderthals evolved around 600,000-800,000 years ago and spread into the Middle East, Europe, and central Asia. They made stone tools, controlled fire, and hunted. They survived in a cold and difficult climate, but likely could make no more than the simplest of clothes. As a result, they may have been, unlike modern humans, hairy.

Cochran and Harpending of West Hunter write in The 10,000 Year Explosion: 

 Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mothers’ fur as infants. Modern humans don’t have these ridges, but Neanderthals do.

Hoffecker, in The Spread of Modern Humans in Europe writes:

Neanderthal sites show no evidence of tools for making tailored clothing. There are only hide scrapers, which might have been used to make blankets or ponchos. This is in contrast to Upper Paleolithic (modern human) sites, which have an abundance of eyed bone needles and bone awls.

Their skulls were, on average, larger than ours, with wide noses, round eyes, and an elongated braincase. Their facial features were robust–that is, strong, thick, and heavy.

The La Chapelle-aux-Saints 1 Neanderthal lived to about 40 years old. He had lost most of his teeth years before his death, (I gave Nandy some teeth, though,) suffered arthritis, and must have been cared for in his old age by the rest of his tribe. At his death he was most likely buried in a pit dug by his family, which preserved his skeleton in nearly complete condition for 60,000 years.

Anatomically modern humans, Homo sapiens, encountered and interbred with Neanderthals around 40,000 years ago. (Neanderthals are also humans–Homo neanderthalensis.) Today, about 1-5% of the DNA in non-Sub-Saharan Africans hails originally from a Neanderthal ancestor. (Melanesians also have DNA from a cousin of the Neanderthals, the Denisovans, and Sub-Saharan Africans may have their own archaic ancestors.)

Unfortunately for Nandy and his relations, the Neanderthals also began to disappear around 40,000 years ago. Perhaps it was the weather, or Homo sapiens outcompeted them, or their enormous skulls just caused too much trouble in childbirth. Whatever happened, the Neanderthals remain a mystery, evidence of the past when we weren’t the only human species in town.

The Endless Ratiocination of the Dysphoric Mind


My endless inquiries made it impossible for me to achieve anything. Moreover, I get to think about my own thoughts of the situation in which I find myself. I even think that I think of it, and divide myself into an infinite retrogressive sequence of ‘I’s who consider each other. I do not know at which ‘I’ to stop as the actual, and as soon as I stop, there is indeed again an ‘I’ which stops at it. I become confused and feel giddy as if I were looking down into a bottomless abyss, and my ponderings result finally in a terrible headache. –Møller, Adventures of a Danish Student

Møller’s Adventures of a Danish Student was one of Niels Bohr’s favorite books; it reflected his own difficulties with cycles of ratiocination, in which the mind protects itself against conclusions by watching itself think.

I have noticed a tendency on the left, especially among the academic-minded, to split the individual into sets of mental twins–one who is and one who feels that it is; one who does and one who observes the doing.

Take the categories of “biological sex” and “gender.” Sex is defined as the biological condition of “producing small gametes” (male) or “producing large gametes” (female) for the purpose of sexual reproduction. Thus we can talk about male and female strawberry plants, male and female molluscs, male and female chickens, male and female Homo sapiens.

(Indeed, the male-female binary is remarkably common across sexually reproducing plants and animals–it appears that the mathematics of a third sex simply don’t work out, unless you’re a mushroom. How exactly sex is created varies by species, which makes the stability of the sex-binary all the more remarkable.)

And for the first 299,945 years or so of our existence, most people were pretty happy dividing humanity into “men,” “women,” and the occasional “we’re not sure.” People didn’t understand why or how biology works, but it was a functional enough division for people.

In 1955, John Money decided we needed a new term, “gender,” to describe, as Wikipedia puts it, “the range of characteristics pertaining to, and differentiating between, masculinity and femininity.” Masculinity is further defined as “a set of attributes, behaviors, and roles associated with boys and men;” we can define “femininity” similarly.

So if we put these together, we get a circular definition: gender is a range of characteristics of the attributes of males and females. Note that attributes are already characteristics. They cannot further have characteristics that are not already inherent in themselves.

But really, people invoke “gender” to speak of a sense of self, a self that reflexively looks at itself and perceives itself as possessing traits of maleness or femaleness; the thinker who must think of himself as “male” before he can act as a male. After all, you cannot walk without desiring first to move in a direction; how can you think without first knowing what it is you want to think? It is a cognitive splitting of the behavior of the whole person into two separate, distinct entities–an acting body, possessed of biological sex, and a perceiving mind, that merely perceives and “displays” gender.

But the self that looks at itself looking at itself is not real–it cannot be, for there is only one self. You can look at yourself in the mirror, but you cannot stand outside of yourself and be simultaneously yourself; there is only one you. The alternative, a fractured consciousness, is a symptom of mental disorder and treated with chlorpromazine.

Robert Oppenheimer was once diagnosed with schizophrenia–dementia praecox, as they called it then. Whether he had it or simply confused the therapist by talking about wave/particle dualities is another matter.

Then there are the myriad variants of the claim that men and women “perform femininity” or “display masculinity” or “do gender.” They do not claim that people are feminine or act masculine–such conventional phrasing assumes the existence of a unitary self that is, perceives, and acts. Rather, they posit an inner self that possesses no inherent male or female traits, for whom masculinity and femininity are only created via the interaction of their body and external expectations. In this view, women do not buy clothes because they have some inherent desire to go shopping and buy pretty things, but because society has compelled them to do so in order to comply with an external notion of “what it means to be female.” The self who produces large gametes is not the self who shops.

The biological view of human behavior states that most humans engage in a variety of behaviors because similar behaviors contributed to the evolutionary success of our ancestors. We eat because ancestors who didn’t think eating was important died. We jump back when we see something that looks like a spider because ancestors who didn’t got bitten and died. We love cute things with big eyes because they look like babies because we are descended mostly from people who loved their babies.

Sometimes we do things that we don’t enjoy but rationalize will benefit us, like work for an overbearing boss or wear a burka, but most “masculine” and “feminine” behaviors fall into the category of things people do voluntarily, like “compete at sports” or “gossip with friends.” The fact that more men than women play baseball and more women than men enjoy gossiping with friends has nothing to do with an internal self attempting to perform gender roles and everything to do with the challenges ancestral humans faced in reproducing.

But whence this tendency toward ratiocination? I can criticize it as a physical mistake, but does it reflect an underlying psychological reality? Do some people really perceive themselves as a self separate from themselves, a meta-self watching the first self acting in particular manners?

Here is a study that found that folks with more cognitive flexibility tended to be more socially liberal, though economic conservatism/liberalism didn’t particularly correlate with cognitive flexibility.

I find that if I work hard, I may achieve a state of zen, an inner tranquility in which the endless narrative of thoughts coalesces for a moment and I can just be. Zen is flying down a straight road at 80 miles an hour on a motorcycle; zen is working on a math problem that consumes all of your attention; zen is dancing until you only feel the music. The opposite of zen is lying in bed at 3 AM, staring at the ceiling, thinking of all of your failures, unable to switch off your brain and fall asleep.

Dysphoria is a state of unease. Some people have gender dysphoria; a few report temporal dysphoria. It might be better defined as disconnection, a feeling of being eternally out of place. I feel a certain dysphoria every time I surface from reading some text of anthropology, walk outside, and see cars. What are these metal things? What are these straight, right-angled streets? Everything about modern society strikes me as so artificial and counter to nature that I find it deeply unsettling.

It is curious that dysphoria itself is not discussed more in the psychiatric literature. Certainly a specific form or two receives a great deal of attention, but not the general sense itself.

When things are in place, you feel tranquil and at ease; when things are out of place you are agitated, always aware of the sense of crawling out of your own skin. People will try any number of things to turn off the dysphoria; a schizophrenic friend reports that enough alcohol will make the voices stop, at least for a while. Drink until your brain shuts up.

But this is only when things are out of place. Healthy people seek a balance between division and unity. Division of the self is necessary for self-criticism and improvement; people can say, then, “I did a bad thing, but I am not a bad person, so I will change my behavior and be better.” Metacognition allows people to reflect on their behavior without feeling that their self is fundamentally at threat, but too much metacognition leads to fragmentation and an inability to act.

People ultimately seek a balanced, unified sense of self.

It is said that not everyone has an inner voice, a meta-self commenting on the acting self, and some have more than one:

My previous blogs have observed that some people –women with bulimia nervosa, for example– have frequent multiple simultaneous experiences, but that multiple experience is not frequent in the general population. …

Consider inner speech. Subjects experienced themselves as innerly talking to themselves in 26% of all samples, but there were large individual differences: some subjects never experienced inner speech; other subjects experienced inner speech in as many as 75% of their samples. The median percentage across subjects was 20%.

It’s hard to tell what people really experience, but certainly there is a great deal of variety in people’s internal experiences. Much of thought is not easily describable. Some people hear many voices. Some cannot form mental images:

I think the best way I can describe my aphantasia is to say that I am unaware of anything in my mind except these categories: i) direct sensory input, ii) unheard words that carry thoughts, iii) unheard music, iv) a kind of invisible imagery, which I can best describe as a sensation of pictures that are in a sense too faint to see, v) emotions, and vi) thoughts which seem too fast to exist as words. … I see what is around me, unless my eyes are closed when all is always black. I hear, taste, smell and so forth, but I don’t have the experience people describe of hearing a tune or a voice in their heads. Curiously, I do frequently have a tune going around in my head; all I am lacking is the direct experience of hearing it.

The quoted author is, despite his lack of internal imagery, quite intelligent, with a PhD in physics.

Some cannot hear themselves think at all.

I would like to know if there is any correlation between metacognition, ratiocination, and political orientations–I have so far found a little on the subject:

We find a relationship between thinking style and political orientation and that these effects are particularly concentrated on social attitudes. We also find it harder to manipulate intuitive and reflective thinking than a number of prominent studies suggest. Priming manipulations used to induce reflection and intuition in published articles repeatedly fail in our studies. We conclude that conservatives—more specifically, social conservatives—tend to be dispositionally less reflective, social liberals tend to be dispositionally more reflective, and that the relationship between reflection and intuition and political attitudes may be more resistant to easy manipulation than existing research would suggest.

And a bit more:

… Berzonsky and Sullivan (1992) cite evidence that individuals higher in reported self-reflection also exhibit more openness to experience, more liberal values, and more general tolerance for exploration. As noted earlier, conservatives tend to be less open to experience, more intolerant of ambiguity, and generally more reliant on self-certainty than liberals. That, coupled with the evidence reported by Berzonsky and Sullivan, strongly suggests conservatives engage in less introspective behaviors.

Following an interesting experiment looking at people’s online dating profiles, the authors conclude:

Results from our data support the hypothesis that individuals identifying themselves as “Ultra Conservative” exhibit less introspection in a written passage with personal content than individuals identifying themselves as “Very Liberal”. Individuals who reported a conservative political orientation often provided more descriptive and explanatory statements in their profile’s “About me and who I’m looking for” section (e.g., “I am 62 years old and live part time in Montana” and “I enjoy hiking, fine restaurants”). In contrast, individuals who reported a liberal political orientation often provided more insightful and introspective statements in their narratives (e.g., “No regrets, that’s what I believe in” and “My philosophy in life is to make complicated things simple”).

The ratiocination of the scientist’s mind can ultimately be stopped by delving into that most blessed of substances, reality (or as close to it as we can get). There is, at base, a fundamentally real thing to delve into, a thing which makes ambiguities disappear. Even a moral dilemma can be resolved with good enough data. We do not need to wander endlessly within our own thoughts; the world is here.

End

 

Denny: the Neanderthal-Denisovan Hybrid

Carte_Neandertaliens
Neanderthal Sites (source: Wikipedia)

Homo sapiens–that is, modern humans like us–is about 200,000-300,000 years old. Our ancestor, Homo heidelbergensis, lived in Africa around 700,000-300,000 years ago.

Around 700,000 years ago, another group of humans split off from the main group. By 400,000 years ago, their descendants, Homo neanderthalensis–Neanderthals–had arrived in Europe, and another band of their descendants, the enigmatic Denisovans, arrived in Asia.

While we have found quite a few Neanderthal remains and archaeological sites with tools, hearths, and other artifacts, we’ve uncovered very few Denisovan remains–a couple of teeth, a finger bone, and part of an arm in Denisova Cave, Russia. (Perhaps a few other remains I am unaware of.)

Yet from these paltry remains scientists have extracted enough DNA to ascertain not only that Denisovans were a distinct species, but also that Melanesians, Papuans, and Aborigines derive about 3-6% of their DNA from Denisovan ancestors. (All non-African populations also have a small amount of Neanderthal DNA, derived from Neanderthal ancestors.)

If Neanderthals and Homo sapiens interbred, and Denisovans and Homo sapiens interbred, did Neanderthals and Denisovans ever mate?

nature-siberian-neanderthals-17.02.16-v2
The slightly more complicated family tree, not including Denny

Yes.

The girl, affectionately nicknamed Denny, lived and died about 90,000 years ago in Siberia. The remains of an arm, found in Denisova Cave, reveal that her mother was a Neanderthal, her father a Denisovan.

We don’t yet know what Denisovans looked like, because we don’t have any complete skeletons of them, much less good skulls to examine, so we don’t know what a Neanderthal-Denisovan hybrid like Denny looked like.

But the fact that we can extract so much information from a single bone (or fragment of bone) preserved in a Siberian cave for 90,000 years is amazing.

We are still far from truly understanding what sorts of people our evolutionary cousins were, but we are gaining new insights all the time.

Book Club: How to Create a Mind, pt 2/2

Ray Kurzweil, writer, inventor, thinker

Welcome back to EvX’s Book Club. Today we are finishing Ray Kurzweil’s How to Create a Mind: The Secret of Human Thought Revealed.

Spiders are interesting, but Kurzweil’s focus is computers, like Watson, which trounced the competition on Jeopardy!

I’ll let Wikipedia summarize Watson:

Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[2]

The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies. …

Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases.[22] Watson’s main innovation was not in the creation of a new algorithm for this operation but rather its ability to quickly execute hundreds of proven language analysis algorithms simultaneously.[22][24] The more algorithms that find the same answer independently the more likely Watson is to be correct.[22] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense or not.[22]
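The consensus mechanism described in that last quoted paragraph can be sketched in a few lines of Python. This is only a toy illustration of the idea that agreement among independent algorithms raises confidence; the function name and the confidence measure below are my own inventions, not IBM’s actual scoring model:

```python
from collections import Counter

def rank_candidates(algorithm_answers):
    """Rank candidate answers by how many independent algorithms
    produced them; confidence is simply the fraction agreeing."""
    counts = Counter(algorithm_answers)
    total = len(algorithm_answers)
    return sorted(((answer, n / total) for answer, n in counts.items()),
                  key=lambda pair: pair[1], reverse=True)

# Five hypothetical algorithms analyze the same clue:
answers = ["Toronto", "Chicago", "Toronto", "Toronto", "Chicago"]
print(rank_candidates(answers))  # [('Toronto', 0.6), ('Chicago', 0.4)]
```

Watson’s real pipeline weighted its evidence far more elaborately, but the underlying principle–more independent agreement means more confidence–survives even in this sketch.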

Kurzweil opines:

That is at least one reason why Watson represents such a significant milestone: Jeopardy! is precisely such a challenging language task. … What is perhaps not evident to many observers is that Watson not only had to master the language in the unexpected and convoluted queries, but for the most part its knowledge was not hand-coded. It obtained that knowledge by actually reading 200 million pages of natural-language documents, including all of Wikipedia… If Watson can understand and respond to questions based on 200 million pages–in three seconds!–there is nothing to stop similar systems from reading the other billions of documents on the Web. Indeed, that effort is now under way.

A point about the history of computing that may be petty of me to emphasize:

Babbage’s conception is quite miraculous when you consider the era in which he lived and worked. However, by the mid-twentieth century, his ideas had been lost in the mists of time (although they were subsequently rediscovered.) It was von Neumann who conceptualized and articulated the key principles of the computer as we know it today, and the world recognizes this by continuing to refer to the von Neumann machine as the principal model of computation. Keep in mind, though, that the von Neumann machine continually communicates data between its various units and within those units, so it could not be built without Shannon’s theorems and the methods he devised for transmitting and storing reliable digital information. …

You know what? No, it’s not petty.

Amazon lists 57 books about Ada Lovelace aimed at children, 14 about Alan Turing, and ZERO about John von Neumann.

(Some of these search results are doubtless irrelevant, but the counts are roughly correct.)

“EvX,” you may be saying, “Why are you counting children’s books?”

Because children are our future, and the books that get published for children show what society deems important for children to learn–and will have an effect on what adults eventually know.

I don’t want to demean Ada Lovelace’s role in the development of software, but surely von Neumann’s contributions to the field are worth a single book!

*Slides soapbox back under the table*

Anyway, back to Kurzweil, now discussing quantum mechanics:

There are two ways to view the questions we have been considering–the contrasting Western and Eastern perspectives on the nature of consciousness and of reality. In the Western perspective, we start with a physical world that evolves patterns of information. After a few billion years of evolution, the entities in that world have evolved sufficiently to become conscious beings. In the Eastern view, consciousness is the fundamental reality; the physical world only comes into existence through the thoughts of conscious beings. …

The East-West divide on the issue of consciousness has also found expression in opposing schools of thought in the field of subatomic physics. In quantum mechanics, particles exist in what are called probability fields. Any measurement carried out on them by a measuring device causes what is called a collapse of the wave function, meaning that the particle suddenly assumes a particular location. A popular view is that such a measurement constitutes observation by a conscious observer… Thus the particle assumes a particular location … only when it is observed. Basically particles figure that if no one is bothering to look at them, they don’t need to decide where they are. I call this the Buddhist school of quantum mechanics …

Niels Bohr

Or as Niels Bohr put it, “A physicist is just an atom’s way of looking at itself.” He also claimed that we could describe electrons as exercising free will in choosing their positions, a statement I do not think he meant literally; “We must be clear that when it comes to atoms, language can be used only as in poetry,” as he put it.

Kurzweil explains the Western interpretation of quantum mechanics:

There is another interpretation of quantum mechanics… In this analysis, the field representing a particle is not a probability field, but rather just a function that has different values in different locations. The field, therefore, is fundamentally what the particle is. … The so-called collapse of the wave function, this view holds, is not a collapse at all. … It is just that a measurement device is also made up of particles with fields, and the interaction of the particle field being measured and the particle fields of the measuring device result in a reading of the particle being in a particular location. The field, however, is still present. This is the Western interpretation of quantum mechanics, although it is interesting to note that the more popular view among physicists worldwide is what I have called the Eastern interpretation.

Soviet atomic bomb, 1951

For example, Bohr had the yin-yang symbol on his coat of arms, along with the motto contraria sunt complementa, or “contraries are complementary.” Oppenheimer was such a fan of the Bhagavad Gita that he read it in Sanskrit and quoted it upon successful completion of the Trinity Test, “If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one,” and “Now I am become death, the destroyer of worlds.” He credited the Gita as one of the most important books in his life.

Why the appeal of Eastern philosophy? Is it something about physicists and mathematicians? Leibniz, after all, was fond of the I Ching. As Wikipedia says:

Leibniz was perhaps the first major European intellectual to take a close interest in Chinese civilization, which he knew by corresponding with, and reading other works by, European Christian missionaries posted in China. Having read Confucius Sinarum Philosophus on the first year of its publication,[153] he concluded that Europeans could learn much from the Confucian ethical tradition. He mulled over the possibility that the Chinese characters were an unwitting form of his universal characteristic. He noted with fascination how the I Ching hexagrams correspond to the binary numbers from 000000 to 111111, and concluded that this mapping was evidence of major Chinese accomplishments in the sort of philosophical mathematics he admired.[154] Leibniz communicated his ideas of the binary system representing Christianity to the Emperor of China hoping it would convert him.[84] Leibniz may be the only major Western philosopher who attempted to accommodate Confucian ideas to prevailing European beliefs.[155]

Leibniz’s attraction to Chinese philosophy originates from his perception that Chinese philosophy was similar to his own.[153] The historian E.R. Hughes suggests that Leibniz’s ideas of “simple substance” and “pre-established harmony” were directly influenced by Confucianism, pointing to the fact that they were conceived during the period that he was reading Confucius Sinarum Philosophus.[153]
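Leibniz’s observation that the 64 hexagrams map onto the binary numbers 000000 through 111111 is simple to demonstrate. A minimal sketch, using the (assumed) convention that a solid yang line is a 1 and the bottom line is the least significant bit:

```python
def hexagram_to_int(lines):
    """Read a hexagram as a 6-bit binary number: yang (solid) = 1,
    yin (broken) = 0, bottom line taken as the least significant bit."""
    value = 0
    for position, line in enumerate(lines):  # lines[0] is the bottom line
        if line == "yang":
            value |= 1 << position
    return value

print(hexagram_to_int(["yang"] * 6))  # hexagram 1 (Qian): 111111 -> 63
print(hexagram_to_int(["yin"] * 6))   # hexagram 2 (Kun):  000000 -> 0
```

The I Ching’s own orderings of the hexagrams (such as the King Wen sequence) do not follow this numeric order; what fascinated Leibniz was that the Shao Yong arrangement he saw corresponded to counting in base two.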

Perhaps it is just that physicists and mathematicians are naturally curious people, and Eastern philosophy is novel to a Westerner, or perhaps by adopting Eastern ideas, they were able to purge their minds of earlier theories of how the universe works, creating a blank space in which to evaluate new data without being biased by old conceptions–or perhaps it is just something about the way their minds work.

As for quantum mechanics, I favor the de Broglie-Bohm interpretation, but obviously I am not a physicist and my opinion doesn’t count for much. What do you think?

But back to the book. If you are fond of philosophical ruminations on the nature of consciousness, like “What if someone who could only see in black and white read extensively about the color “red”–could they ever achieve the qualia of actually seeing the color red?” or “What if a man were locked in a room with a perfect Chinese rulebook that told him which Chinese characters to write in response to any set of characters written on notes passed under the door? The responses are in perfect Chinese, but the man himself understands not a word of Chinese,” then you’ll enjoy the discussion. If you already covered all of this back in Philosophy 101, you might find it a bit redundant.

Kurzweil notes that conditions have improved massively over the past century for almost everyone on earth, but people are increasingly anxious:

A primary reason people believe life is getting worse is because our information about the problems of the world has steadily improved. If there is a battle today somewhere on the planet, we experience it almost as if we were there. During World War II, tens of thousands of people might perish in a battle, and if the public could see it at all it was in a grainy newsreel in a movie theater weeks later. During World War I a small elite could read about the progress of the conflict in the newspaper (without pictures). During the nineteenth century there was almost no access to news in a timely fashion for anyone.

As for the future of man, machines, and code, Kurzweil is even more optimistic than Auerswald:

The last invention that biological evolution needed to make–the neocortex–is inevitably leading to the last invention that humanity needs to make–truly intelligent machines–and the design of one is inspiring the other. … by the end of this century we will be able to create computation at the limits of what is possible, based on the laws of physics… We call matter and energy organized in this way “computronium,” which is vastly more powerful pound per pound than the human brain. It will not just be raw computation but will be infused with intelligent algorithms constituting all of human-machine knowledge. Over time we will convert much of the mass and energy in our tiny corner of the galaxy that is suitable for this purpose to computronium. … we will need to spread out to the rest of the galaxy and universe. …

How long will it take for us to spread our intelligence in its nonbiological form throughout the universe? … waking up the universe, and then intelligently deciding its fate by infusing it with our human intelligence in its nonbiological form, is our destiny.

Whew! That is quite the ending–and with that, we will end as well. I hope you enjoyed the book. What did you think of it? Will Humanity 2.0 be good? Bad? Totally different? Or does the Fermi Paradox imply that Kurzweil is wrong? Did you like this shorter Book Club format? And do you have any ideas for our next Book Club pick?

News

  1. The inestimable hbd chick has been banned from Twitter. No word why. She might get her account back (who knows?). Update: she has been reinstated on Twitter. Her blog is still up. hbd chick has always been a sweet, polite person on Twitter, even to people who are hostile and rude to her, so this banning had nothing to do with misconduct. Someone at Twitter really hates the Hajnal Line.
  2. Since Twitter is an increasingly hostile, unwelcoming place, I have moved to Gab in solidarity, though PMing me on Twitter still works (because communication is useful).
  3. The Ladies of HBD have arranged a group chat on Slack. The Join Code is posted in the comments over on the Female Side. Just to be clear, it’s for females.
  4. Vote for our next Book Club selection:

A. Who We Are and How We Got Here, by David Reich

B. The 10,000 Year Explosion, by Cochran and Harpending

C. The Making of the Atomic Bomb, by Richard Rhodes

D. American Nations, by Colin Woodard

E. Enlightenment Now, by Pinker

F. Something else–leave your suggestion in the comments.

 

 

Modern “Anthropology”

Over the past couple years of Anthropology Friday, I have tried to highlight works that cast a light on the varied and myriad human experiences. Not all of them are great works of literature, but they show what anthropology can be. We’ve read about the Eskimo in Kabloona, Jane Goodall’s research with the Gombe chimps in In the Shadow of Man, records of prisons and criminal gangs, Appalachia and Siberian reindeer herders. We’ve read first-hand accounts like Isaac Bacirongo’s Still a Pygmy and the Slave Narrative Collection.

Anthropology, done right, makes the alien familiar and expands our understanding of the many varieties of human experience.

Done wrong, well… Here’s the abstract from Professor Dwayne Dixon’s Endless Question: Youth Becomings and the Anti-Crisis of Kids in Global Japan:

This layered, latitudinal (trilateral), anthropological project traces how three groups of Japanese young people redefine youth through bodily practices, identities, and economic de/attachments. Young Japanese—skateboarders, creative workers, and returnee schoolchildren—embody various relations to the city, visual media, globalized identities, temporary jobs, and education. The dissertation itself is non-linear; it formally enacts the multidirectional, diverse youthful experiences amidst intense global connections, transitioning identities, and uncertain social and economic futures. Multi-media and electronic text create lines of connection between sites and events in the young people’s experiences and larger histories of gender and labor, city life, and global dreams. Against crisis narratives, Japanese youth are creating improvisational, social connections amidst intense change.

Can we translate this into functional English?

Sentence 1: “This layered, latitudinal (trilateral), anthropological project traces how three groups of Japanese young people redefine youth through bodily practices, identities, and economic de/attachments.”

Translation: This anthropology project follows three groups of Japanese youths, documenting their body piercings, identities, and which brands they eagerly consume or shun.

The word “identity” in this sentence is difficult to translate because it is vague and undefined–sexual identities? gender identities? Japanese identities? Millennial identities?–and more importantly, because most people do not bother to think about their “identities” at all.

Sentence 2: “Young Japanese—skateboarders, creative workers, and returnee schoolchildren—embody various relations to the city, visual media, globalized identities, temporary jobs, and education.”

Translation: Young Japanese skateboarders, artists, and continuing-education students live in the city, watch and make videos, have “globalized identities,” work temporary jobs, and go to school.

“Embody” was a difficult word to translate because it means nothing that makes sense in this context. To embody is to “be an expression of or give a tangible or visible form to” something, eg “Romeo embodies love;” or to “include or contain something as a constituent part,” eg, “Freedom of expression is embodied in the Bill of Rights.” We could use “contain” or “symbolize” as synonyms, but neither “Young Japanese… contain various relations to the city…” nor “Young Japanese… symbolize various relations to the city…” makes sense.

“Identities” makes a second appearance and again contributes very little.

Sentence 3: “The dissertation itself is non-linear; it formally enacts the multidirectional, diverse youthful experiences amidst intense global connections, transitioning identities, and uncertain social and economic futures.”

Translation: This dissertation is non-linear because the subjects’ lives are too complex to express chronologically.

(I don’t think “formally” means what he thinks it means.)

Sentence 4: “Multi-media and electronic text create lines of connection between sites and events in the young people’s experiences and larger histories of gender and labor, city life, and global dreams.”

Translation: Young people use cell phones to text each other about skateboarding events and post videos of themselves skateboarding on the internet.

Sentence 5: “Against crisis narratives, Japanese youth are creating improvisational, social connections amidst intense change.”

Translation: You might have heard that Japanese youth are in crisis, but actually they’re making new friends in the middle of this protracted economic malaise.

Dixon’s original is not only unclear and vague, but parts of it aren’t even grammatical. Strip away the buzzwords, and you’re left with “Japanese youth use cellphones and make friends”–not exactly shocking observations.

Let’s compare with missionary Sidney L. Gulick’s account of Japan, written in 1903, on the Japanese character and effects of modernization:

Many writers have dwelt with delight on the cheerful disposition that seems so common in Japan. …. And, on the whole, these pictures are true to life. The many flower festivals are made occasions for family picnics when all care seems thrown to the wind. There is a simplicity and a freshness and a freedom from worry that is delightful to see. But it is also remarked that a change in this regard is beginning to be observed. The coming in of Western machinery, methods of government, of trade and of education, is introducing customs and cares, ambitions and activities, that militate against the older ways. …

The judgment that all Japanese are cheerful rests on shallow grounds. Because, forsooth, millions on holidays bear that appearance, and because on ordinary occasions the average man and woman seem cheerful and happy, the conclusion is reached that all are so. No effort is made to learn of those whose lives are spent in sadness and isolation. I am convinced that the Japan of old, for all its apparent cheer, had likewise its side of deep tragedy. …

Enough of Japan. Here’s an abstract from Medical Anthropology Quarterly, Modeling Population Health: Reflections on the Performativity of Epidemiological Techniques in the Age of Genomics, by Susanne Bauer:

Risk reasoning has become the common-sense mode of knowledge production in the health sciences. Risk assessment techniques of modern epidemiology also co-shape the ways genomic data are translated into population health. Risk computations (e.g., in preventive medicine, clinical decision-support software, or web-based self-tests), loop results from epidemiological studies back into everyday life. Drawing from observations at various European research sites, I analyze how epidemiological techniques mediate and enact the linkages between genomics and public health. This article examines the epidemiological apparatus as a generative machine that is socially performative. The study design and its reshuffling of data and categories in risk modeling recombine old and new categories from census to genomics and realign genes/environment and nature/culture in novel and hybrid ways. In Euro-American assemblage of risk reasoning and related profiling techniques, the individual and the population are no longer separate but intimately entangled.

Note the preponderance of obfuscatory bullshit phrases: “mode of knowledge production,” “data are translated into public health,” “techniques mediate and enact the linkages between,” “the epidemiological apparatus as a generative machine that is socially performative,” etc.

I will attempt to translate this quickly into English:

People care about health risks. People are interested in whether genetic data can uncover health risks. Medical care and health information on the internet bring health-risk assessment into people’s everyday lives. I observe how European doctors use information about genetic risk factors to help treat their patients. This article examines how doctors interact with their patients. I did a study that mixed up and re-combined categories like “census” and “genomics,” “culture” and “environment” in new ways.* In the West, doctors are now using population-level risk assessments to make decisions about individual patients.

*I am not satisfied with the translation of this sentence, but it didn’t make any sense in the original.

Here is an abstract from the journal of Anthropological Theory, The civility of strangers? Caste, ethnicity, and living together in postwar Jaffna, Sri Lanka:

The question asked by this article is as follows: How do different kinds of people live together in a hierarchical world that has been challenged and transformed through the leveling effects of deep ethnicization and war? … When ethnic mobilization—the possibility of egalitarian mutuality and solidarity as well as the pain, trauma and sacrifice of war, and ethnic cleansing—emerges within deeply hierarchical worlds that continually produce modes of distinction, what kinds of struggles arise within inter-ethnic and intra-caste relations? Given that public life is historically built on unequal participation, and that living together has been a historical struggle, we need to ask how we understand the particular embedded civilities that have made living together such a problem over time. Rather than see civility as an abstract code of prescriptions in relation to the maintenance of non-violent order, I suggest that it is possible to see different modalities of civility produced with regard to specific others/strangers. These modalities can conflict with each other, given that civility can be either hierarchically produced or governed by an egalitarian drive toward public forms of dignity and equality. I propose that civility has a social location, discourses, and understandings in hierarchical worlds that are necessarily different depending on who is speaking.

This could have been an interesting article on life in post-war Sri Lanka, but then it descended into a bunch of post-modernist gobbledygook. I find this style of writing utterly self-centered–there is nothing in this abstract about how actual Sri Lankans relate to each other, and much of this abstract could be cut and pasted onto a study of almost any culture without losing anything. Public life in America, Mali, China, and Japan involves unequal participation. Civilities are part of every culture. And, yes, what is considered polite changes depending on who is in the conversation. Congratulations, you’ve figured out that people talk to their best friends differently than they talk to their bosses.

The problem with anthropology is that somewhere along the way, someone got the idea that they needed to produce Great and Profound Truths rather than just describe people.

[And here is the point where the rest of this post got accidentally deleted because WordPress updated something in their internal software, causing it to no longer communicate with my 11 year old computer.]

Let’s compare this to the Amazon blurb for Philippe Bourgois’s
ethnography of Puerto Rican crack dealers in NYC:

In this compelling study of the crack business in East Harlem,
Philippe Bourgois argues that a cultural struggle for respect has led
some residents of ‘El Barrio’ away from the legal job market, and into
a downward spiral of crime and poverty. During his many years living
in the neighborhood, Bourgois eventually gained the confianza of
enough Barrio residents to present their hopes, plans, and
disappointments in their own words. The result is an engaging and
often disturbing look at the problems of the inner-city, America’s
greatest domestic failing.

Whether you agree with Bourgois or not, at least you can tell what his
thesis is: cultural struggle for respect leads some people away from
legal jobs and into crime and poverty. (In other words, people don’t
want to do legal jobs that are low-status or lead to others treating
them with disrespect.) By contrast, I’m not sure what the author of the article on post-war Sri Lanka is trying to argue.

Obviously these examples do not represent all modern
anthropology–there are plenty of good and interesting writers out
there (like Bourgois.) But the field is absolutely riddled
with narcissistic crap. Where people should use words that vibrantly
describe their subjects, they instead use vague, nebulous words that
sound erudite but give us no real information. “Study of the crack
business in East Harlem,” sounds interesting, “This layered,
latitudinal (trilateral), anthropological project traces how three
groups of Japanese young people redefine youth,” sounds like you once
dropped your ethnography notes and didn’t bother to put them back in
order again, and “this article offers a phenomenological investigation
of the indeterminate structures of ethical experience,” sounds like
you don’t know the first thing about how ordinary humans think.

Many (if not most) modern anthropologists are deeply motivated by
political concerns that have nothing to do with describing varieties
of human cultures (an anthropologist’s job) and everything to do with
the deep culture of academia (the institution that pays them and
publishes their work.) So of course modern anthropology must be
written to support the anthropologist’s own cultural norms, even if
those norms are at complete variance with their ostensible goal.

Few stories reveal this clash better than Napoleon Chagnon’s. In 1964,
Chagnon began his now-famous study of the Yanomamo, producing what is
in my opinion one of the greatest works of anthropology that I have
not yet read. But I’ll let Amazon tell the story, from the blurb on
Chagnon’s recent book, Noble Savages: My Life Among Two Dangerous
Tribes–the Yanomamo and the Anthropologists:

 When Napoleon Chagnon arrived in Venezuela’s Amazon region in 1964
to study the Yanomamö Indians, he expected to find Rousseau’s “noble
savage.” Instead he found a shockingly violent society. He spent years
living among the Yanomamö, observing their often tyrannical headmen,
learning to survive under primitive and dangerous conditions. When he
published his observations, a firestorm of controversy swept through
anthropology departments. Chagnon was vilified by other
anthropologists, condemned by his professional association (which
subsequently rescinded its reprimand), and ultimately forced to give
up his fieldwork. Throughout his ordeal, he never wavered in his defense of science. In 2012 he was elected to the National Academy of
Sciences.

So if you want some modern anthropology, go read Chagnon and let me
know what you think of it.

The World is Written in Beautiful Maths

Eight Suns

This is a time-lapse multiple-exposure photo of an arctic day, apparently titled “Six Suns” (even though there are eight in the picture?). With credit to Circosatabolarc for posting the photo on Twitter, where I saw it. Photo taken by Donald MacMillan of the Crocker Land Expedition, 1913-1917.

Attempting to resolve the name-suns discrepancy, I searched for “Six Suns” and found this photo, also taken by Donald MacMillan, from the Peary-MacMillan Arctic Museum, which actually shows six suns.

I hereby dub this photo “Eight Suns.”

A reverse image search turned up one more similar photo, a postcard titled “Midnight Sun and Moon,” taken at Fort McMurray on the Arctic Coast, sometime before 1943.

As you can see, above the Arctic Circle, the sun’s arc lies so low relative to the horizon that it appears to move horizontally across the sky. If you extended the photograph into a time-lapse movie, taken at the North Pole, you’d see the sun spiral upward from the Spring Equinox until it reaches 23.5 degrees above the horizon–about a quarter of the way to the top–on the Summer Solstice, and then spiral back down until the Fall Equinox, when it slips below the horizon for the rest of the year.
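The geometry here is easy to sketch: at the pole, the sun’s elevation above the horizon is approximately the solar declination, which a standard sinusoidal approximation captures. This is my own illustration, not anything from the expedition:

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination in degrees, using a simple sinusoid
    anchored at the spring equinox (~day 80). Good to within about a degree."""
    return 23.44 * math.sin(2 * math.pi * (day_of_year - 80) / 365)

# At the North Pole, the sun's elevation is roughly the declination itself:
print(round(solar_declination(80), 1))   # spring equinox: about 0 degrees
print(round(solar_declination(172), 1))  # summer solstice: about 23.4 degrees
```

Plotting this function over a year traces exactly the slow spiral up and back down described above.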

In other news, here’s a graph of size vs speed for three different classes of animals–flying, running, and swimming creatures–all of which show the same shape. “A general scaling law reveals why the largest animals are not the fastest” H/T NatureEcoEvo

I love this graph; it is a beautiful demonstration of the mathematics underlying bodily shape and design, not just for one class of animals, but for all of us. It is a rule that applies to all moving creatures, despite the fact that running, flying, and swimming are such different activities.

I assume similar scaling laws apply to mechanical and aggregate systems, as well.
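The paper’s model combines a power law (bigger muscles, faster) with a penalty for the limited time the heaviest animals have to accelerate, which is what produces the hump. Here is a toy version of that curve; the functional form is in the spirit of the paper’s argument, but the parameters (a, b, k, c) are made up purely for illustration:

```python
import math

def max_speed(mass, a=10.0, b=0.25, k=5.0, c=0.5):
    """Toy hump-shaped speed-vs-mass curve: a power law (mass**b) for
    available muscle power, damped by a factor representing the limited
    time the heaviest animals have to accelerate. Parameters are
    illustrative, not the paper's fitted values."""
    return a * mass**b * (1 - math.exp(-k * mass**(-c)))

speeds = [max_speed(m) for m in (0.01, 1.0, 100.0, 10_000.0)]
# Speed rises with mass, peaks at intermediate sizes, then falls:
assert speeds[0] < speeds[1] < speeds[2] and speeds[3] < speeds[2]
```

The same two competing terms apply whether the medium is air, land, or water, which is why one curve shape fits all three classes of animals.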

Book Club: How to Create a Mind by Ray Kurzweil pt 1/2

Welcome to our discussion of Ray Kurzweil’s How to Create a Mind: The Secret of Human Thought Revealed. This book was requested by one of my fine readers; I hope you have enjoyed it.

If you aren’t familiar with Ray Kurzweil (you must be new to the internet), he is a computer scientist, inventor, and futurist whose work focuses primarily on artificial intelligence and phrases like “technological singularity.”

Wikipedia really likes him.

The book is part neuroscience, part explanations of how various AI programs work. Kurzweil uses models of how the brain works to enhance his pattern-recognition programs, and evidence from what works in AI programs to build support for theories on how the brain works.

The book delves into questions like “What is consciousness?” and “Could we recognize a sentient machine if we met one?” along with a brief history of computing and AI research.

My core thesis, which I call the Law of Accelerating Returns (LOAR), is that fundamental measures of information technology follow predictable and exponential trajectories…

I found this an interesting sequel to Auerswald’s The Code Economy and counterpart to Gazzaniga’s Who’s In Charge? Free Will and the Science of the Brain, which I listened to in audiobook form and therefore cannot quote very easily. Nevertheless, it’s a good book and I recommend it if you want more on brains.

The quintessential example of the law of accelerating returns is the perfectly smooth, doubly exponential growth of the price/performance of computation, which has held steady for 110 years through two world wars, the Great Depression, the Cold War, the collapse of the Soviet Union, the reemergence of China, the recent financial crisis, … Some people refer to this phenomenon as “Moore’s law,” but… [this] is just one paradigm among many.

From Ray Kurzweil.
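A quick sketch of what “doubly exponential” means: the growth exponent itself grows, so each decade multiplies price/performance by a larger factor than the last. The parameters here are invented for illustration, not Kurzweil’s fitted values:

```python
def ops_per_dollar(t, r=0.3, s=0.02):
    """Computations per dollar after t years when the exponent itself
    grows over time (doubly exponential growth). r and s are invented
    for illustration, not Kurzweil's fitted values."""
    return 2 ** (r * t + s * t * t)

# Each successive decade multiplies price/performance by a larger factor:
decade1 = ops_per_dollar(10) / ops_per_dollar(0)   # 2**5
decade2 = ops_per_dollar(20) / ops_per_dollar(10)  # 2**9
assert decade2 > decade1
```

Ordinary exponential growth would make those two ratios equal; the quadratic term in the exponent is what makes each decade outpace the one before.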

Auerswald claims that the advance of “code” (that is, technologies like writing that allow us to encode information) has, for the past 40,000 years or so, supplemented and enhanced human abilities, making our lives better. Auerswald is not afraid of increasing mechanization and robotification of the economy putting people out of jobs because he believes that computers and humans are good at fundamentally different things. Computers, in fact, were invented to do things we are bad at, like decode encryption, not stuff we’re good at, like eating.

The advent of computers, in his view, lets us concentrate on the things we’re good at, while off-loading the stuff we’re bad at to the machines.

Kurzweil’s view is different. While he agrees that computers were originally invented to do things we’re bad at, he also thinks that the computers of the future will be very different from those of the past, because they will be designed to think like humans.

A computer that can think like a human can compete with a human–and since it isn’t limited in its processing power by pelvic widths, it may well out-compete us.

But Kurzweil does not seem worried:

Ultimately we will create an artificial neocortex that has the full range and flexibility of its human counterpart. …

When we augment our own neocortex with a synthetic version, we won’t have to worry about how much additional neocortex can physically fit into our bodies and brains, as most of it will be in the cloud, like most of the computing we use today. I estimated earlier that we have on the order of 300 million pattern recognizers in our biological neocortex. That’s as much as could be squeezed into our skulls even with the evolutionary innovation of a large forehead and with the neocortex taking about 80 percent of the available space. As soon as we start thinking in the cloud, there will be no natural limits–we will be able to use billions or trillions of pattern recognizers, basically whatever we need, and whatever the law of accelerating returns can provide at each point in time. …

Last but not least, we will be able to back up the digital portion of our intelligence. …

That is kind of what I already do with this blog. The downside is that sometimes you people see my incomplete or incorrect thoughts.

On the squishy side, Kurzweil writes of the biological brain:

The story of human intelligence starts with a universe that is capable of encoding information. This was the enabling factor that allowed evolution to take place. …

The story of evolution unfolds with increasing levels of abstraction. Atoms–especially carbon atoms, which can create rich information structures by linking in four different directions–formed increasingly complex molecules. …

A billion years later, a complex molecule called DNA evolved, which could precisely encode lengthy strings of information and generate organisms described by these “programs”. …

The mammalian brain has a distinct aptitude not found in any other class of animal. We are capable of hierarchical thinking, of understanding a structure composed of diverse elements arranged in a pattern, representing that arrangement with a symbol, and then using that symbol as an element in a yet more elaborate configuration. …

I really want to know if squids or octopuses can engage in symbolic thought.

Through an unending recursive process we are capable of building ideas that are ever more complex. … Only Homo sapiens have a knowledge base that itself evolves, grows exponentially, and is passed down from one generation to another.

Kurzweil proposes an experiment to demonstrate something of how our brains encode memories: say the alphabet backwards.

If you’re among the few people who’ve memorized it backwards, try singing “Twinkle Twinkle Little Star” backwards.

It’s much more difficult than doing it forwards.

This suggests that our memories are sequential and in order. They can be accessed in the order they are remembered. We are unable to reverse the sequence of a memory.

Funny how that works.
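Kurzweil’s observation maps neatly onto a familiar data structure: if a memorized sequence is stored like a singly linked list, playing it forward is trivial, but there are no backward pointers, so reversing it requires walking the whole chain and building a new one. A toy illustration (my analogy, not Kurzweil’s):

```python
import string

class Link:
    """One element of a forward-only chain, like one step in a
    memorized sequence."""
    def __init__(self, value, nxt=None):
        self.value, self.nxt = value, nxt

# Build the chain "A" -> "B" -> ... -> "Z" by prepending in reverse.
head = None
for letter in reversed(string.ascii_uppercase):
    head = Link(letter, head)

def recite(node):
    """Follow the forward pointers, as when singing the song in order."""
    out = []
    while node:
        out.append(node.value)
        node = node.nxt
    return "".join(out)

print(recite(head))  # ABCDEFGHIJKLMNOPQRSTUVWXYZ
```

There is no cheap way to traverse this structure backwards; people who can say the alphabet in reverse have essentially memorized a second, forward sequence.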

On the neocortex itself:

A critically important observation about the neocortex is the extraordinary uniformity of its fundamental structure. … In 1957 Mountcastle discovered the columnar organization of the neocortex. … [In 1978] he described the remarkably unvarying organization of the neocortex, hypothesizing that it was composed of a single mechanism that was repeated over and over again, and proposing the cortical column as the basic unit. The differences in the height of certain layers in different regions noted above are simply differences in the amount of interconnectivity that the regions are responsible for dealing with. …

Extensive experimentation has revealed that there are in fact repeating units within each column. It is my contention that the basic unit is a pattern recognizer and that this constitutes the fundamental component of the neocortex.

As I read, Kurzweil’s hierarchical models reminded me of Chomsky’s theories of language–Kurzweil and Chomsky are both associated with MIT and have probably conversed many times. Kurzweil does get around to discussing Chomsky’s theories and their relationship to his work:

Language is itself highly hierarchical and evolved to take advantage of the hierarchical nature of the neocortex, which in turn reflects the hierarchical nature of reality. The innate ability of humans to learn the hierarchical structures in language that Noam Chomsky wrote about reflects the structure of the neocortex. In a 2002 paper he co-authored, Chomsky cites the attribute of “recursion” as accounting for the unique language faculty of the human species. Recursion, according to Chomsky, is the ability to put together small parts into a larger chunk, and then use that chunk as a part in yet another structure, and to continue this process iteratively. In this way we are able to build the elaborate structure of sentences and paragraphs from a limited set of words. Although Chomsky was not explicitly referring here to brain structure, the capability he is describing is exactly what the neocortex does. …

This sounds good to me, but I am under the impression that Chomsky’s linguistic theories are now considered outdated. Perhaps that is only his theory of universal grammar, though. Any linguistics experts care to weigh in?
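Recursion in Chomsky’s sense–embedding a chunk inside a larger chunk of the same kind–is easy to demonstrate. This toy example nests one noun phrase inside another, and is purely illustrative:

```python
def noun_phrase(depth):
    """Build a nested noun phrase: each level embeds a smaller phrase
    of the same kind, which is recursion in Chomsky's sense."""
    if depth == 0:
        return "the cat"
    return f"the cat that saw {noun_phrase(depth - 1)}"

print(noun_phrase(2))  # the cat that saw the cat that saw the cat
```

A finite rule applied to itself generates unboundedly many sentences from a limited vocabulary, which is the point both Chomsky and Kurzweil are making.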

According to Wikipedia:

Within the field of linguistics, McGilvray credits Chomsky with inaugurating the “cognitive revolution”.[175] McGilvray also credits him with establishing the field as a formal, natural science,[176] moving it away from the procedural form of structural linguistics that was dominant during the mid-20th century.[177] As such, some have called him “the father of modern linguistics”.[178][179][180][181]

The basis to Chomsky’s linguistic theory is rooted in biolinguistics, holding that the principles underlying the structure of language are biologically determined in the human mind and hence genetically transmitted.[182] He therefore argues that all humans share the same underlying linguistic structure, irrespective of sociocultural differences.[183] In adopting this position, Chomsky rejects the radical behaviorist psychology of B. F. Skinner which views the mind as a tabula rasa (“blank slate”) and thus treats language as learned behavior.[184] Accordingly, he argues that language is a unique evolutionary development of the human species and is unlike modes of communication used by any other animal species.[185][186] Chomsky’s nativist, internalist view of language is consistent with the philosophical school of “rationalism”, and is contrasted with the anti-nativist, externalist view of language, which is consistent with the philosophical school of “empiricism”.[187][174]

Anyway, back to Kurzweil, who has an interesting bit about love:

Science has recently gotten into the act as well, and we are now able to identify the biochemical changes that occur when someone falls in love. Dopamine is released, producing feelings of happiness and delight. Norepinephrine levels soar, which leads to a racing heart and overall feelings of exhilaration. These chemicals, along with phenylethylamine, produce elevation, high energy levels, focused attention, loss of appetite, and a general craving for the object of one’s desire. … serotonin levels go down, similar to what happens in obsessive-compulsive disorder….

If these biochemical phenomena sound similar to those of the fight-or-flight syndrome, they are, except that we are running toward something or someone; indeed, a cynic might say toward rather than away from danger. The changes are also fully consistent with those of the early phase of addictive behavior. … Studies of ecstatic religious experiences also show the same physical phenomena; it can be said that the person having such an experience is falling in love with God or whatever spiritual connection on which they are focused. …

Religious readers care to weigh in?

Consider two related species of voles: the prairie vole and the montane vole. They are pretty much identical, except that the prairie vole has receptors for oxytocin and vasopressin, whereas the montane vole does not. The prairie vole is noted for lifetime monogamous relationships, while the montane vole resorts almost exclusively to one-night stands.

Learning by species:

A mother rat will build a nest for her young even if she has never seen another rat in her lifetime. Similarly, a spider will spin a web, a caterpillar will create her own cocoon, and a beaver will build a dam, even if no contemporary ever showed them how to accomplish these complex tasks. That is not to say that these are not learned behaviors. It is just that the animals did not learn them in a single lifetime… The evolution of animal behavior does constitute a learning process, but it is learning by the species, not by the individual, and the fruits of this learning process are encoded in DNA.

I think that’s enough for today; what did you think? Did you enjoy the book? Is Kurzweil on the right track with his pattern recognizers? Are non-biological neocortexes on the horizon? Will we soon convert the solar system to computronium?

Let’s continue this discussion next Monday–so if you haven’t read the book yet, you still have a whole week to finish.