Here are the numbers I’ve found so far for Neanderthal and Denisovan DNA in different populations:
Sriram Sankararaman et al, in The Combined Landscape of Denisovan and Neanderthal Ancestry in Present-Day Humans, 2016, report:
Native Americans: 1.37%
Central Asia: 1.4%
East Asia: 1.39%
Oceania (Melanesians): 1.54%
South Asia: 1.19%
(I have seen it claimed that the high Neanderthal percentages for Oceanian populations (that is, Melanesians and their relatives) could be a result of Denisovan DNA being incompletely distinguished from Neanderthal DNA.)
Prufer et al, [pdf] 2017, report somewhat higher values:
East Asians: 2.3–2.6%
Lohse and Frantz estimate an even higher rate of 3.4–7.3% for Europeans and East Asians. (They found 5.9% in their Chinese sample and 5.3% in their European sample.)
The Mixe (of Mexico) and the Karitiana (of Brazil) have 0.2% Denisovan DNA (source); other estimates for the amount of Denisovan DNA in Native American populations are much lower, e.g., 0.05%.
I found an older paper by Prufer et al with estimates for three Hispanic populations, but it doesn’t clarify whether these samples have Native American ancestry:
Neandertal ancestry (%)
CEU – Europeans from Utah
CHB – Han Chinese, Beijing
CHS – Han Chinese, South
CLM – Colombians from Medellin: 1.14%
MXL – Mexicans in LA: 1.22%
PUR – Puerto Ricans: 1.05%
LWK – Luhya in Webuye, Kenya
ASW – African Americans, Southwest US
Since the paper is older, all of its estimates are lower than current estimates, because we now have more Neanderthal DNA to compare against. However, you can still see the general trend.
The difference between “autosomes” and “X” highlighted here is that the autosomes are all of the chromosomes except the sex-chromosome pair (XX or XY), while “X” is the X chromosome from that pair. They’re broken up this way because the X chromosome tends to carry very little Neanderthal DNA (and the Y even less), probably because Neanderthal DNA on the sex chromosomes was selected against.
Neanderthal DNA appears to have been selected for in areas that control hair and skin–people who had just left Africa were adapted to the African environment, and Neanderthal hair and skin traits helped them survive in colder, darker winters. We also see a lot of Neanderthal DNA influencing inflammation/immune response–these variants may have helped people fend off new diseases. But we see almost no Neanderthal (or Denisovan) DNA in areas of the genome that code for sperm, eggs, testes, ovaries, etc. These parts of people were probably already finely tuned to work together, didn’t need to change with the environment, and changing anything probably just made them less efficient–so Neanderthal (and Denisovan) DNA on the X and Y chromosomes has been purged from the Homo sapiens gene pool.
Algeria 44.57% = 0.52% N
Tunisia 100.16% = 1.17% N
Tunisia 138.13% = 1.6% N (This is an interesting population that has been highly endogamous and thus better reflects historical populations in the area.)
Egypt 58.45% = 0.68% N
Libya 56.36% = 0.66% N
Morocco North 69.17% = 0.81% N
Morocco South 17.90% = 0.21% N
Saharawi 50.90% = 0.6% N
Canary Islands* 101.44% = 1.19% N
China Beijing 193.43% = 2.26% N
China 195.41% = 2.29% N
Gujarati Indians in Houston, Texas (GIH) 84.37% = 0.99% N
Andalusia* 118.66% = 1.39% N
Tuscan 94.90% = 1.11% N
Basque BASC 129.48% = 1.51% N
Galicia* GAL 115.86% = 1.36% N
Yoruba YRI 0.00% = 0% N
Luhya LWK −14.89% (negative score)
The authors note that they are not sure how the Luhya received a negative score–perhaps the presence of admixed DNA from yet another archaic species is interfering with the results.
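The raw scores in the list above appear to map onto Neanderthal percentages by a fixed linear factor of roughly 1.17% per 100 points of raw score. That factor is my own inference from the table (e.g., Tunisia: 100.16 → ~1.17% N), not a figure stated by the paper; a minimal sketch of the implied conversion:

```python
# Implied conversion from the table's raw admixture scores to approximate
# percent-Neanderthal values. The 1.17-per-100-points factor is inferred
# from the table itself, not quoted from the original paper.
FACTOR = 1.17 / 100

def raw_to_neanderthal_pct(raw_score):
    """Approximate % Neanderthal from a raw score given in percent."""
    return round(raw_score * FACTOR, 2)

print(raw_to_neanderthal_pct(44.57))   # Algeria -> 0.52, matching the table
print(raw_to_neanderthal_pct(193.43))  # China Beijing -> 2.26, matching the table
```

This reproduces the table’s values to two decimal places, which is why the occasional three-decimal entries (1.172, 0.987) read as unrounded leftovers of the same conversion.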
Denisovan DNA is most commonly found in Melanesians, Papuans, Aboriginal Australians, and Aboriginal Filipinos, who all have similar amounts, around 4–6%, indicating that they were probably all one group when their ancestors met the Denisovans. However, the similar-looking but historically quite isolated Onge people have no Denisovan DNA–so they split off before the admixture event.
In Papuans, Neanderthal DNA tends to be expressed in brain tissue, Denisovan in bones and other tissues.
Asians have a small amount of Denisovan DNA; Tibetans carry a Denisovan-derived variant of a particular gene (EPAS1) that lets them cope with low oxygen levels at high altitudes.
The Mende people of Sierra Leone may derive 13% of their DNA from an as-yet unknown hominin species (ancient DNA and bones do not preserve well in parts of Africa, so finding remains and identifying the species may be difficult).
The Yoruba derive 8 or 9% of their DNA from the same hominin.
Masai have a small fraction of Neanderthal–since they are 30% non-African, probably about 0.35% of their genome–but you can read the paper yourself.
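The arithmetic behind that ~0.35% estimate can be made explicit. A back-of-envelope sketch, assuming (my assumption, not the paper’s figure) that the non-African component of the Masai genome carries a typical Eurasian Neanderthal share of about 1.2%:

```python
# Back-of-envelope: if ~30% of the Masai genome is non-African, and that
# component carries the typical Eurasian Neanderthal fraction (assumed
# here to be ~1.2%), the genome-wide share is simply their product.
non_african_fraction = 0.30
eurasian_neanderthal_share = 1.2  # percent; assumed typical Eurasian value

masai_neanderthal_pct = non_african_fraction * eurasian_neanderthal_share
print(round(masai_neanderthal_pct, 2))  # -> 0.36, close to the ~0.35% above
```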
Biaka Pygmies and Bushmen (San): 2% from an unknown archaic.
With more testing, better and more comprehensive numbers are sure to turn up.
I’m sorry, but I no longer think Native Americans (aka American Indians) have higher than usual levels of Neanderthal DNA. Sorry. Their Neanderthal DNA levels are similar to (but slightly lower than) those of other members of the Greater Asian Clade. They also have a small amount of Denisovan DNA–at least some of them.
Why the confusion? Some Neanderthal-derived alleles are indeed more common in Native Americans than in other peoples. For example, a Neanderthal-derived variant of the gene SLC16A11 occurs in 10% of sampled Chinese, 0% of Europeans, and 50% of sampled Native Americans. (Today, this variant makes people more susceptible to Type 2 diabetes, but it must have been very useful to past people to be found in such a large percentage of the population.)
And there was one anomalously high Neanderthal DNA measure in Natives living near the Great Slave Lake, Canada. (Look, I didn’t name the lake.)
But this doesn’t mean all Native Americans possess all Neanderthal alleles in greater quantities.
So how much Neanderthal do Native Americans have? Of course, we can’t quite be sure, especially since only a few Neanderthals have even had their DNA analyzed, and with each new Neanderthal sequenced, we have more DNA available to compare against human genomes. But here are some estimates:
Sriram Sankararaman et al, in The Combined Landscape of Denisovan and Neanderthal Ancestry in Present-Day Humans, report:
Native Americans: 1.37%
Central Asia: 1.4%
East Asia: 1.39%
Oceania (Melanesians): 1.54%
South Asia: 1.19%
I have seen it claimed that the high Neanderthal percentages for Oceanian populations (that is, Melanesians and their relatives) could be a result of Denisovan DNA being incompletely distinguished from Neanderthal DNA.
Prufer et al, [pdf] 2017, report somewhat higher values:
East Asians: 2.3–2.6%
Lohse and Frantz estimate an even higher rate of 3.4–7.3% for Europeans and East Asians. (They found 5.9% in their Chinese sample and 5.3% in their European sample.)
The Mixe (of Mexico) and the Karitiana (of Brazil) have 0.2% Denisovan DNA (source); other estimates for the amount of Denisovan DNA in Native American populations are much lower, e.g., 0.05%.
I found an older paper by Prufer et al with estimates for three Hispanic populations, but it doesn’t clarify whether these samples have Native American ancestry:
CLM–Colombians from Medellin: 1.14%
MXL–Mexicans in LA: 1.22%
PUR–Puerto Rico: 1.05%
Since this is an older paper, all of its estimates may be on the low side.
The absolute values of these numbers are probably less important than the overall ratios, since the numbers themselves are still changing as more Neanderthal DNA is uncovered. The ratios in different papers point to Native Americans having, overall, about the same amount of Neanderthal DNA as their relatives in East Asia.
Melanesians, though. There’s an interesting story lying in their DNA.
I am frequently frustrated by our culture’s lack of good ethnonyms. Take “Hispanic.” It just means “someone who speaks Spanish or whose ancestors spoke Spanish.” It includes everyone from Lebanese-Mexican billionaire Carlos Slim to Japanese-Peruvian Alberto Fujimori, from Sephardi Jews to native Bolivians, from white Argentinians to black Cubans, but doesn’t include Brazilians because speaking Portuguese instead of Spanish is a really critical ethnic difference.*
*In conversation, most people use “Hispanic” to mean “Mexican or Central American who’s at least partially Native American,” but the legal definition is what colleges and government agencies are using when determining who gets affirmative action. People think “Oh, those programs are to help poor, brown people,” when in reality the beneficiaries are mostly well-off and light-skinned–people who were well-off back in their home countries.
This is the danger of using euphemisms instead of saying what you actually mean.
Our ethnonyms for other groups are equally terrible. All non-whites are often lumped together under a single “POC” label, as though Nigerian Igbo and Han Chinese were totally equivalent and fungible peoples. Whites are similarly lumped, as if a poor white from the backwoods of Georgia and a wealthy Boston Puritan had anything in common. There are technical names for these groups, used in historical or academic contexts, but if you tell the average person you hail from a mix of “Cavalier-Yeoman and Cracker ancestors,” they’re just going to be confused.
With the exception of Cajuns and recent immigrants who retain an old-world ethnic identity (e.g., Irish, Jewish), we simply lack common vernacular ethnonyms for the different white groups that settled the US–even though they are actually different.
American ethnic groups are not just Old World ethnic groups that happen to live in America. They’re real ethnicities that have developed over here during the past 500 years, but we have failed to adopt common names for them.
Woodard’s map implies a level of ethnic separation that is probably not entirely accurate, as these groups settled the American frontier in waves, creating layers of ethnicity that are thicker or thinner in different places. Today, we call these social classes, which is not entirely inaccurate.
Take the South. The area is dominated by two main ethnic blocs: Appalachians (in the mountains) and Cavalier-Plantation owners in the flatter areas. But the Cavalier area was never majority wealthy, elite plantation owners; it has always had a large contingent of middling-class whites, poor whites, and of course poor blacks. In areas of the “Deep South” where soils were poor or otherwise unsuited to cultivation, elite planters never penetrated, leaving the heartier backwoods whites–the Crackers–to their own devices.
If their ancestors spoke French, we recognize them as different, but if not, they’re just “poor”–or worse, “trash.”
Southern identity is a curious thing. Though I was born in the South (and my ancestors have lived there for over 400 years), I have no meaningful “Southern identity” to speak of–nor do, I think, most Southerners. It’s just a place; the core historical event of going to war to protect the interests of rich elites in perpetuating slavery doesn’t seem to resonate with most people I’ve met.
My interest in the region and its peoples stems not from Southern Pride, but the conventional curiosity adoptees tend to feel about their birth families: Where did I come from? What were they like? Were they good people? and Can I find a place where I feel comfortable and fit in? (No.)
My immediate biological family hails from parts of the South that never had any plantations. (I had ancestors in Georgia in the 1800s, and ancestors in Virginia in the 1700s, but they’ve been dead for a while; my father lives within walking distance of his great-grandparents’ homestead.)
As previously discussed, I don’t exactly feel at home in cities; perhaps this is because calling my ancestors “farmers” is a rather generous description for folks who thought it was a good idea to move to Oklahoma during the Dust Bowl.
(By the way, the only reason the prairies are consistently farmed today is due to irrigation, drawing water up from the Ogallala and other aquifers, and we are drawing water from those aquifers much faster than it is being replenished. If we keep using water at this rate–or faster, due to population growth–WE WILL RUN OUT. The prairies will go dry and dust storms will rage again.)
To be fair, some of my kin were successful farmers when it actually rained, but some were never so sedentary. Pastoralists, ranchers, hoe-farmers–they were the sorts of people who settled frontiers and moved on when places got too crowded, who drank hard and didn’t always raise their children. They match pretty closely Richard Sapp’s description of the Florida Crackers.
From a genetic standpoint, the Crackers are either descended from borderlanders and Scotch-Irish (the pink region on the map at the top of the post), or from folks who got along well with borderlanders and decided to move alongside them. I find it amazing that a relatively small place like Britain could produce such temperamentally different peoples as Puritans and Crackers–the former hard-working, domesticated, stiff, and proper; the latter loud, liberty-loving, and more violent.
Peter Frost (evo and proud) has a theory that “core” Europe managed to decrease its homicide rates by executing criminals, thus removing them from the gene pool; the borderlands of Scotland and Ireland were perhaps beyond the reach of the hangman’s noose, or hopping the border allowed criminals to escape the police.
“The third American Revolution reached its climax in the years from 1779 to 1781. This was a rising of British borderers in the southern backcountry against American loyalists and British regulars who invaded the region. The result was a savage struggle which resembled many earlier conflicts in North Britain, with much family feuding and terrible atrocities committed on both sides. Prisoners were slaughtered, homes were burned, women were raped and even small children were put to the sword.” …
i’ve got a couple of posts related to those rambunctious folks from the backcountry whose ancestors came from the borderlands between england and scotland. libertarian crackers takes a quick look at why this group tends to love being independent and is distrustful of big gubmint — to make a long story short, the border folks married closely for much longer than the southern english — and they didn’t experience much manorialism, either (the lowland scots did, but not so much the border groups). did i mention that they’re a bit hot-headed? (not that there’s anything wrong with that! (~_^) ) see also: hatfields and mccoys. not surprising that this group’s war of independence involved “much family feuding.”
Less manorialism, less government control, less executing criminals, more cousin-marriage, more clannishness.
During the experiment, a confederate bumped some subjects and muttered ‘asshole’ at them. Cortisol (a stress hormone) and testosterone (rises in preparation for violence) were measured before and after the insult. Insulted Southerners showed big jumps in both cortisol and testosterone compared to uninsulted Southerners and insulted Northerners. The difference in psychological and physiological responses to insults was manifest in behavior. Nisbett and Cohen recruited a 6’3” 250 lb (190 cm, 115 kg) American style football player whose task was to walk down the middle of a narrow hall as subjects came the other direction. The experimenters measured how close subjects came to the football player before stepping aside. Northerners stepped aside at around 6 feet regardless of whether they had been insulted. Un-insulted Southerners stepped aside at an average distance of 9 feet, whereas insulted Southerners approached to an average of about 3 feet. Polite but prepared to be violent, un-insulted Southerners take more care, presumably because they attribute a sense of honor to the football player and are normally respectful of others’ honor. When their honor is challenged, they are prepared and willing to challenge someone at considerable risk to their own safety.”
(The bit about honor is… not right. I witnessed a lot of football games as a child, and no one ever referred to the players as “honorable.” Southerners just don’t like to get close to each other, which is very sensible if people in your area get aggressive and angry easily. The South also has a lower population density than the North, so people are used to more space.)
As my grandmother says, “You don’t get to pick your ancestors.” I don’t know what I would think of my relatives had I actually grown up with them. They have their sins, like everyone else. But from a distance, as an adult, they’re fine people and they always have entertaining stories.
“Oh, yes, yet another time I almost died…”
As for racial attitudes, if you’re curious, they vary between “probably marched for Civil Rights back in the 50s” and “has never spoken a word, good or bad, generalizing about any ethnic group.” (I have met vocally anti-black people in the South; just not in my family.) I think my relatives are more interested in various strains of Charismatic Christianity than race.
It seems rather unfortunate that Southern identity is so heavily linked to the historical interests of the Plantation Elites. After all, it did the poor whites no good to die in a war fought to protect the interests of the rich. I think the desire to take pride in your ancestors and group is normal, healthy, and instinctive, but Southerners are in an unfortunate place where that identity is heavily infused with a racial ideology most Southerners don’t even agree with.
> Be white
> Be from the south
> Not into Confederacy
> Want an identity of some sort
> Now what?
In my case, I identify with nerds. This past is not an active source of ethnic identity, nor is the Cracker lifestyle even practical in the modern day. But my ancestors have still contributed (mostly genetically) to who I am.
Well, this was going to just be an introduction to today’s anthropology selection, but it turned out rather longer than expected, so let’s just save the real anthropology for next week.
I almost feel sad for Senator Warren. One day, a little girl looked in the mirror, saw pale skin, brown hair, and blue eyes looking back at her, and thought, “No. This can’t be right. This isn’t me.”
So she found a new identity, based on a family legend–a legend shared by a suspicious number of white people–that one of her ancestors was an American Indian.
This new identity conveyed certain advantages: Harvard Law claimed her as a Native American to boost claims of racial diversity among the faculty:
A majority [83%] of Harvard Law School students are unhappy with the level of representation of women and minorities on the Law School faculty, according to a recent survey. …
Law students said they want to learn from a variety of perspectives and approaches to the law. “A black male from a lower socioeconomic background will approach the study of constitutional law in a different way from a white upper-class male,” Reyes said. …
Of 71 current Law School professors and assistant professors, 11 are women, five are black, one is Native American and one is Hispanic, said Mike Chmura, spokesperson for the Law School.
Although the conventional wisdom among students and faculty is that the Law School faculty includes no minority women, Chmura said Professor of Law Elizabeth Warren is Native American.
In response to criticism of the current administration, Chmura pointed to “good progress in recent years.”
The University of Pennsylvania chose not to tout in the press their newly minted Native American professor. But her minority status was duly noted: The university’s Minority Equity Report, published in April 2005, shows that Warren won a teaching award in 1994. Her name is in bold and italicized to indicate she was a minority. …
The law school was happy to have her count as a diversity statistic, however, and for at least three of the years she taught there — 1991, 1992, and 1994 — an internal publication drawing on statistics from the university’s federal affirmative action report listed one Native American female professor in the university’s law school.
Warren’s Native American identity may have played no role in her hiring (the committees involved appear not to have known or cared about her identity), but it seems to have been important to Warren herself. As her relatives aged and died, and she moved away from her childhood home in Oklahoma and then Texas, she was faced with that persistent question: Who am I?
The truth–a white woman from a working-class family in Oklahoma–apparently wasn’t enough for Elizabeth. (Oklahoma doesn’t carry many status points in East Coast academic institutions.)
Each of us is the sum of many things, including the stories our families tell us and genetic contributions from all of our ancestors–not just the interesting ones (within a limit–after enough generations, each individual contribution has become so small that it may not be passed on in reproduction.)
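The “within a limit” caveat is just repeated halving: each generation back, a single ancestor’s expected contribution to your genome is cut in half, and because DNA is inherited in large chunks rather than as a fine dust, a sufficiently distant ancestor’s realized contribution can be exactly zero. A small illustration of the expected share:

```python
# Expected fraction of your autosomal genome inherited from a single
# ancestor n generations back (parents: n=1, grandparents: n=2, ...).
# Because chromosomes recombine in large segments, the actual contribution
# from a distant ancestor may round all the way down to nothing.
def expected_share(n):
    return 0.5 ** n

for n in (1, 5, 10):
    print(n, expected_share(n))
```

Ten generations back, the expected share is already under 0.1%, which is why a single distant ancestor tells you very little about who you are today.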
I have also done the 23andMe thing, and found that I hail from something like 20 different ethnic groups–including, like Warren, a little smidge of Native American. But none of those groups make up the majority of my DNA. All of them are me; none of them are me. I just am.
Warren’s announcement of her DNA findings vindicated her claim to a Native American ancestor and simultaneously unveiled the absurdity of her claim to be a Native American. What should have been a set of family tales told to friends and passed on to children and grandchildren about a distant ancestor became a matter of national debate that the Cherokee Nation itself felt compelled to weigh in on:
Using a DNA test to lay claim to any connection to the Cherokee Nation or any tribal nation, even vaguely, is inappropriate and wrong. It makes a mockery out of DNA tests and its legitimate uses while also dishonoring legitimate tribal governments and their citizens, whose ancestors are well documented and whose heritage is proven. Senator Warren is undermining tribal interests with her continued claims of tribal heritage.
Like them or not, the Cherokee have rules about who is and isn’t a Cherokee, because being Cherokee conveys certain benefits–for example, the tribe builds houses for members and helps them look for jobs. This is why conflicts arise over matters like whether the Cherokee Freedmen are official members. When membership in a group conveys benefits, the borders of that group will be policed–and claims like Warren’s, no matter how innocently intended, will be perceived as an attempt at stealing something not meant for her.
Note: I am not saying this kind of group border policing is legitimate. Many “official” Cherokee have about as much actual Cherokee blood in them as Elizabeth Warren, but they have a documented ancestor on the Dawes Rolls, so they qualify and she doesn’t. Border policing is just what happens when there are benefits associated with being part of a group.
I don’t have an issue with Warren’s own self-identity. After all, if race is a social construct,* then she’s doing it exactly right. She’s allowed to have an emotional connection to her own ancestors, whether that connection is documented via the Dawes Rolls or not. All of us here in America should have equal access to Harvard’s benefits, not just the ones who play up a story about their ancestors.
The sad thing, though, is that despite being one of the most powerful and respected women in America, she still felt the need to be more than she is, to latch onto an identity she doesn’t truly possess.
You know, Elizabeth… it’s fine to just be a white person from Oklahoma. It’s fine to be you.
*Note: This blog regards “species” and nouns generally as social constructs, because language is inherently social. That does not erase biology.
Ice packs (cold packs) applied to the lower back at the first sign of a seizure may be able to halt or significantly decrease the severity of a seizure in humans.
I consider this one of the most important posts I’ve written, because it is the only one that offers useful, real-life advice: if someone is having a seizure, grab an ice pack or two and press them against the person’s back/neck. There is very little you can do for someone who is already having a seizure besides making sure they don’t accidentally hurt themselves, but using ice packs may help decrease the duration and severity of the seizure.
We have been using an ice pack on our 13-year-old son’s neck to stop seizures for nearly a year now, and it works without fail to bring the seizures to an end within seconds of applying the ice. This is an old technique used before medications were invented; you can read about it in the Meridian Institute’s papers on Edgar Cayce and abdominal epilepsy.
Here is a relevant quote from Cayce’s paper on abdominal epilepsy:
… Also note that the reflex from the abdomen was mediated through the medulla oblongata, an important nerve center at the upper portion of the spinal cord where it enters the skull. This is significant because Cayce sometimes recommended that a piece of ice be placed at this area during the aura or at the beginning of the seizure. This simple technique has proven effective in several contemporary cases where Cayce’s therapeutic model has been utilized. Incidentally, this technique for preventing seizures was also used by osteopathic physicians during the early decades of this century and is included in the therapeutic model developed by the Meridian Institute. …
If the subject is currently experiencing seizures and can sense the beginning of the episode, they are encouraged to use a piece of ice at the base of the brain for one to two minutes.
We have been using ice packs to help manage our girl’s seizures for over a year now. From what I have heard firsthand from others, it either doesn’t work at all or it works fabulously. With our girl it “works fabulously”. It is not the miracle cure and it does not prevent future seizures, but it definitely stops her grand mal right in its tracks. It is the most amazing thing I have ever seen. … If we get the ice pack on her within the first 15 seconds or so, the grand mal just suddenly stops. Like a light switch. All motor movement comes to a halt. She continues to be incoherent for a bit but all movements stop.
Oddly, though, I haven’t found much discussion of the use of ice packs on humans. But if it works on dogs, why wouldn’t it work on people? On the grand evolutionary scale, our nervous systems are pretty similar–we’re both mammals with neocortexes, after all.
My epileptic friend has also reported continued good success with the technique; her husband says he can feel an immediate change in the pattern of the seizure.
My original post outlines some of the scientific evidence in favor of the technique; I’ll just quote one bit:
Fifty-one epileptic canine patients were successfully treated during an epileptic seizure with a technique involving the application of ice on the back (T10 to L4). This technique was found to be effective in aborting or shortening the duration of the seizure.
I suspect the “ice trick” was once fairly well-known before there were medications for preventing seizures, but modern doctors are just taught about the medications. And ice packs, to be clear, can’t cure epilepsy. But they can help people who are in the midst of a seizure.
Any doctors out there, please do some research on this. I think a lot of people could benefit.
About two years ago I reviewed Lois Lenski’s Strawberry Girl, a middle grade novel about the conflict between newly arrived, dedicated farmers and long-established families of hoe-farmers/ranchers/hunters in the backwoods of Florida. It was a pleasant book based on solid research among the older residents, but left me with many questions (as surely any children’s book would)–most notably, was the author’s description of the newly arrived farmers as “Crackers” accurate, or should the term be more properly restricted to the older, wilder inhabitants?
I had not known, prior to Lenski’s book, that “Cracker” even was an ethnonym; today it is used primarily as a slur, the original Crackers and their lifestyle having all but disappeared. Who were the Crackers? Where did they come from? Do they hail from the same stock that settled Appalachia (the mountains, not to be confused with Apalachee, the county in Florida we’ll be discussing in this post), or different stock? Or is there perhaps a common stock that runs throughout America, appearing as more or less of the population in proportion to the favorability of the land for their lifestyles?
Today I happened upon Richard Wayne Sapp’s ethnography of Apalachee County, Florida: Suwannee River Town, Suwannee River Country: political moieties in a southern county community, published in 1976, which directly addresses a great many of my questions. So far it has been a fascinating book, and I am glad I found it.
I must note, though, that there currently is no “Apalachee County” in Florida. (There are an Apalachee Parkway and an Apalachee Park, though.) However, comparing the maps and geographic details in the book with a current map of Florida reveals that Apalachee County is now Suwannee County. Wikipedia should note the change.
So without further ado, here are a few interesting quotes:
Apalachee County, a north Florida county community, nestles in a bend of the Suwannee River. The urban county seat is the center of government and associational life. Scattered over the countryside are farming neighborhoods whose interactional centers are rural churches. County seat and rural neighborhoods are coupled by mutual exchanges of goods and services: neither are, of themselves, cultural wholes. The poor quality of its soils and the relative recency of settlement (post-Civil-War) give the community its distinctiveness; it never had a planting elite.
Apalachee society is structured along moiety lines: town and country.
EvX: “Moiety” means half; Wikipedia defines it in anthropology as:
a [kinship] descent group that coexists with only one other descent group within a society. In such cases, the community usually has unilineal descent, either patri- or matri-lineal, so that any individual belongs to one of the two moiety groups by birth, and all marriages take place between members of opposite moieties. It is an exogamous clan system with only two clans.
Here I think Sapp is using moiety more in the sense of “two interacting groups that form a society” without implying that all town people take country spouses and vice versa. But continuing:
These halves rest on an earlier “cracker” horizon of isolated single-family homesteads. True crackers subsisted by living off the land and practicing hoe agriculture; they were fiercely independent and socially isolated. Apalachee moieties are also related to regional traditions: townsmen as town nabobs in the Cavalier tradition and countrymen as yeoman farmers in the Calvinist tradition. Townsmen promote associational interaction, valuing familism (nuclear), hierarchy in organisations, “progress,” and paternalistic interaction with countrymen. Countrymen value familism (extended), localism, and personalism, interacting on individually egalitarian rather than ordered associational terms. …
The division of governmental offices falls along moiety lines. Townsmen control municipal government, countrymen control the powerful county bodies. Except for jobs, the governmental institution is not a major source of political prizes. The country moiety is the dominant political force.
EvX: There follows a fascinating description of the battle over a referendum on whether the county should stay “dry” (no legal sale of alcohol) or go “wet” (alcohol sales allowed). The Wets, led by business interests, had hoped that an influx of new residents who held more pro-alcohol views than established residents would tip the electoral balance in their favor. I find this an interesting admission of one of democracy’s weak points–the ability of newcomers to move into an area and vote to change the laws in ways the folks who already live there don’t like.
The Drys, led by local Baptist pastors, inflamed local sentiments against the Wets, who were supposedly trying to overturn the law just to make a hotel chain more interested in buying a tract of land owned by the leader of the Wets. The Wets argued the sale would attract more businesses to the area, boosting the economy; the Drys argued that the profits would go entirely to the Wets and the community itself would reap the degradation and degeneration caused by alcohol.
The Drys won, and the leader of the Wets hasn’t set foot in a church in Apalachee county since then.
(Suwannee/Apalachee county finally allowed the sale of alcohol in 2011.)
Does a county’s wet or dry status impact the willingness of businesses to move into the area, leading to depressed economies for Drys? I wanted to find out, so I pulled up maps of current dry counties and per capita GDP by county. It’s not a perfect comparison since it doesn’t control for cost of living, but it’ll do.
In general, I don’t think the theory holds. Suwannee, dry until 2011, is doing better than neighboring counties that went wet earlier (some of those neighboring counties are very swampy.) Central Mississippi is wet, but doing badly; a string of dry counties runs down the east side of the state, and unless my eyes deceive me, they’re doing better than the wet counties. Kentucky’s drys are economically depressed, but so are West Virginia’s wets. Pennsylvania and Texas’s “mixed” counties are doing fine, while Texas’s wets are doing badly. Virginia has some pretty poor wet counties; Alaska’s dry county is doing great.
However, this is only a comparison of currently dry and wet counties; if I had data that showed for what percent of the 20th century each county allowed the sale of alcohol, that might provide a different picture.
Still, I’m willing to go out on a limb, here: differences in local GDP have more to do with demographics than the sale of one particular beverage.
But back to Sapp:
A system of human community derivative of Europe and still basic to the southern United States is the county-community. … The symbolic heart of this traditional community, the county courthouse, has been the central point of political and economic assembly for county residents. Its people lived dispersed in neighborhoods clustered about small Protestant churches, points of assembly in socialization and socializing as well as bastions of moral and spiritual rectitude.
He quotes Havard, 1972, on the traits of the Calvinist-Yeoman Farmer–radical individualism, personalism, personal independence, populism, regionalist traditions, etc–vs the Cavalier-Planter/Town Nabob–social conformity, caste, paternalistic dependency, conservatism, nationalist patriotism.
He wrote that this split fathered two mainstream traditions in the South: yeoman farmer and plantation farmer. The yeoman farmers, he said, opposed governmental centralization and exhibited an aversion to urbanism, industrialization, and the entrepreneurial classes; they were libertarian, egalitarian, and populist. The plantation whigs, identified with downtown mercantile interests, supported themselves as planters … bankers, and merchants, sat as the “county seat clique,” developed the theme of racial segregation in the post-bellum era, and promoted a cult of “manners” and paternalism. …
However, the Cavalier plantation elite never really settled in Apalachee/Suwannee county, due to its soil being much too poor for serious agriculture.
As a result, not many slaves were ever brought into the county, nor have their descendants migrated to the area. Since the population is mostly white, racial issues appear only rarely in the book, and it is safe to say that the culture never developed in quite the same ways as it did in the plantation-dominated Deep South.
Rather, Apalachee was settled by the Calvinist-Yeoman farmers and the Crackers:
Although the origin of the term cracker is disputed, Stetson Kennedy claims that cracker first applied to an assortment of “bad characters” who gathered in northern Florida before it became a territory of the United States. Deep-South Southerners later applied the epithet to the “poor white folk of Florida, Georgia, and Alabama.” (Kennedy, 1942, p. 59). He further relates:
“Crackers are mainly descended from the Irish, Scotch, and English stock which, from 1740 on, was slowly populating the huge Southern wilderness behind the thin strip of coastal civilization. These folk settled the Cumberland Valley, the Shenandoah, and spread through every Southern state east of the Mississippi. That branch of the family which settled in the Deep South was predominantly of Irish ancestry…
“The early crackers were the Okies of their day (as they have been ever since). Cheated of land, not by wind and erosion, but by the plantation and slavery system of the Old South, they were nonessentials in an economic, political and social order dominated by the squirearchy of wealthy planters, and in most respects were worse off than the Negro slaves.”
This contradicts the history told in our prior ethnography of Appalachia, which claims pointedly that the denizens of the Cumberland are not descended from the “poor whites” of the Deep South, but from Pennsylvanians. I offer, however, a synthesis: the whites who settled the Pennsylvania frontier, followed Daniel Boone into the Cumberland, and found it pleasant enough to remain in the mountains, and the whites who adopted an only semi-agricultural lifestyle in the backwoods and swamps of Florida, both hailed from the same original British stock and simply took different routes to get where they were going.
Powell (1969), a white turpentine camp overseer of the late nineteenth century, called the crackers of Apalachee County “wild woodsmen” (p. 30) and mentioned a man who “had lived the usual life of a shiftless Cracker, hunting and fishing, and hard work did not agree with him.” …
“When I speak of villages throughout this county, I use the word for lack of a better term, for in nine cases out of ten, they were the smallest imaginable focus of the scattering settlement, and usually one general store embraced the sum total of business enterprise. There the natives came at intervals to trade for coffee, tobacco, and the few other necessities that the woods and waters did not provide them with. Alligators’ hides and teeth, bird plumes and various kinds of pelts were the medium of barter. They were a curious people, and there are plenty of them there yet, born and bred to the forest and as ignorant of the affairs of every-day life outside of their domain, as are the bears and deer upon which they mainly subsist. A man who would venture to tell them that the earth moved instead of the sun, or that there was a device by which a message could be flashed for leagues across a wire, would run the risk of being lynched, as too dangerous a liar to be at large.”
There is a section on the importance of guns and hunting to the locals, even the children, which will be familiar to anyone with any experience of the rural South. I know from family tales that my grandfather began to hunt when he was 8 years old; he used to sell the pelts of skunks he’d killed to furriers, who de-stinked them, dyed them black, and marketed them as “American sable” over in Europe.
Truth in Advertising laws decimated the “American sable” trade.
The true crackers, Powell’s “wild woodsmen,” were never numerous, and they rarely participated in the social life of the wider Apalachee county-community. Crackers were born, lived, and died in the woods. They buried their own in family plots far from the nearest church. … Cracker families settled the Apalachee area without recourse to legal formalities. Thus, when the yeomen farmers … eventually purchased legal titles to land, true crackers were forced out and deeper into Florida.
This is a common problem (especially for anyone whose ancestors arrived in an area before it was officially part of the US.) Where land is abundant, population density is low, and there aren’t any authorities who can enforce land ownership, anyway, people will be happy to farm where they want, hunt where they want, and defend their claims themselves. This tends to lead to a low-intensity lifestyle:
Cracker subsistence strategy depended on scratch, perhaps slash-and-burn, summer agriculture and year-round food collecting activities: hunting, fishing, and foraging. Because their farming operations were so small, limited to the part-time efforts of an individual family, they had no need of financial credit.
Indeed, their fiercely independent, egalitarian ethos prohibited them from interacting significantly in the rural neighborhoods of the community. …
Few true crackers remain in Apalachee County … A few families still live on the borders of the county. There they exploit the food resources of the rivers and swamps and perhaps scratch-farm a few acres. …
This is not (just) laziness; areas with poor soils or little water simply can’t be intensively farmed, and if the forage is bad, herd animals will be better off if they can wander widely in search of food than if they are confined in one particular place.
Incidentally, there is a landrace of cattle known as the Florida Cracker, descended from the hardy Spanish cattle brought to Florida in the 1600s. Unfortunately, the breed has been on the decline and is now listed as “critical” due to laws passed in 1949 against free-ranging livestock and the introduction of larger breeds more suited to confinement.
Not only does the law fence off the cracker’s land, destroy his livelihood, and drive him out, it also kills the cracker cow by fencing off its land.
The author notes that “cracker” is a slur and that it has been expanded in the past half-century to cover all poor whites, with an interesting footnote:
One speculates that the driving force behind withholding respectability from the true crackers and the extension of the consequently disparaging term to include countrymen of the small farmer class originated with the townspeople. This idea parallels the hypothesis that townsmen perpetuated and revitalized the issue of racial politics in the twentieth century.
The technological changes of the twentieth century have enabled social institutions to penetrate the isolation of the crackers and enforce town mores. Cracker homicides are no longer unreported and uninvestigated or allowed to result in clannish feuding… No longer may the children escape the public school regimen. No longer may they escape taxation…
[yet] the cracker and his world view persist. While only a handful of true crackers endure in the county… modern-day imitators erect trailers in remote corners, moving to north-central Florida … to escape the “rat race.”
I think that’s enough for today; I hope you’ve enjoyed the book and urge you to take a look at the whole thing. We’ll discuss the more recent Calvinist-Yeoman farmers next week.
Do people eventually grow ideologically resistant to dangerous local memes, but remain susceptible to foreign memes, allowing them to spread like invasive species?
And if so, can we find some way to memetically vaccinate ourselves against deadly ideas?
Memetics is the study of how ideas (“memes”) spread and evolve, using evolutionary theory and epidemiology as models. A “viral meme” is one that spreads swiftly through society, “infecting” minds as it goes.
Of course, most memes are fairly innocent (e.g. fashion trends) or even beneficial (“wash your hands before eating to prevent disease transmission”), but some ideas, like communism, kill people.
Ideologies consist of a big set of related ideas rather than a single one, so let’s call them memeplexes.
Almost all ideological memeplexes (and religions) sound great on paper–they have to, because that’s how they spread–but they are much more variable in actual practice.
Any idea that causes its believers to suffer is unlikely to persist–at the very least, because its believers die off.
Over time, in places where people have been exposed to ideological memeplexes, their worst aspects become known and people may learn to avoid them; the memeplexes themselves can evolve to be less harmful.
Over in epidemiology, diseases humans have been exposed to for a long time become less virulent as humans become adapted to them. Chickenpox, for example, is a fairly mild disease that kills few people because the virus has been infecting people for as long as people have been around (the ancestral Varicella-Zoster virus evolved approximately 65 million years ago and has been infecting animals ever since). Rather than kill you, chickenpox prefers to enter your nerves and go dormant for decades, reemerging later as shingles, ready to infect new people.
By contrast, smallpox (Variola major and Variola minor) probably evolved from a rodent-infecting virus about 16,000 to 68,000 years ago. That’s a big range, but either way, it’s much more recent than chickenpox. Smallpox made its first major impact on the historical record around the third century BC in Egypt, and thereafter became a recurring plague in Africa and Eurasia. Note that unlike chickenpox, which is old enough to have spread throughout the world with humanity, smallpox emerged long after major population splits occurred–like part of the Asian clade splitting off and heading into the Americas.
By 1400, Europeans had developed some immunity to smallpox (due to those who didn’t have any immunity dying), but when Columbus landed in the New World, folks here had never seen the disease before–and thus had no immunity. Diseases like smallpox and measles ripped through native communities, killing approximately 90% of the New World population.
If we extend this metaphor back to ideas–if people have been exposed to an ideology for a long time, they are more likely to have developed immunity to it or the ideology to have adapted to be relatively less harmful than it initially was. For example, the Protestant Reformation and subsequent Catholic counter-reformation triggered a series of European wars that killed 10 million people, but today Catholics and Protestants manage to live in the same countries without killing each other. New religions are much more likely to lead all of their followers in a mass suicide than old, established religions; countries that have just undergone a political revolution are much more likely to kill off large numbers of their citizens than ones that haven’t.
This is not to say that old ideas are perfect and never harmful–chickenpox still kills people and is not a fun disease–but that any bad aspects are likely to become more mild over time as people wise up to bad ideas, (certain caveats applying).
But this process only works for ideas that have been around for a long time. What about new ideas?
You can’t stop new ideas. Technology is always changing. The world is changing, and it requires new ideas to operate. When these new ideas arrive, even terrible ones can spread like wildfire because people have no memetic antibodies to resist them. New memes, in short, are like invasive memetic species.
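The epidemiological analogy above can be made concrete with a toy simulation. The sketch below is mine, not from any source cited here: a minimal SIR (“susceptible-infected-recovered”) model, with every name and parameter value invented purely for illustration. The point it demonstrates is the one in the text–the same “infectious” idea burns through a population with no prior immunity, but stalls in one where most people are already resistant.

```python
import random

def run_sir(pop_size=1000, initially_immune=0.0, p_transmit=0.3,
            contacts_per_step=5, recovery_steps=4, steps=50, seed=1):
    """Toy SIR simulation.

    State per person: 0 = susceptible, >0 = steps of infection remaining,
    -1 = recovered and immune. Returns the fraction ever infected.
    """
    random.seed(seed)
    people = [-1 if random.random() < initially_immune else 0
              for _ in range(pop_size)]
    for i in range(10):              # seed a few carriers of the new "meme"
        people[i] = recovery_steps
    ever_infected = 10
    for _ in range(steps):
        currently_infected = [i for i, s in enumerate(people) if s > 0]
        for i in currently_infected:
            for _ in range(contacts_per_step):   # random mixing
                j = random.randrange(pop_size)
                if people[j] == 0 and random.random() < p_transmit:
                    people[j] = recovery_steps   # a susceptible catches it
                    ever_infected += 1
            people[i] -= 1
            if people[i] == 0:
                people[i] = -1       # recovered: immune from now on
    return ever_infected / pop_size
```

Running `run_sir(initially_immune=0.0)` versus `run_sir(initially_immune=0.7)` shows the invasive-species effect: in the naive population nearly everyone is eventually “infected,” while a largely immune population caps the outbreak at the small susceptible remainder.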
In the late 1960s, 15 million people still caught smallpox every year. In 1980, it was declared officially eradicated–not one case had been seen since 1977, due to a massive, world-wide vaccination campaign.
Humans can acquire immunity to disease in two main ways. The slow way is everyone who isn’t immune dying; everyone left alive happens to have adaptations that let them not die, which they can pass on to their children. As with chickenpox, over generations, the disease becomes less severe because humans become successively more adapted to it.
The fast way is to catch a disease, produce antibodies that recognize and can fight it off, and thereafter enjoy immunity. This, of course, assumes that you survive the disease.
Vaccination works by teaching the body’s immune system to recognize a disease without exposing it to the full-strength germ, using a weakened or harmless version of the germ instead. Early on, weakened germs taken from actual smallpox scabs or lesions were used to inoculate people, a risky method since the germs often weren’t that weak. Later, people discovered that cowpox was similar enough to smallpox that its antibodies could also fight smallpox, but cowpox itself was too adapted to cattle hosts to seriously harm humans. (Today I believe the vaccine uses a different weakened virus, but the principle is the same.)
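A side note on the arithmetic of eradication: textbook epidemiology says a disease stops spreading once the immune fraction of the population exceeds 1 − 1/R0, where R0 is the average number of people each case infects in a fully susceptible population. A small sketch of that calculation (the function name is mine, and the R0 figures in the comments are commonly cited rough estimates, not from this post):

```python
def herd_immunity_threshold(r0: float) -> float:
    """Textbook herd-immunity threshold, 1 - 1/R0: above this immune
    fraction, each case infects fewer than one new person on average."""
    if r0 <= 1.0:
        return 0.0  # the disease dies out on its own
    return 1.0 - 1.0 / r0

# Smallpox's R0 is often estimated around 5-7; measles around 12-18.
print(herd_immunity_threshold(6))   # ~0.83: roughly 5 in 6 must be immune
print(herd_immunity_threshold(15))  # ~0.93: measles needs near-universal coverage
```

This is why the eradication campaign did not need to vaccinate literally everyone–only enough people to push each outbreak’s effective reproduction number below one, after which remaining chains of transmission sputtered out.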
The good part about memes is that you do not actually have to inject a physical substance into your body in order to learn about them.
Ideologies are very difficult to evaluate in the abstract, because, as mentioned, they are all optimized to sound good on paper. It’s their actual effects we are interested in.
So if we want to learn whether an idea is good or not, it’s probably best not to learn about it by merely reading books written by its advocates. Talk to people in places where the ideas have already been tried and learn from their experiences. If those people tell you this ideology causes mass suffering and they hate it, drop it like a hot potato. If those people are practicing an “impure” version of the ideology, it’s probably an improvement over the original.
For example, “communism” as practiced in China today is quite different from “communism” as practiced there 50 years ago–so much so that the modern system really isn’t communism at all. There was never, to my knowledge, an official changeover from one system to another, just a gradual accretion of improvements. This speaks strongly against communism as an ideology, since no country has managed to be successful by moving toward ideological communist purity, only by moving away from it–though they may still find it useful to retain some of communism’s original ideas.
I think there is a similar dynamic occurring in many Islamic countries. Islam is a relatively old religion that has had time to adapt to local conditions in many different parts of the world. For example, in Morocco, where the climate is more favorable to raising pigs than in other parts of the Islamic world, the taboo against pigs isn’t as strongly observed. The burka is not an Islamic universal, but characteristic of central Asia (the similar niqab is from Yemen). Islamic head coverings vary by culture–such as this kurhars, traditionally worn by unmarried women in Ingushetia, north of the Caucasus, or this cap, popular in Xinjiang. Turkey has laws officially restricting burkas in some areas, and Syria discourages even hijabs. Women in Iran did not go heavily veiled prior to the Iranian Revolution. So the insistence on extensive veiling in many Islamic communities (like the territory conquered by ISIS) is not a continuation of old traditions, but the imposition of a new, idealized, version of Islam.
Purity is counter to practicality.
Of course, this approach is hampered by the fact that what works in one place, time, and community may not work in a different one. Tilling your fields one way works in Europe, and tilling them a different way works in Papua New Guinea. But extrapolating from what works is at least a good start.
This is my first creation, Nandy the Neanderthal, based on the Chapelle-aux-Saints 1 skull and this side view. Note that he is based on two different skulls, but still very much a Neanderthal.
Since this is my very first creation and I don’t have a 3D printer yet, (I expect to receive one soon and am planning ahead,) I am still learning all of the ins and outs of this technology and so would appreciate any technical feedback.
Neanderthals evolved around 600,000-800,000 years ago and spread into the Middle East, Europe, and central Asia. They made stone tools, controlled fire, and hunted. They survived in a cold and difficult climate, but likely could make no more than the simplest of clothes. As a result, they may have been, unlike modern humans, hairy.
Cochran and Harpending of West Hunter write in The 10,000 Year Explosion:
Chimpanzees have ridges on their finger bones that stem from the way that they clutch their mothers’ fur as infants. Modern humans don’t have these ridges, but Neanderthals do.
Hoffecker, in The Spread of Modern Humans in Europe writes:
Neanderthal sites show no evidence of tools for making tailored clothing. There are only hide scrapers, which might have been used to make blankets or ponchos. This is in contrast to Upper Paleolithic (modern human) sites, which have an abundance of eyed bone needles and bone awls.
Their skulls were, on average, larger than ours, with wide noses, round eyes, and an elongated braincase. Their facial features were robust–that is, strong, thick, and heavy.
The Chapelle-aux-Saints 1 Neanderthal lived to about 40 years old. He had lost most of his teeth years before his death (I gave Nandy some teeth, though), suffered arthritis, and must have been cared for in his old age by the rest of his tribe. At his death he was most likely buried in a pit dug by his family, which preserved his skeleton in nearly complete condition for 60,000 years.
Anatomically modern humans, Homo sapiens, encountered and interbred with Neanderthals around 40,000 years ago. (Neanderthals are also humans–Homo neanderthalensis.) Today, about 1-5% of the DNA in non-Sub-Saharan Africans hails originally from a Neanderthal ancestor. (Melanesians also have DNA from a cousin of the Neanderthals, the Denisovans, and Sub-Saharan Africans may have their own archaic ancestors.)
Unfortunately for Nandy and his relations, the Neanderthals also began to disappear around 40,000 years ago. Perhaps it was the weather, or Homo sapiens out competed them, or their enormous skulls just caused too much trouble in childbirth. Whatever happened, the Neanderthals remain a mystery, evidence of the past when we weren’t the only human species in town.
My endless inquiries made it impossible for me to achieve anything. Moreover, I get to think about my own thoughts of the situation in which I find myself. I even think that I think of it, and divide myself into an infinite retrogressive sequence of ‘I’s who consider each other. I do not know at which ‘I’ to stop as the actual, and as soon as I stop, there is indeed again an ‘I’ which stops at it. I become confused and feel giddy as if I were looking down into a bottomless abyss, and my ponderings result finally in a terrible headache. –Møller, Adventures of a Danish Student
Møller’s Adventures of a Danish Student was one of Niels Bohr’s favorite books; it reflected his own difficulties with cycles of ratiocination, in which the mind protects itself against conclusions by watching itself think.
I have noticed a tendency on the left, especially among the academic-minded, to split the individual into sets of mental twins–one who is and one who feels that it is; one who does and one who observes the doing.
Take the categories of “biological sex” and “gender.” Sex is defined as the biological condition of “producing small gametes” (male) or “producing large gametes” (female) for the purpose of sexual reproduction. Thus we can talk about male and female strawberry plants, male and female molluscs, male and female chickens, male and female Homo sapiens.
(Indeed, the male-female binary is remarkably common across sexually reproducing plants and animals–it appears that the mathematics of a third sex simply don’t work out, unless you’re a mushroom. How exactly sex is determined varies by species, which makes the stability of the sex-binary all the more remarkable.)
And for the first 299,945 years or so of our existence, most people were pretty happy dividing humanity into “men,” “women,” and the occasional “we’re not sure.” People didn’t understand why or how biology works, but it was a functional enough division.
In 1955, John Money decided we needed a new term, “gender,” to describe, as Wikipedia puts it, “the range of characteristics pertaining to, and differentiating between, masculinity and femininity.” Masculinity is further defined as “a set of attributes, behaviors, and roles associated with boys and men;” we can define “femininity” similarly.
So if we put these together, we get a circular definition: gender is a range of characteristics of the attributes of males and females. Note that attributes are already characteristics. They cannot further have characteristics that are not already inherent in themselves.
But really, people invoke “gender” to speak of a sense of self, a self that reflexively looks at itself and perceives itself as possessing traits of maleness or femaleness; the thinker who must think of himself as “male” before he can act as a male. After all, you cannot walk without desiring first to move in a direction; how can you think without first knowing what it is you want to think? It is a cognitive splitting of the behavior of the whole person into two separate, distinct entities–an acting body, possessed of biological sex, and a perceiving mind, that merely perceives and “displays” gender.
But the self that looks at itself looking at itself is not real–it cannot be, for there is only one self. You can look at yourself in the mirror, but you cannot stand outside of yourself and be simultaneously yourself; there is only one you. The alternative, a fractured consciousness, is a symptom of mental disorder and treated with chlorpromazine.
Robert Oppenheimer was once diagnosed with schizophrenia–dementia praecox, as they called it then. Whether he had it or simply confused the therapist by talking about wave/particle dualities is another matter.
Then there are the myriad variants of the claim that men and women “perform femininity” or “display masculinity” or “do gender.” They do not claim that people are feminine or act masculine–such conventional phrasing assumes the existence of a unitary self that is, perceives, and acts. Rather, they posit an inner self that possesses no inherent male or female traits, for whom masculinity and femininity are only created via the interaction of their body and external expectations. In this view, women do not buy clothes because they have some inherent desire to go shopping and buy pretty things, but because society has compelled them to do so in order to comply with external notions of “what it means to be female.” The self who produces large gametes is not the self who shops.
The biological view of human behavior states that most humans engage in a variety of behaviors because similar behaviors contributed to the evolutionary success of our ancestors. We eat because ancestors who didn’t think eating was important died. We jump back when we see something that looks like a spider because ancestors who didn’t jump back got bitten and died. We love cute things with big eyes because they look like babies, and we are descended mostly from people who loved their babies.
Sometimes we do things that we don’t enjoy but rationalize will benefit us, like work for an overbearing boss or wear a burka, but most “masculine” and “feminine” behaviors fall into the category of things people do voluntarily, like “compete at sports” or “gossip with friends.” The fact that more men than women play baseball and more women than men enjoy gossiping with friends has nothing to do with an internal self attempting to perform gender roles and everything to do with the challenges ancestral humans faced in reproducing.
But whence this tendency toward ratiocination? I can criticize it as a physical mistake, but does it reflect an underlying psychological reality? Do some people really perceive themselves as a self separate from themselves, a meta-self watching the first self acting in particular manners?
Here is a study that found that folks with more cognitive flexibility tended to be more socially liberal, though economic conservatism/liberalism didn’t particularly correlate with cognitive flexibility.
I find that if I work hard, I may achieve a state of zen, an inner tranquility in which the endless narrative of thoughts coalesce for a moment and I can just be. Zen is flying down a straight road at 80 miles an hour on a motorcycle; zen is working on a math problem that consumes all of your attention; zen is dancing until you only feel the music. The opposite of zen is lying in bed at 3 AM, staring at the ceiling, thinking of all of your failures, unable to switch off your brain and fall asleep.
Dysphoria is a state of unease. Some people have gender dysphoria; a few report temporal dysphoria. It might be better defined as disconnection, a feeling of being eternally out of place. I feel a certain dysphoria every time I surface from reading some text of anthropology, walk outside, and see cars. What are these metal things? What are these straight, right-angled streets? Everything about modern society strikes me as so artificial and counter to nature that I find it deeply unsettling.
It is curious that dysphoria itself is not discussed more in the psychiatric literature. Certainly a specific form or two receives a great deal of attention, but not the general sense itself.
When things are in place, you feel tranquil and at ease; when things are out of place, you are agitated, always aware of a sense of wanting to crawl out of your own skin. People will try any number of things to turn off the dysphoria; a schizophrenic friend reports that enough alcohol will make the voices stop, at least for a while. Drink until your brain shuts up.
But this is only when things are out of place. Healthy people seek a balance between division and unity. Division of the self is necessary for self-criticism and improvement; people can say, then, “I did a bad thing, but I am not a bad person, so I will change my behavior and be better.” Metacognition allows people to reflect on their behavior without feeling that their self is fundamentally at threat, but too much metacognition leads to fragmentation and an inability to act.
People ultimately seek a balanced, unified sense of self.
My previous blogs have observed that some people –women with bulimia nervosa, for example– have frequent multiple simultaneous experiences, but that multiple experience is not frequent in the general population. …
Consider inner speech. Subjects experienced themselves as innerly talking to themselves in 26% of all samples, but there were large individual differences: some subjects never experienced inner speech; other subjects experienced inner speech in as many as 75% of their samples. The median percentage across subjects was 20%.
I think the best way I can describe my aphantasia is to say that I am unaware of anything in my mind except these categories: i) direct sensory input, ii) “unheard” words that carry thoughts, iii) “unheard” music, iv) a kind of “invisible imagery”, which I can best describe as a sensation of pictures that are in a sense “too faint to see”, v) emotions, and vi) thoughts which seem too “fast” to exist as words. … I see what is around me, unless my eyes are closed, when all is always black. I hear, taste, smell and so forth, but I don’t have the experience people describe of hearing a tune or a voice in their heads. Curiously, I do frequently have a tune going around in my head; all I am lacking is the direct experience of “hearing” it.
The quoted author is, despite his lack of internal imagery, quite intelligent, with a PhD in physics.
I would like to know if there is any correlation between metacognition, ratiocination, and political orientations–I have so far found a little on the subject:
We find a relationship between thinking style and political orientation and that these effects are particularly concentrated on social attitudes. We also find it harder to manipulate intuitive and reflective thinking than a number of prominent studies suggest. Priming manipulations used to induce reflection and intuition in published articles repeatedly fail in our studies. We conclude that conservatives—more specifically, social conservatives—tend to be dispositionally less reflective, social liberals tend to be dispositionally more reflective, and that the relationship between reflection and intuition and political attitudes may be more resistant to easy manipulation than existing research would suggest.
… Berzonsky and Sullivan (1992) cite evidence that individuals higher in reported
self-reflection also exhibit more openness to experience, more liberal values, and more general tolerance for exploration. As noted earlier, conservatives tend to be less open to experience, more intolerant of ambiguity, and generally more reliant on self-certainty than liberals. That, coupled with the evidence reported by Berzonsky and Sullivan, strongly suggests conservatives engage in less introspective behaviors.
Following an interesting experiment looking at people’s online dating profiles, the authors conclude:
Results from our data support the hypothesis that individuals identifying
themselves as “Ultra Conservative” exhibit less introspection in a written passage with personal content than individuals identifying themselves as “Very Liberal”. Individuals who reported a conservative political orientation often provided more descriptive and explanatory statements in their profile’s “About me and who I’m looking for” section (e.g., “I am 62 years old and live part time in Montana” and “I enjoy hiking, fine restaurants”). In contrast, individuals who reported a liberal political orientation often provided more insightful and introspective statements in their narratives (e.g., “No regrets, that’s what I believe in” and “My philosophy in life is to make complicated things simple”).
The ratiocination of the scientist’s mind can ultimately be stopped by delving into that most blessed of substances, reality (or as close to it as we can get). There is, at base, a fundamentally real thing to delve into, a thing which makes ambiguities disappear. Even a moral dilemma can be resolved with good enough data. We do not need to wander endlessly within our own thoughts; the world is here.
Homo sapiens–that is, us modern humans–is about 200,000-300,000 years old. Our ancestor, Homo heidelbergensis, lived in Africa around 700,000-300,000 years ago.
Around 700,000 years ago, another group of humans split off from the main group. By 400,000 years ago, their descendants, Homo neanderthalensis–Neanderthals–had arrived in Europe, and another band of their descendants, the enigmatic Denisovans, arrived in Asia.
While we have found quite a few Neanderthal remains and archaeological sites with tools, hearths, and other artifacts, we’ve uncovered very few Denisovan remains–a couple of teeth, a finger bone, and part of an arm in Denisova Cave, Russia. (Perhaps a few other remains I am unaware of.)
Yet from these paltry remains scientists have extracted enough DNA to ascertain not only that Denisovans were a distinct species, but also that Melanesians, Papuans, and Aborigines derive about 3-6% of their DNA from Denisovan ancestors. (All non-African populations also carry a small amount of Neanderthal DNA, derived from Neanderthal ancestors.)
If Neanderthals and Homo sapiens interbred, and Denisovans and Homo sapiens interbred, did Neanderthals and Denisovans ever mate?
Yes. The girl, affectionately nicknamed Denny, lived and died about 90,000 years ago in Siberia. The remains of an arm bone, found in Denisova Cave, reveal that her mother was a Neanderthal and her father a Denisovan.
We don’t yet know what Denisovans looked like, because we don’t have any complete skeletons of them, much less good skulls to examine, so we don’t know what a Neanderthal-Denisovan hybrid like Denny looked like.
But the fact that we can extract so much information from a single bone–or fragment of bone–that lay preserved in a Siberian cave for 90,000 years is amazing.
We are still far from truly understanding what sorts of people our evolutionary cousins were, but we are gaining new insights all the time.