Racism OCD and Other Political Neuroses 

 

Source: Evangelion/blog thereupon

In his post on the Chamber of Guf, Slate Star Codex discussed a slate of psychiatric conditions where the sufferer becomes obsessed with not sinning in some particular way. In homosexual OCD, for example, the sufferer becomes obsessed with fear that they are homosexual or might have homosexual thoughts despite not actually being gay; people with incest OCD become paranoid that they might have incestuous thoughts, etc. Notice that in order to be defined as OCD, the sufferers have to not actually be gay or interested in sex with their relatives–this is paranoia about a non-existent transgression. Scott also notes that homosexual OCD is less common among people who don’t think of homosexuality as a sin, but these folks have other paranoias instead.

The “angel” in this metaphor is the selection process by which the brain decides which thoughts, out of the thousands we have each day, to focus on and amplify; “Guf” is the store of all available thoughts. Quoting Scott:

I studied under a professor who was an expert in these conditions. Her theory centered around the question of why angels would select some thoughts from the Guf over others to lift into consciousness. Variables like truth-value, relevance, and interestingness play important roles. But the exact balance depends on our mood. Anxiety is a global prior in favor of extracting fear-related thoughts from the Guf. Presumably everybody’s brain dedicates a neuron or two to thoughts like “a robber could break into my house right now and shoot me”. But most people’s Selecting Angels don’t find them worth bringing into the light of consciousness. Anxiety changes the angel’s orders: have a bias towards selecting thoughts that involve fearful situations and how to prepare for them. A person with an anxiety disorder, or a recent adrenaline injection, or whatever, will absolutely start thinking about robbers, even if they consciously know it’s an irrelevant concern.

In a few unlucky people with a lot of anxiety, the angel decides that a thought provoking any strong emotion is sufficient reason to raise the thought to consciousness. Now the Gay OCD trap is sprung. One day the angel randomly scoops up the thought “I am gay” and hands it to the patient’s consciousness. The patient notices the thought “I am gay”, and falsely interprets it as evidence that they’re actually gay, causing fear and disgust and self-doubt. The angel notices this thought produced a lot of emotion and occupied consciousness for a long time – a success! That was such a good choice of thought! It must have been so relevant! It decides to stick with this strategy of using the “I am gay” thought from now on. …
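Scott’s mechanism is, mechanically speaking, weighted sampling plus reinforcement, and a toy simulation shows how quickly the trap closes. Everything below (the thought list, the 1.5 reinforcement multiplier) is an illustrative assumption of mine, not anything from Scott’s post:

```python
import random

# Toy model of the Selecting Angel: thoughts are sampled from the Guf in
# proportion to their weights, and any thought that provokes strong emotion
# gets its weight multiplied, making it more likely to be selected again.
guf = {"what's for dinner": 1.0, "robber in the house": 1.0, "I am gay": 1.0}
provokes_emotion = {"what's for dinner": False,
                    "robber in the house": True,
                    "I am gay": True}
REINFORCEMENT = 1.5  # assumed multiplier for an emotionally "successful" pick

for _ in range(30):  # a month of thought-selection
    thoughts, weights = zip(*guf.items())
    selected = random.choices(thoughts, weights=weights)[0]
    if provokes_emotion[selected]:
        guf[selected] *= REINFORCEMENT  # "That was such a good choice!"

print(guf)  # the neutral thought stays at 1.0; an emotional one dominates
```

Run it a few times: whichever emotional thought happens to get picked early compounds until it crowds out everything else, which is exactly the “success!” feedback loop described above.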

Politics has largely replaced religion as the arena in which most people think about “sin,” and modern memetic structures seem extremely well designed to amplify political sin-based paranoia: articles like “Is your dog’s Halloween costume racist?” get lots of profitable clicks and are shared widely across social media platforms, whether by fans or opponents of the article.

Both religions and political systems have an interest in promoting such concerns, since they also sell the cures–forgiveness and salvation for the religious; economic and social policies for the political. This works best if it targets a very common subset of thoughts, like sexual attraction or dislike of random strangers, because you really can’t prevent all such thoughts, no matter how hard you try.

Medieval illustration of an anchorite cell–the original Tiny House

Personal OCD is bad enough; a religious sufferer obsessed with their own moralistic sin may feel compelled to retreat to a monastery or wall themselves up to avoid temptation. If a whole society becomes obsessed, though, widespread paranoia and social control may result. (Society can probably be modeled as a meta-brain.)

I propose that our society, due to its memetic structure, is undergoing OCD-inducing paranoia spirals where the voices of the most paranoid are being allowed to set political and moral directions. Using racism as an example, it works something like this:

First, we have what I’ll call the Aristotelian Mean State: an appropriate, healthy level of in-group preference that people would not normally call “racism.” This Mean State is characterized by liking and appreciating one’s own culture, generally preferring it to others, but admitting that your culture isn’t perfect and other cultures have good points, too.

Deviating too far from this mean is generally considered sinful–in one direction, we get “My culture is the best and all other cultures should die,” and too far in the other, “All other cultures are best and my culture should die.” One of these is called “racism,” the other “treason.”

When people get Racism OCD, they become paranoid that even innocuous or innocent things–like dog costumes–could be a sign of racism. In this state, people worry about even normal, healthy expressions of ethnic pride, just as a person with homosexual OCD worries about completely normal appreciation of athleticism or admiration of a friend’s accomplishments.

Our culture then amplifies such worries by channeling them through Tumblr and other social media platforms where the argument “What do you mean you’re not against racism?” does wonders to break down resistance and convince everyone that normal, healthy ethnic feelings are abnormal, pathological racism and that sin is everywhere, you must constantly interrogate yourself for sin, you must constantly learn and try harder not to be racist, etc. There is always some new area of life that a Tumblrista can discover is secretly sinful, though you never realized it before, spiraling people into new arenas of self-doubt and paranoia.

As for the rest of the internet, those not predisposed toward Racism OCD are probably predisposed toward Anti-Racism OCD. Just as people with Racism OCD see racism everywhere, folks with Anti-Racism OCD see anti-racism everywhere. These folks think that even normal, healthy levels of not wanting to massacre the outgroup are pathological treason. (This is probably synonymous with Treason OCD, but is currently in a dynamic relationship with the perception that anti-racists are everywhere.)

Since there are over 300 million people in the US alone–not to mention 7 billion in the world–you can always find some case to justify paranoia. You can find people who say they merely have a healthy appreciation for their own culture but really do have murderous attitudes toward the out-group–something the out-group, at least, has good reason to worry about. You can find people who say they have a healthy attitude toward their own group, but still act in ways that could get everyone killed. You can find explicit racists and explicit traitors, and you can find lots of people with amplified, paranoid fears of both.

These two paranoid groups, in turn, can feed off each other, each pointing at the other and screaming that everyone trying to promote “moderatism” is actually a sinner of the other side in disguise, and that moderatism itself is therefore evil. This feedback loop gives us things like the “It’s okay to be white” posters, which manage to make an entirely innocuous statement sound controversial due to our conviction that people only make innocuous statements because they are trying to make the other guy sound like a paranoid jerk who disputes innocuous statements.

Racism isn’t the only sin devolving into OCD–we can also propose Rape OCD, where people become paranoid about behaviors like flirting, kissing, or even thinking about women. There are probably other OCDs (trans OCD? food-contamination OCD?) but these are the big ones coming to mind right now.

Thankfully, Scott also proposes that awareness of our own psychology may allow us to recognize and moderate ourselves:

All of these can be treated with the same medications that treat normal OCD. But there’s an additional important step of explaining exactly this theory to the patient, so that they know that not only are they not gay/a pedophile/racist, but it’s actually their strong commitment to being against homosexuality/pedophilia/racism which is making them have these thoughts. This makes the thoughts provoke less strong emotion and can itself help reduce the frequency of obsessions. Even if it doesn’t do that, it’s at least comforting for most people.

The question, then, is how do we stop our national neuroses from causing disasters?


The Endless Ratiocination of the Dysphoric Mind

Begin

My endless inquiries made it impossible for me to achieve anything. Moreover, I get to think about my own thoughts of the situation in which I find myself. I even think that I think of it, and divide myself into an infinite retrogressive sequence of ‘I’s who consider each other. I do not know at which ‘I’ to stop as the actual, and as soon as I stop, there is indeed again an ‘I’ which stops at it. I become confused and feel giddy as if I were looking down into a bottomless abyss, and my ponderings result finally in a terrible headache. –Møller, Adventures of a Danish Student

Møller’s Adventures of a Danish Student was one of Niels Bohr’s favorite books; it reflected his own difficulties with cycles of ratiocination, in which the mind protects itself against conclusions by watching itself think.

I have noticed a tendency on the left, especially among the academic-minded, to split the individual into sets of mental twins–one who is and one who feels that it is; one who does and one who observes the doing.

Take the categories of “biological sex” and “gender.” Sex is defined as the biological condition of “producing small gametes” (male) or “producing large gametes” (female) for the purpose of sexual reproduction. Thus we can talk about male and female strawberry plants, male and female molluscs, male and female chickens, male and female Homo sapiens.

(Indeed, the male-female binary is remarkably common across sexually reproducing plants and animals–it appears that the mathematics of a third sex simply don’t work out, unless you’re a mushroom. How exactly sex is created varies by species, which makes the stability of the sex-binary all the more remarkable.)

And for the first 299,945 years or so of our existence, most people were pretty happy dividing humanity into “men,” “women,” and the occasional “we’re not sure.” People didn’t understand why or how biology works, but the division was functional enough.

In 1955, John Money decided we needed a new term, “gender,” to describe, as Wikipedia puts it, “the range of characteristics pertaining to, and differentiating between, masculinity and femininity.” Masculinity is further defined as “a set of attributes, behaviors, and roles associated with boys and men;” we can define “femininity” similarly.

So if we put these together, we get a circular definition: gender is a range of characteristics of the attributes of males and females. Note that attributes are already characteristics. They cannot further have characteristics that are not already inherent in themselves.

But really, people invoke “gender” to speak of a sense of self, a self that reflexively looks at itself and perceives itself as possessing traits of maleness or femaleness; the thinker who must think of himself as “male” before he can act as a male. After all, you cannot walk without desiring first to move in a direction; how can you think without first knowing what it is you want to think? It is a cognitive splitting of the behavior of the whole person into two separate, distinct entities–an acting body, possessed of biological sex, and a perceiving mind, that merely perceives and “displays” gender.

But the self that looks at itself looking at itself is not real–it cannot be, for there is only one self. You can look at yourself in the mirror, but you cannot stand outside of yourself and be simultaneously yourself; there is only one you. The alternative, a fractured consciousness, is a symptom of mental disorder and treated with chlorpromazine.

Robert Oppenheimer was once diagnosed with schizophrenia–dementia praecox, as they called it then. Whether he had it or simply confused the therapist by talking about wave/particle dualities is another matter.

Then there are the myriad variants of the claim that men and women “perform femininity” or “display masculinity” or “do gender.” They do not claim that people are feminine or act masculine–such conventional phrasing assumes the existence of a unitary self that is, perceives, and acts. Rather, they posit an inner self that possesses no inherent male or female traits, for whom masculinity and femininity are only created via the interaction of their body and external expectations. In this view, women do not buy clothes because they have some inherent desire to go shopping and buy pretty things, but because society has compelled them to do so in order to comply with an external notion of “what it means to be female.” The self who produces large gametes is not the self who shops.

The biological view of human behavior states that most humans engage in a variety of behaviors because similar behaviors contributed to the evolutionary success of our ancestors. We eat because ancestors who didn’t think eating was important died. We jump back when we see something that looks like a spider because ancestors who didn’t jump got bitten and died. We love cute things with big eyes because they look like babies, and we are descended mostly from people who loved their babies.

Sometimes we do things that we don’t enjoy but rationalize will benefit us, like work for an overbearing boss or wear a burka, but most “masculine” and “feminine” behaviors fall into the category of things people do voluntarily, like “compete at sports” or “gossip with friends.” The fact that more men than women play baseball and more women than men enjoy gossiping with friends has nothing to do with an internal self attempting to perform gender roles and everything to do with the challenges ancestral humans faced in reproducing.

But whence this tendency toward ratiocination? I can criticize it as a physical mistake, but does it reflect an underlying psychological reality? Do some people really perceive themselves as a self separate from themselves, a meta-self watching the first self acting in particular manners?

Here is a study that found that folks with more cognitive flexibility tended to be more socially liberal, though economic conservatism/liberalism didn’t particularly correlate with cognitive flexibility.

I find that if I work hard, I may achieve a state of zen, an inner tranquility in which the endless narrative of thoughts coalesce for a moment and I can just be. Zen is flying down a straight road at 80 miles an hour on a motorcycle; zen is working on a math problem that consumes all of your attention; zen is dancing until you only feel the music. The opposite of zen is lying in bed at 3 AM, staring at the ceiling, thinking of all of your failures, unable to switch off your brain and fall asleep.

Dysphoria is a state of unease. Some people have gender dysphoria; a few report temporal dysphoria. It might be better described as disconnection, a feeling of being eternally out of place. I feel a certain dysphoria every time I surface from reading some text of anthropology, walk outside, and see cars. What are these metal things? What are these straight, right-angled streets? Everything about modern society strikes me as so artificial and counter to nature that I find it deeply unsettling.

It is curious that dysphoria itself is not discussed more in the psychiatric literature. Certainly a specific form or two receives a great deal of attention, but not the general sense itself.

When things are in place, you feel tranquil and at ease; when things are out of place you are agitated, always aware of the sense of crawling out of your own skin. People will try any number of things to turn off the dysphoria; a schizophrenic friend reports that enough alcohol will make the voices stop, at least for a while. Drink until your brain shuts up.

But this is only when things are out of place. Healthy people seek a balance between division and unity. Division of the self is necessary for self-criticism and improvement; people can say, then, “I did a bad thing, but I am not a bad person, so I will change my behavior and be better.” Metacognition allows people to reflect on their behavior without feeling that their self is fundamentally at threat, but too much metacognition leads to fragmentation and an inability to act.

People ultimately seek a balanced, unified sense of self.

It is said that not everyone has an inner voice, a meta-self commenting on the acting self, and some have more than one:

My previous blogs have observed that some people–women with bulimia nervosa, for example–have frequent multiple simultaneous experiences, but that multiple experience is not frequent in the general population. …

Consider inner speech. Subjects experienced themselves as innerly talking to themselves in 26% of all samples, but there were large individual differences: some subjects never experienced inner speech; other subjects experienced inner speech in as many as 75% of their samples. The median percentage across subjects was 20%.

It’s hard to tell what people really experience, but certainly there is a great deal of variety in people’s internal experiences. Much of thought is not easily describable. Some people hear many voices. Some cannot form mental images:

I think the best way I can describe my aphantasia is to say that I am unaware of anything in my mind except these categories: i) direct sensory input, ii) unheard words that carry thoughts, iii) unheard music, iv) a kind of invisible imagery, which I can best describe as a sensation of pictures that are in a sense too faint to see, v) emotions, and vi) thoughts which seem too fast to exist as words. … I see what is around me, unless my eyes are closed when all is always black. I hear, taste, smell and so forth, but I don’t have the experience people describe of hearing a tune or a voice in their heads. Curiously, I do frequently have a tune going around in my head; all I am lacking is the direct experience of hearing it.

The quoted author is, despite his lack of internal imagery, quite intelligent, with a PhD in physics.

Some cannot hear themselves think at all.

I would like to know if there is any correlation between metacognition, ratiocination, and political orientations–I have so far found a little on the subject:

We find a relationship between thinking style and political orientation and that these effects are particularly concentrated on social attitudes. We also find it harder to manipulate intuitive and reflective thinking than a number of prominent studies suggest. Priming manipulations used to induce reflection and intuition in published articles repeatedly fail in our studies. We conclude that conservatives—more specifically, social conservatives—tend to be dispositionally less reflective, social liberals tend to be dispositionally more reflective, and that the relationship between reflection and intuition and political attitudes may be more resistant to easy manipulation than existing research would suggest.

And a bit more:

… Berzonsky and Sullivan (1992) cite evidence that individuals higher in reported self-reflection also exhibit more openness to experience, more liberal values, and more general tolerance for exploration. As noted earlier, conservatives tend to be less open to experience, more intolerant of ambiguity, and generally more reliant on self-certainty than liberals. That, coupled with the evidence reported by Berzonsky and Sullivan, strongly suggests conservatives engage in less introspective behaviors.

Following an interesting experiment looking at people’s online dating profiles, the authors conclude:

Results from our data support the hypothesis that individuals identifying themselves as “Ultra Conservative” exhibit less introspection in a written passage with personal content than individuals identifying themselves as “Very Liberal”. Individuals who reported a conservative political orientation often provided more descriptive and explanatory statements in their profile’s “About me and who I’m looking for” section (e.g., “I am 62 years old and live part time in Montana” and “I enjoy hiking, fine restaurants”). In contrast, individuals who reported a liberal political orientation often provided more insightful and introspective statements in their narratives (e.g., “No regrets, that’s what I believe in” and “My philosophy in life is to make complicated things simple”).

The ratiocination of the scientist’s mind can ultimately be stopped by delving into that most blessed of substances, reality (or as close to it as we can get). There is, at base, a fundamentally real thing to delve into, a thing which makes ambiguities disappear. Even a moral dilemma can be resolved with good enough data. We do not need to wander endlessly within our own thoughts; the world is here.

End

 

Neuropolitics: “Openness” and Cortical Thickness

Brain anatomy–gyri

I ran across an interesting study today, on openness, creativity, and cortical thickness.

The psychological trait of “openness”–that is, willingness to try new things or experiences–correlates with other traits like creativity and political liberalism. (This might be changing as cultural shifts are changing what people mean by “liberalism,” but it was true a decade ago and is still statistically true today.)

Researchers took a set of 185 intelligent people studying or employed in STEM, gave them personality tests intended to measure “openness,” and then scanned their brains to measure cortical thickness in various areas.

According to Citizendium, “Cortical thickness” is:

a brain morphometric measure used to describe the combined thickness of the layers of the cerebral cortex in mammalian brains, either in local terms or as a global average for the entire brain. Given that cortical thickness roughly correlates with the number of neurons within an ontogenetic column, it is often taken as indicative of the cognitive abilities of an individual, albeit the latter are known to have multiple determinants.

According to the article in PsyPost, reporting on the study:

“The key finding from our study was that there was a negative correlation between Openness and cortical thickness in regions of the brain that underlie memory and cognitive control. This is an interesting finding because typically reduced cortical thickness is associated with decreased cognitive function, including lower psychometric measures of intelligence,” Vartanian told PsyPost.

Citizendium explains some of the issues associated with cortices that are too thin or too thick:

Typical values in adult humans are between 1.5 and 3 mm, and during aging, a decrease (also known as cortical thinning) on the order of about 10 μm per year can be observed [3]. Deviations from these patterns can be used as diagnostic indicators for brain disorders: While Alzheimer’s disease, even very early on, is characterized by pronounced cortical thinning[4], Williams syndrome patients exhibit an increase in cortical thickness of about 5-10% in some regions [5], and lissencephalic patients show drastic thickening, up to several centimetres in occipital regions[6].

Obviously people with Alzheimer’s have difficulty remembering things, but people with Williams Syndrome also tend to be low-IQ and have difficulty with memory.

Of course, the cortex is a big region, and it may matter specifically where yours is thin or thick. In this study, the thinness was found in the left middle frontal gyrus, left middle temporal gyrus, left superior temporal gyrus, left inferior parietal lobule, right inferior parietal lobule, and right middle temporal gyrus.

These are areas that, according to the study’s authors, have previously been shown to be activated during neuroimaging studies of creativity–precisely the places where you would expect to see some kind of anatomical difference in particularly creative people.

Hypothetically, maybe reduced cortical thickness, in some people, makes them worse at remembering specific kinds of experiences–and thus more likely to try new ones. For example, if I remember very strongly that I like Tomato Sauce A, and that I hate Tomato Sauce B, I’m likely to just keep buying A. But if every time I go to the store I only have a vague memory that there was a tomato sauce I really liked, I might just pick sauces at random–eventually trying all of them.
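To make this concrete, here is a minimal sketch (my own toy model with made-up forgetting probabilities, not anything from the study): a shopper who reliably remembers a favorite keeps re-buying it, while a shopper with fuzzy memory keeps grabbing sauces at random and ends up sampling most of the shelf.

```python
import random

SAUCES = list("ABCDEFGH")  # eight hypothetical tomato sauces

def sauces_tried(forget_prob: float, trips: int = 50) -> int:
    """Count distinct sauces a shopper tries over many store trips."""
    favorite = random.choice(SAUCES)
    tried = {favorite}
    for _ in range(trips):
        if random.random() < forget_prob:
            favorite = random.choice(SAUCES)  # can't recall; pick at random
        tried.add(favorite)
    return len(tried)

print("strong memory:", sauces_tried(forget_prob=0.05))  # usually 2-4 sauces
print("fuzzy memory: ", sauces_tried(forget_prob=0.60))  # usually all 8
```

In this sketch, weaker memory for past experiences functions exactly like an exploration bonus.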

The authors have a different interpretation:

“We believe that the reason why Openness is associated with reduced cortical thickness is that this condition reduces the person’s ability to filter the contents of thought, thereby facilitating greater immersion in the sensory, cognitive, and emotional information that might otherwise have been filtered out of consciousness.”

So, less meta-brain, more direct experience? Less worrying, more experiencing?

The authors note a few problems with the study (for starters, it is hardly a representative sample of either “creative” people or exceptional geniuses, being limited to people in STEM), but it is still an interesting piece of data and I hope to see more like it.

 

If you want to read more about brains, I recommend Kurzweil’s How to Create a Mind, which I am reading now. It goes into some detail on relevant brain structures, and how they work to create memories, recognize patterns, and let us create thought. (Incidentally, the link goes to Amazon Smile, which raises money for charity; I selected St. Jude’s.)

The Modular Mind

The other day I was walking through the garden when I looked down, saw one of these, leapt back, and screamed loudly enough to notify the entire neighborhood:

(The one in my yard was insect free, however.)

After catching my breath, I wondered, “Is that a wasp nest or a beehive?” and crept back for a closer look. Wasp nest. I mentally paged through my knowledge of wasp nests: wasps abandon nests when they fall on the ground. This one was probably empty and safe to step past. I later tossed it onto the compost pile.

The interesting part of this incident wasn’t the nest, but my reaction. I jumped away from the thing before I had even consciously figured out what the nest was. Only once I was safe did I consciously think about the nest.

So I’ve been reading Gazzaniga’s Who’s in Charge? Free Will and the Science of the Brain. (I’m thinking of making this a Book Club pick; debating between this and Kurzweil’s How to Create a Mind: The Secrets of Human Thought Revealed, which I have not read, but which comes recommended. Feel free to vote for one, the other, or both.)

Gazzaniga discusses a problem faced by brains trying to evolve to be bigger and smarter: how do you get more neurons working without taking up an absurd amount of space connecting each and every neuron to every other neuron?

Imagine a brain with 5 connected neurons: each neuron requires 4 connections to talk to every other neuron. A 5 neuron brain would thus need space for 10 total connections.

The addition of a 6th neuron would require 5 new connections; a 7th neuron requires 6 new connections, etc. A fully connected brain of 100 neurons would require 99 connections per neuron, for a total of 4,950 connections.

The human brain has about 86 billion neurons.

Connecting all of your neurons might work fine if you’re a sea squirt, with only 230 or so neurons, but it is going to fail hard if you’re trying to hook up 86 billion. The space required to hook up all of these neurons would be massively larger than the space you can actually maintain by eating.
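The underlying arithmetic is the handshake formula: a fully connected network of n neurons needs n(n-1)/2 connections. A few lines (my sketch, not Gazzaniga’s) make the blow-up concrete:

```python
def full_connections(n: int) -> int:
    """Connections needed to wire every neuron to every other: n*(n-1)/2."""
    return n * (n - 1) // 2

# 5 neurons need 10 wires; a 6th neuron adds 5 more; 100 need 4,950
# (as above); a sea squirt's ~230 neurons need ~26,000; fully wiring
# 86 billion neurons would take roughly 3.7 * 10^21 connections.
for n in (5, 6, 100, 230, 86_000_000_000):
    print(f"{n:>14,} neurons -> {full_connections(n):,} connections")
```

Connection count grows with the square of the neuron count while the neurons themselves grow only linearly; past a few hundred neurons, full connectivity is physically hopeless.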

So how does an organism evolving to be smarter deal with the connectivity demands of increasing brain size?

Human social lives suggest an answer: at the human scale, one person can, Dunbar estimates, have functional social relationships with about 150 other people, including an understanding of those people’s relationships with each other. 150 people (the “Dunbar number”) is therefore the number of people who can reliably cooperate or form groups without requiring any top-down organization.

So how do humans survive in groups of a thousand, a million, or a billion (eg, China)? How do we build large-scale infrastructure projects requiring the work of thousands of people and used by millions, like interstate highways? By organization–that is, specialization.

In a small tribe of 150 people, almost everyone in the tribe can do most of the jobs necessary for the tribe’s survival, within the obvious limits of biology. Men and women are both primarily occupied with collecting food. Both prepare clothing and shelter; both can cook. There is some specialization of labor–obviously men can carry heavier loads; women can nurse children–but most people are generally competent at most jobs.

In a modern industrial economy, most people are completely incompetent at most jobs. I have a nice garden, but I don’t even know how to turn on a tractor, much less how to care for a cow. The average person does not know how to knit or sew, much less build a house, wire up the electricity and lay the plumbing. We attend school from 5 to 18 or 22 or 30 and end up less competent at surviving in our own societies than a cave man with no school was in his, not because school is terrible but because modern industrial society requires so much specialized knowledge to keep everything running that no one person can truly master even a tenth of it.

Specialization, not just of people but of organizations and institutions, like hospitals devoted to treating the sick, Walmarts devoted to selling goods, and Microsoft devoted to writing and selling computer software and hardware, lets society function without requiring that everyone learn to be a doctor, merchant, and computer expert.


Similarly, brains expand their competence via specialization, not denser neural connections.

As UPI reports in “Intelligence is correlated with fewer neural connections, not more, study finds”:

The smartest people may boast more neurons than those of average intelligence, but their brains have fewer neural connections…

Neuroscientists in Germany recruited 259 participants, both men and women, to take IQ tests and have their brains imaged…

The research revealed a strong correlation between the number of dendrites in a person’s cerebral cortex and their intelligence. The smartest participants had fewer neural connections in their cerebral cortex.

Fewer neural connections overall allows different parts of the brain to specialize, increasing local competence.

All things are produced more plentifully and easily and of a better quality when one man does one thing that is natural to him and does it at the right time, and leaves other things. –Plato, The Republic

The brains of mice, as Gazzaniga discusses, do not need to be highly specialized, because mice are not very smart and do not do many specialized activities. Human brains, by contrast, are highly specialized, as anyone who has ever had a stroke has discovered. (Henry Harpending of West Hunter, for example, once had a stroke while visiting Germany that knocked out the area of his brain responsible for reading, but since he couldn’t read German in the first place, he didn’t realize anything was wrong until several hours later.)

I read, about a decade ago, that male and female brains have different levels, and patterns, of internal connectivity. (Here and here are articles on the subject.) These differences in connectivity may allow men and women to excel at different skills, and since we humans are a social species that can communicate by talking, this allows us to take cognitive modality beyond the level of a single brain.

So modularity lets us learn (and do) more things, with the downside that sometimes knowledge is highly localized–that is, we have a lot of knowledge that we seem able to access only under specific circumstances, rather than use generally.

For example, I have long wondered at the phenomenon of people who can definitely do complicated math when asked to, but show no practical number sense in everyday life–like the folks from the Yale Philosophy department who are confused about why African Americans are under-represented in their major, even though Yale has an African American Studies department which attracts a disproportionate percentage of Yale’s African American students. The mathematical certainty that if one major attracts more than its share of African American students, then other majors must end up with fewer, has been lost on these otherwise bright minds.

Yalies are not the only folks who struggle to use the things they know. When asked to name a book–any book–ordinary people failed. Surely these people have heard of a book at some point in their lives–the Bible is pretty famous, as is Harry Potter. Even if you don’t like books, they were assigned in school, and your parents probably read The Cat in the Hat and Green Eggs and Ham to you when you were a kid. It is not that they do not have the knowledge; it is that they cannot access it.

Teachers complain all the time that students–even very good ones–can memorize all of the information they need for a test, regurgitate it all perfectly, and then turn around and show no practical understanding of the information at all.

Richard Feynman wrote eloquently of his time teaching future science teachers in Brazil:

In regard to education in Brazil, I had a very interesting experience. I was teaching a group of students who would ultimately become teachers, since at that time there were not many opportunities in Brazil for a highly trained person in science. These students had already had many courses, and this was to be their most advanced course in electricity and magnetism – Maxwell’s equations, and so on. …

I discovered a very strange phenomenon: I could ask a question, which the students would answer immediately. But the next time I would ask the question – the same subject, and the same question, as far as I could tell – they couldn’t answer it at all! For instance, one time I was talking about polarized light, and I gave them all some strips of polaroid.

Polaroid passes only light whose electric vector is in a certain direction, so I explained how you could tell which way the light is polarized from whether the polaroid is dark or light.

We first took two strips of polaroid and rotated them until they let the most light through. From doing that we could tell that the two strips were now admitting light polarized in the same direction – what passed through one piece of polaroid could also pass through the other. But then I asked them how one could tell the absolute direction of polarization, for a single piece of polaroid.

They hadn’t any idea.

I knew this took a certain amount of ingenuity, so I gave them a hint: “Look at the light reflected from the bay outside.”

Nobody said anything.

Then I said, “Have you ever heard of Brewster’s Angle?”

“Yes, sir! Brewster’s Angle is the angle at which light reflected from a medium with an index of refraction is completely polarized.”

“And which way is the light polarized when it’s reflected?”

“The light is polarized perpendicular to the plane of reflection, sir.” Even now, I have to think about it; they knew it cold! They even knew the tangent of the angle equals the index!

I said, “Well?”

Still nothing. They had just told me that light reflected from a medium with an index, such as the bay outside, was polarized; they had even told me which way it was polarized.

I said, “Look at the bay outside, through the polaroid. Now turn the polaroid.”

“Ooh, it’s polarized!” they said.

After a lot of investigation, I finally figured out that the students had memorized everything, but they didn’t know what anything meant. When they heard “light that is reflected from a medium with an index,” they didn’t know that it meant a material such as water. They didn’t know that the “direction of the light” is the direction in which you see something when you’re looking at it, and so on. Everything was entirely memorized, yet nothing had been translated into meaningful words. So if I asked, “What is Brewster’s Angle?” I’m going into the computer with the right keywords. But if I say, “Look at the water,” nothing happens – they don’t have anything under “Look at the water”!

The students here are not dumb, and memorizing things is not bad–memorizing your times tables is very useful–but they have everything lodged in their “memorization module” and nothing in their “practical experience module.” (Note: I am not necessarily suggesting that there exists a literal, physical spot in the brain where memorized and experienced knowledge reside, but that certain brain structures and networks lodge information in ways that make it easier or harder to access.)

People frequently make arguments that don’t make logical sense when you think them all the way through from start to finish, but do make sense if we assume that people are using specific brain modules for quick reasoning and don’t necessarily cross-check their results with each other. For example, when we are angry because someone has done something bad to us, we tend to snap at people who had nothing to do with it. Our brains are in “fight and punish mode” and latch on to the nearest person as the person who most likely committed the offense, even if we consciously know they weren’t involved.

Political discussions are often marred by folks running what ought to be logical arguments through status signaling, emotional, or tribal modules. The desire to see Bad People punished (a reasonable desire if we all lived in the same physical community with each other) interferes with a discussion of whether said punishment is actually useful, effective, or just. For example, a man who has been incorrectly convicted of the rape of a child will have a difficult time getting anyone to listen sympathetically to his case.

In the case of white South African victims of racially-motivated murder, the notion that their ancestors did wrong and therefore they deserve to be punished often overrides sympathy. As BBC notes, these killings tend to be particularly brutal (they often involve torture) and targeted, but the South African government doesn’t care:

According to one leading political activist, Mandla Nyaqela, this is the after-effect of the huge degree of selfishness and brutality which was shown towards the black population under apartheid. …

Virtually every week the press here report the murders of white farmers, though you will not hear much about it in the media outside South Africa. In South Africa you are twice as likely to be murdered if you are a white farmer than if you are a police officer – and the police here have a particularly dangerous life. The killings of farmers are often particularly brutal. …

Ernst Roets’s organisation has published the names of more than 2,000 people who have died over the last two decades. The government has so far been unwilling to make solving and preventing these murders a priority. …

There used to be 60,000 white farmers in South Africa. In 20 years that number has halved.

The Christian Science Monitor reports on the measures ordinary South Africans have to take in what was once a safe country to avoid becoming human shish kebabs, which you should pause and read, but which is a bit of a tangent from our present discussion. The article ends with a mind-bending statement about a borrowed dog (dogs are also important for security):

My friends tell me the dog is fine around children, but is skittish around men, especially black men. The people at the dog pound told them it had probably been abused. As we walk past house after house, with barking dog after barking dog, I notice Lampo pays no attention. Instead, he’s watching the stream of housekeepers and gardeners heading home from work. They eye the dog nervously back.

Great, I think, I’m walking a racist dog.

Module one: Boy, South Africa has a lot of crime. Better get a dog, cover my house with steel bars, and install an extensive security system.

Module two: Associating black people with crime is racist, therefore my dog is racist for being wary of people who look like the person who abused it.

And while some people are obviously sympathetic to the plight of murdered people, “Cry me a river, White South African Colonizers” is a very common reaction. (Never mind that the people committing crimes in South Africa today never lived under apartheid; they’ve lived in a black-run country for their entire lives.) Logically, white South Africans did not do anything to deserve being killed, and, as with the goose that laid the golden eggs, killing the people who produce food will just trigger a repeat of Zimbabwe–but the modules of tribalism (“I do not care about these people because they are not mine and I want their stuff”) and punishment (“I read about a horrible thing someone did, so I want to punish everyone who looks like them”) trump logic.

Who dies–and how they die–significantly shapes our engagement with the news. Gun deaths via mass shootings get much more coverage and worry than ordinary homicides, even though ordinary homicides are far more common. Homicides get more coverage and worry than suicides, even though suicides are far more common. The majority of gun deaths are actually suicides, but you’d never know that from listening to our national conversation about guns, simply because we are biased to worry far more about other people killing us than about killing ourselves.

Similarly, the death of one person via volcano receives about the same news coverage as 650 in a flood, 2,000 in a drought, or 40,000 in a famine. As the article notes:

Instead of considering the objective damage caused by natural disasters, networks tend to look for disasters that are “rife with drama”, as one New York Times article put it—hurricanes, tornadoes, forest fires, earthquakes all make for splashy headlines and captivating visuals. Thanks to this selectivity, less “spectacular” but oftentimes more deadly natural disasters tend to get passed over. Food shortages, for example, result in the most casualties and affect the most people per incident, but their onset is more gradual than that of a volcanic explosion or sudden earthquake. … This bias for the spectacular is not only unfair and misleading, but also has the potential to misallocate attention and aid.

There are similar biases by continent, with disasters in Africa receiving less attention than disasters in Europe (this correlates with African disasters being more likely to be the slow-motion famines, epidemics and droughts that kill lots of people, and European disasters being splashier, though perhaps we’d consider famines “splashier” if they happened in Paris instead of Ethiopia.)

From Personality and Political Attitudes: “Conservatives are hard-working, organized, closed-minded, and emotionally stable. Liberals are lazy, disorganized, open-minded, and neurotic. Let’s see how the punditocracy spins that one.”

From a neuropolitical perspective, I suspect that patterns such as the Big Five personality traits correlating with particular political positions (“openness” with “liberalism,” for example, or “conscientiousness” with “conservativeness”) are caused by patterns of brain activity that lead some people to depend more or less on particular brain modules for processing.

For example, conservatives process more of the world through the areas of their brain that are also used for processing disgust (not one of “the five,” but still an important psychological trait), which increases their fear of pathogens, disease vectors, and generally anything new or from the outside. Disgust can go so far as to process other people’s faces or body language as “disgusting” (eg, trans people) even when there is objectively no actual contamination or pathogenic risk involved.

Similarly, people who feel more guilt in one area of their life often feel guilt in others–eg, “White guilt was significantly associated with bulimia nervosa symptomatology.” The arrow of causation is unclear–guilt about eating might spill over into guilt about existing, or guilt about existing might cause guilt about eating, or people who generally feel guilty about everything could have both. Either way, these people are generally not logically reasoning, “Whites have done bad things, therefore I should starve myself.” (Should veganism be classified as a politically motivated eating disorder?)

I could continue forever–

Restrictions on medical research are biased toward preventing mentally salient incidents like thalidomide babies, but blind to the invisible cost of children who die from diseases that could have been cured had research not been prevented by regulations.

America has a large Somali community but not a large Congolese one (85,000 Somalis vs. 13,000 Congolese, of whom 10,000 hail from the DRC; Somalia has about 14 million people and the DRC about 78.7 million, so it’s not due to there being more Somalis in the world), for no particular reason I’ve been able to discover, other than that President Clinton once disastrously sent a few helicopters to intervene in the eternal Somali civil war and so the government decided that we now have a special obligation to take in Somalis.

–but that’s probably enough.

I have tried here to present a balanced account of different political biases, but I would like to end by noting that modular thinking, while it can lead to stupid decisions, exists for good reasons. If purely logical thinking were superior to modular, we’d probably be better at it. Still, cognitive biases exist and lead to a lot of stupid or sub-optimal results.

Apparently Most People Live in A Strange Time Warp Where Neither Past nor Future Actually Exist

Forget the Pirahã. It appears that most Americans are only vaguely aware of these things called “past” and “future”:

Source: CNN poll conducted by SSRS

A majority of people now report that George W. Bush–a president they once considered a colossal failure, whose approval rating bottomed out at 33% when he left office–was actually good. By what measure? He broke the economy, destabilized the Middle East, spent trillions of dollars, and got thousands of Americans and Iraqis killed.

Apparently the logic here is “Sure, Bush might have murdered Iraqi children and tortured prisoners, but at least he didn’t call Haiti a shithole.” We Americans have standards, you know.

He’s just a huggable guy.

I’d be more forgiving if Bush’s good numbers all came from 18 year olds who were 10 when he left office and so weren’t actually paying attention at the time. I’d also be more forgiving if Bush had some really stupid scandals, like Bill Clinton–I can understand why someone might have given Clinton a bad rating in the midst of the Monica Lewinsky scandal, but looking back a decade later, one might reflect that Monica didn’t matter that much and, as far as presidents go, Clinton was fine.

But if you thought invading Iraq was a bad idea back in 2008 then you ought to STILL think it is a bad idea right now.

Note: If you thought it was a good idea at the time, then it’s sensible to think it is still a good idea.

This post isn’t really about Bush. It’s about our human inability to perceive the flow of time and accurately remember the past and prepare for the future.

I recently texted a fellow mom: Would your kid like to come play with my kid? She texted back: My kid is down for a nap.

AND?

What about when the nap is over? I didn’t specify a time in the original text; tomorrow or next week would have been fine.

I don’t think these folks are trying to avoid me. They’re just really bad at scheduling.

People are especially bad at projecting current trends into the future. In a conversation with a liberal friend, he dismissed the idea that there could be any problems with demographic trends or immigration with, “That won’t happen for a hundred years. I’ll be dead then. I don’t care.”

An anthropologist working with the Bushmen noticed that they had to walk a long way each day between the watering hole, where the only water was, and the nut trees, where the food was. “Why don’t you just plant a nut tree near the watering hole?” asked the anthropologist.

“Why bother?” replied a Bushman. “By the time the tree was grown, I’d be dead.”

Of course, the tree would probably only take a decade to start producing, which is within even a Bushman’s lifetime, but even if it didn’t, plenty of people build up wealth, businesses, or otherwise make provisions to provide for their children–or grandchildren–after their deaths.

Likewise, current demographic trends in the West will have major effects within our lifetimes. Between the 1990 and 2010 censuses (twenty years), the number of Hispanics in the US doubled, from 22.4 million to 50.5 million. As a percent of the overall population, they went from 9% to 16%–making them America’s largest minority group, as blacks constitute only 12.6%.

If you’re a Boomer, then Hispanics were only 2-3% of the country during your childhood.

The idea that demographic changes will take a hundred years and therefore don’t matter makes as much sense as saying a tree that takes ten years to grow won’t produce within your lifetime and therefore isn’t worth planting.

Society can implement long term plans–dams are built with hundred year storms and floods in mind; building codes are written with hundred year earthquake risks in mind–but most people seem to exist in a strange time warp in which neither the past nor future really exist. What they do know about the past is oddly compressed–anything from a decade to a century ago is mushed into a vague sense of “before now.” Take this article from the Atlantic on how Michael Brown (born in 1996) was shot in 2014 because of the FHA’s redlining policies back in 1943.

I feel like I’m beating a dead horse at this point, but one of the world’s most successful ethnic groups was getting herded into gas chambers in 1943. Somehow the Jews managed to go from being worked to death in the mines below Buchenwald (slave labor dug the tunnels where von Braun’s rockets were developed) to not getting shot by the police on the streets of Ferguson in 2014, 71 years later. It’s a mystery.

And in another absurd case, “Artist reverses gender roles in 50s ads to ‘give men a taste of their own sexist poison’,” because clearly advertisements from over half a century ago are a pressing issue, relevant to the opinions of modern men.

I’m focusing here on political matters because they make the news, but I suspect this is a true psychological trait for most people–the past blurs fuzzily together, and the future is only vaguely knowable.

Politically, there is a tendency to simultaneously assume the past–which continued until last Tuesday–was a long, dark, morass of bigotry and unpleasantness, and that the current state of enlightened beauty will of course continue into the indefinite future without any unpleasant expenditures of effort.

In reality, our species is, more or less, 300,000 years old. Agriculture is only 10,000 years old.

100 years ago, the last great bubonic plague epidemic (Yersinia pestis) was still going on. 10 million people died, including 119 Californians. 75 years ago, millions of people were dying in WWII. Sixty years ago, polio was still crippling children (my father caught it, suffering permanent nerve damage).

In the 1800s, Germany’s infant mortality rate was 50%; in 1950, Europe’s rate was over 10%; today, infant mortality in the developed world is below 0.5%; globally, it’s 4.3%. The death of a child has gone from a universal hardship to an almost unknown suffering.

100 years ago, only one city in the US–Jersey City–routinely disinfected its drinking water. (Before disinfection and sewers, drinking water was routinely contaminated with deadly bacteria like cholera.) I’m still looking for data on the spread of running water, but chances are good your grandparents did not have an indoor toilet when they were children. (I have folks in my extended family who still have trouble when the water table drops and their well dries up.)

Hunger, famines, disease, death… I could continue enumerating, but my point is simple: the prosperity we enjoy is not only unprecedented in the course of human history, but it hasn’t even existed for one full human lifetime.

Rome was once an empire. In the year one hundred, the Eternal City had over 1,500,000 citizens. By 500, it had fewer than 50,000. It would not recover for over a thousand years.

Everything we have can be wiped away in another human lifetime if we refuse to admit that the future exists.

Conservatives Over-Generalize; Liberals Under-Generalize

This is a theory about a general trend.

Liberals tend to be very good at learning specific, detailed bits of information, but bad at big-picture ideas. Conservatives tend to be good at big-picture ideas, but bad at specific details. In other words, liberals are the guys who can’t see the forest for the trees, while conservatives have a habit of referring to all trees as “oaks.”

Or all sodas as Cokes:

(Map: “pop” vs. “soda” vs. “Coke” across US counties)

Waitress: What would y’all like to drink?
Lady: Oh, I’ll have a Coke.
Waitress: All right, what kind of Coke?
Lady: Diet Pepsi.

When conservatives speak of general trends among people, liberals are prone to protesting that “Not all X are like that.” For liberals, the fact that one guy might be an exception to a general trend is important enough to mention in any discussion. Liberals who want to un-gender pregnancy discussions, because “men can get pregnant, too,” are a perfect example of this. (See my previous post about TERFs.)

This post was inspired by a friend’s complaint that “Trump keeps saying untrue things,” to which I responded that Hillary also says lots of untrue things. It seems to me that there is a distinct pattern in the kinds of untruths each camp engages in.


If you ask the average conservative to define the races of man, he’d probably tell you: black, white, and Asian. Give him a crayon and a map of the world, and he’d probably produce something like this:

Ask the average liberal to define the races of man, and he’ll tell you that race is a social construct and that there’s more genetic variation within races than between them.

Diagram of Trans-species polymorphisms, from Evo and Proud

Both of these statements are basically correct (but see here), but in different ways. The Conservative misses the within-racial variety (and may draw the racial borders incorrectly, eg, assuming that north Africans or Australians are Black). And the Liberal misses that race is actually a real thing, and that the between-group vs. within-group variation issue also holds true for different species (see: species is a social construct)–and yet we still recognize that “dog” is a useful word for describing a real category of things we encounter in the real world.

Conservatives are prone to saying things like, “Blacks commit more crime than whites,” and liberals are prone to responding that the majority of black people aren’t criminals.

I find that it helps a lot in understanding people if I give them the benefit of the doubt and try to understand what they mean, rather than get hung up on the exact words they use.

NBC perhaps went too far down this path when they claimed that Trump had lied for saying Clinton “acid washed” her email server, when in fact she had used an app called BleachBit. Sure, bleach is a weak base, not an acid, but I don’t think Trump was actually trying to discuss chemistry in this case.

When the newsmedia claimed that the Syrian refugees pouring into Germany would be “good for the German economy,” this was obviously false. Yes, some Syrians are exceptionally bright, hardworking, motivated people who will do their best to benefit their new home. But most refugees are traumatized and don’t speak the local language. Few people would argue that the Syrian educational system turns out grads with test scores equal to the German system. It’s one thing to take refugees for pure humanitarian reasons, because you care about them as people. It’s another thing to pretend that refugees are going to make the average German richer. They won’t.

When Trump says there is so much wrong with black communities, so much poverty and violence, he is, broadly speaking, correct. When Hillary says there is so much good in black communities, like black businesses and churches, she is, narrowly speaking, also correct.

Of course, as Conway et al. caution [warning: PDF]:

Prior research suggests that liberals are more complex than conservatives. However, it may be that liberals are not more complex in general, but rather only more complex on certain topic domains (while conservatives are more complex in other domains). Four studies (comprised of over 2,500 participants) evaluated this idea. … By making only small adjustments to a popularly used dogmatism scale, results show that liberals can be significantly more dogmatic if a liberal domain is made salient. Studies 2–4 involve the domain specificity of integrative complexity. A large number of open-ended responses from college students (Studies 2 and 3) and candidates in the 2004 Presidential election (Study 4) across an array of topic domains reveals little or no main effect of political ideology on integrative complexity, but rather topic domain by ideology interactions. Liberals are higher in complexity on some topics, but conservatives are higher on others.

Weight, Taste, and Politics: A Theory of Republican Over-Indulgence

So I was thinking about taste (flavor) and disgust (emotion.)

As I mentioned about a month ago, 25% of people are “supertasters,” that is, better at tasting than the other 75% of people. Supertasters experience flavors more intensely than ordinary tasters, resulting in a preference for “bland” food (food with too much flavor is “overwhelming” to them.) They also have a more difficult time getting used to new foods.

One of my work acquaintances of many years–we’ll call her Echo–is obese, constantly on a diet, and constantly eats sweets. She knows she should eat vegetables and tries to do so, but finds them bitter and unpleasant, and so the general outcome is as you would expect: she doesn’t eat them.

Since I find most vegetables quite tasty, I find this attitude very strange–but I am willing to admit that I may be the one with unusual attitudes toward food.

Echo is also quite conservative.

This got me thinking about vegetarians vs. people who think vegetarians are crazy. Why (aside from novelty of the idea) should vegetarians be liberals? Why aren’t vegetarians just people who happen to really like vegetables?

What if there were something in preference for vegetables themselves that correlated with political ideology?

Certainly we can theorize that “supertaster” => “vegetables taste bitter” => “dislike of vegetables” => “thinks vegetarians are crazy.” (Some supertasters might think meat tastes bad, but anecdotal evidence doesn’t support this.) See also Wikipedia, where supertasting is clearly associated with responses to plants:

Any evolutionary advantage to supertasting is unclear. In some environments, heightened taste response, particularly to bitterness, would represent an important advantage in avoiding potentially toxic plant alkaloids. In other environments, increased response to bitterness may have limited the range of palatable foods. …

Although individual food preference for supertasters cannot be typified, documented examples for either lessened preference or consumption include: […]

Mushrooms? Echo was just complaining about mushrooms.

Let’s talk about disgust. Disgust is an important reaction to things that might infect or poison you, triggering reactions from scrunching up your face to vomiting (ie, expelling the poison.) We process disgust in our amygdalas, and some people appear to have bigger or smaller amygdalas than others, with the result that the folks with larger amygdalas feel more disgust.

Humans also route a variety of social situations through their amygdalas, resulting in the feeling of “disgust” in response to things that are not rotten food, like other people’s sexual behaviors, criminals, or particularly unattractive people. People with larger amygdalas also tend to find more human behaviors disgusting, and this disgust correlates with social conservatism.

To what extent are “taste” and “disgust” independent of each other? I don’t know; perhaps they are intimately linked into a single feedback system, where disgust and taste sensitivity cause each other, or perhaps they are relatively independent, so that a few unlucky people are both super-sensitive to taste and easily disgusted.

People who find other people’s behavior disgusting and off-putting may also be people who find flavors overwhelming, prefer bland or sweet foods over bitter ones, think vegetables are icky, vegetarians are crazy, and struggle to stay on diets.

What’s that, you say, I’ve just constructed a just-so story?

Well, this is the part where I go looking for evidence. It turns out that obesity and political orientation do correlate:

Michael Shin and William McCarthy, researchers from UCLA, have found an association between counties with higher levels of support for the 2012 Republican presidential candidate and higher levels of obesity in those counties.

Shin and McCarthy’s map of obesity vs. political orientation

Looks like the Mormons and Southern blacks are outliers.

(I don’t really like maps like this for displaying data; I would much prefer a simple graph showing orientation on one axis and obesity on the other, with each county as a datapoint.)
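For instance, here is a minimal sketch of the kind of plot I mean, in Python with matplotlib; the data file and column names are hypothetical, since I don’t have the underlying county-level dataset:

```python
# Sketch: one point per county, Republican vote share on the x-axis,
# obesity rate on the y-axis. The file and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

counties = pd.read_csv("county_data.csv")  # hypothetical dataset

plt.scatter(counties["gop_vote_share"], counties["obesity_rate"],
            s=10, alpha=0.5)
plt.xlabel("Republican vote share, 2012 (%)")
plt.ylabel("Adult obesity rate (%)")
plt.title("Obesity vs. political orientation, by county")
plt.show()
```

A plot like this would show outliers (such as the Mormon and Southern black counties) as clusters of off-trend points, instead of making the reader eyeball two map colorings against each other.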

(Unsurprisingly, the first 49 hits I got when searching for correlations between political orientation and obesity were almost all about what other people think of fat people, not what fat people think. This is probably because researchers tend to be skinny people who want to fight “fat phobia” but aren’t actually interested in the opinions of fat people.)

The 15 most caffeinated cities, from I love Coffee–note that Phoenix is #7, not #1.

Disgust also correlates with political belief, but we already knew that.

A not entirely scientific survey also indicates that liberals seem to like vegetables more than conservatives do:

  • Liberals are 28 percent more likely than conservatives to eat fresh fruit daily, and 17 percent more likely to eat toast or a bagel in the morning, while conservatives are 20 percent more likely to skip breakfast.
  • Ten percent of liberals surveyed indicated they are vegetarians, compared with 3 percent of conservatives.
  • Liberals are 28 percent more likely than conservatives to enjoy beer, with 60 percent of liberals indicating they like beer.

(See above where Wikipedia noted that supertasters dislike beer.) I will also note that coffee, which supertasters tend to dislike because it is too bitter, is very popular in the ultra-liberal cities of Portland and Seattle, whereas heavily sweetened iced tea is practically the official beverage of the South.

The only remaining question is whether supertasters are conservative. That may take some research.

Update: I have not found, to my disappointment, a simple study that just looks at correlation between ideology and supertasting (or nontasting.) However, I have found a couple of useful items.

In Verbal priming and taste sensitivity make moral transgressions gross, Herz writes:

Standard tests of disgust sensitivity, a questionnaire developed for this research assessing different types of moral transgressions (nonvisceral, implied-visceral, visceral) with the terms “angry” and “grossed-out,” and a taste sensitivity test of 6-n-propylthiouracil (PROP) were administered to 102 participants. [PROP is commonly used to test for “supertasters.”] Results confirmed past findings that the more sensitive to PROP a participant was the more disgusted they were by visceral, but not moral, disgust elicitors. Importantly, the findings newly revealed that taste sensitivity had no bearing on evaluations of moral transgressions, regardless of their visceral nature, when “angry” was the emotion primed. However, when “grossed-out” was primed for evaluating moral violations, the more intense PROP tasted to a participant the more “grossed-out” they were by all transgressions. Women were generally more disgust sensitive and morally condemning than men, … The present findings support the proposition that moral and visceral disgust do not share a common oral origin, but show that linguistic priming can transform a moral transgression into a viscerally repulsive event and that susceptibility to this priming varies as a function of an individual’s sensitivity to the origins of visceral disgust—bitter taste. [bold mine.]

In other words, supertasters are more easily disgusted, and with verbal priming will transfer that disgust to moral transgressions. (And easily disgusted people tend to be conservatives.)

The Effect of Calorie Information on Consumers’ Food Choice: Sources of Observed Gender Heterogeneity, by Heiman and Lowengart, states:

While previous studies found that inherited taste-blindness to bitter compounds such as PROP may be a risk factor for obesity, this literature has been hotly disputed (Keller et al. 2010).

(Always remember, of course, that a great many social-science studies ultimately do not replicate.)

I’ll let you know if I find anything else.

Is Disgust Real? (Part 2 of a series)

(See also: Part 1, Yes, Women Think Male Sexuality is Disgusting; Part 3, Disney Explains Disgust; and Part 4, Disgust vs. Aggression vs. Fertility.)

One of the theories that undergirds a large subset of my thoughts on how brains work is the idea that Disgust is a Real Thing.

I don’t just mean a mild aversion to things that smell bad, like overturned port-a-potties or that fuzzy thing you found growing in the back of the fridge that might have been lasagna, once upon a time. Even I have such aversions.

I mean reactions like screaming and looking like you are about to vomit upon finding a chicken heart in your soup; gagging at the sight of trans people or female body hair; writhing and waving your hands while removing a slug from your porch; or the claim that talking about rats at the dinner table puts you off your meal. Or more generally, people claiming, “That’s disgusting!” or “What a creep!” about things or people that obviously aren’t even stinky.

There is a parable about a deaf person watching people dance to music he can’t hear and assuming that the people have all gone mad.

For most of my life, I assumed these reactions were just some sort of complicated schtick people put on, for reasons that were totally opaque to me. It was only about a year ago that I realized, in a flash of insight, that this disgust is a real thing that people actually feel.

I recently expressed this idea to a friend, and they stared at me in shock. (That, or they were joking.) We both agreed that chicken hearts are a perfectly normal thing to put in soup, so at least I’m not the only one confused by this.

This breakthrough happened as a result of reading a slew of neuro-political articles that I can’t find now, and it looks like the site itself might be gone, which makes me really sad. I’ve linked to at least one of them before, which means that now my old links are dead, too. Damn. Luckily, it looks like Wired has an article covering the same or similar research: Primal Propensity for Disgust Shapes Political Positions.

“The latest such finding comes from a study of people who looked at gross images, such as a man eating earthworms. Viewers who self-identified as conservative, especially those opposing gay marriage, reacted with particularly deep disgust. … Disgust is especially interesting to researchers because it’s such a fundamental sensation, an emotional building block so primal that feelings of moral repugnance originate in neurobiological processes shared with a repugnance for rotten food.”

So when people say that some moral or political thing is, “disgusting,” I don’t think they’re being metaphorical; I think they actually, literally mean that the idea of it makes them want to vomit.

Which raises the question: Why?

Simply put, I suspect that some of us have more of our brain space devoted to processing disgust than others. I can handle lab rats–or pieces of dead lab rats–without any internal reaction, I don’t care if there are trans people in my bathroom, and I suspect my sense of smell isn’t very good. My opinions on moral issues are routed primarily through what I hope are the rational, logic-making parts of my brain.

By contrast, people with stronger disgust reactions probably have more of their brain space devoted to disgust, and so are routing more of their sensory experiences through that region, and so feel strong, physical disgust in reaction to a variety of things, like people with different cultural norms than themselves. Their moral reasoning comes from a more instinctual place.

It is tempting to claim that processing things logically is superior to routing them through the disgust regions, but sometimes things are disgusting for good, evolutionarily sound reasons. Having an instinctual aversion to rats is not such a bad thing, given that they have historically been disease vectors. Most of our instincts exist to protect and help us, after all.

(See also: Part 1, Yes, Women Think Male Sexuality is Disgusting; Part 3, Disney Explains Disgust; and Part 4, Disgust vs. Aggression vs. Fertility.)

A Zombie-Free Uncanny Valley

Maybe the Uncanny Valley has nothing to do with avoiding sick/dead people, maybe nothing to do with anything specifically human-oriented at all, but with plain-ol’ conceptual category violations? Suppose you are trying to divide some class of reality into two discrete categories, like “plants” and “animals” or “poetry” and “prose”. Edge cases that don’t fit neatly into either category may be problematic, annoying, or otherwise troubling. Your brain tries to cram something into Category A, then a new data point comes along, and you switch to cramming it into Category B. Then more data and back to A. Then back to B. This might happen even at a subconscious level, flicking back and forth between two categories you normally assign instinctively, like human and non-human, forcing you to devote brain power to something that’s normally automatic. This is probably stressful for the brain.
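As a toy illustration of why edge cases might be costly, here is a sketch in Python: model the category judgment as a probability that rises smoothly along a robot-to-human continuum, and measure how ambiguous each stimulus is by the entropy of that judgment. All numbers are illustrative, not fit to any data:

```python
# Toy model: p("human") rises smoothly along a robot-to-human morph
# continuum; the entropy of that judgment measures how hard each
# stimulus is to categorize. Parameters are purely illustrative.
import math

def p_human(x, steepness=10.0, boundary=0.5):
    """Logistic category judgment: x=0 is clearly robot, x=1 clearly human."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - boundary)))

def entropy(p):
    """Shannon entropy (bits) of a binary category judgment."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

for i in range(11):
    x = i / 10
    p = p_human(x)
    print(f"human-likeness {x:.1f}: p(human)={p:.2f}, ambiguity={entropy(p):.2f} bits")
# Ambiguity is near zero at both ends and peaks (a full bit) at x=0.5,
# right on the category edge--the formal analogue of the A/B flicker.
```

This is only a metaphor for the perceptual case, but it captures the claim: the processing burden comes from where a stimulus sits relative to the category boundary, not from the stimulus itself.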

In some cases, edge cases may be inconsequential and people may just ignore them; in others, group membership is important–people seem particularly keen on arguments about people’s inclusion in various human groups, hence accusations that people are “posers” or are otherwise claiming membership they may not deserve.

Some people may prefer discrete categories more strongly than others, and so be more bothered by edge cases; other people may be more mentally flexible, or more capable of dealing with a third category labeled “edge cases.” It’s also possible that some people do not bother with discrete categories at all.

It would be interesting to test people’s preference for discrete categories, and then see if this correlates with disgust at humanoid robots or with any particular political identities.

It would also be interesting to see if there are ways to equip people with different conceptual paradigms for dealing with data that better accommodate edge cases; a “Core vs. Periphery” approach may be better in some cases than discrete categories, for example.

Amygdalaaas

So, building on last night’s potential revelation, let’s review what we may or may not know about amygdalas.

I had read (summaries of) studies indicating that conservatives have larger amygdalas than liberals. From this I concluded that amygdalas were likely to be involved in some sort of processing that produces more conservative results.

But what does the amygdala do? Articles indicated involvement in the disgust reflex, so I concluded that people with larger amygdalas are more easily disgusted, and thus more likely to feel disgust in response to novelty.

But Frost brings up a good point: has the initial orientation/amygdala data been confounded by researchers failing to control for ethnicity? If researchers have classed, say, Muslims who vote Labour as liberals, then obviously the data is meaningless.

There is an obvious way to solve this conundrum: just find some data that controls for ethnicity and measures liberalness or conservatism (rather than using self-reported orientation). If no suitable studies exist, do one–go to some rural college, collect an ethnically similar cohort, give them a quiz about their attitudes toward novel foods and other such non-emotionally-charged things that seem to correlate with political orientation, and then scan the brains of equal numbers of liberals and conservatives and see what you get. If you find some sort of correlation, repeat the experiment with other ethnicities. If you find nothing, widen your scope to compare whites from different areas, or whites who are further apart in political orientation. If you still find nothing, do a multi-ethnic study and see if the initial results were just ethnic differences in party affiliation.
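As a sketch of the key statistical check, assuming such a dataset existed: fit the orientation–amygdala relationship twice, once naively and once with ethnicity as a covariate, and see whether the amygdala coefficient survives. The file and column names below are hypothetical:

```python
# Sketch of the confound check: does amygdala volume still predict
# political orientation once ethnicity is controlled for?
# Dataset, column names, and coding are all hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("scan_data.csv")  # hypothetical: one row per scanned subject

# Naive model: orientation (higher = more conservative) vs. amygdala volume.
naive = smf.ols("orientation ~ amygdala_volume", data=df).fit()

# Controlled model: ethnicity added as a categorical covariate.
controlled = smf.ols("orientation ~ amygdala_volume + C(ethnicity)",
                     data=df).fit()

print("amygdala coefficient, naive:     ", naive.params["amygdala_volume"])
print("amygdala coefficient, controlled:", controlled.params["amygdala_volume"])
```

If the coefficient shrinks toward zero once ethnicity enters the model, the published orientation/amygdala correlation was plausibly just ethnic differences in party affiliation–which is exactly Frost’s worry.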

Recent articles further indicate that my interpretations may have been wrong, or that the picture is more complicated than I realized; eg, Neural and cognitive characteristics of extraordinary altruists reports: “Functional imaging and behavioral tasks included face-emotion processing paradigms that reliably distinguish psychopathic individuals from controls. Here we show that extraordinary altruists can be distinguished from controls by their enhanced volume in right amygdala and enhanced responsiveness of this structure to fearful facial expressions, an effect that predicts superior perceptual sensitivity to these expressions. These results mirror the reduced amygdala volume and reduced responsiveness to fearful facial expressions observed in psychopathic individuals.”

This does not support my “conservatives have larger disgust reflexes becuz amygdalas” theory, but it would be consistent with an “extremely conservative people were mis-categorized as liberals in other studies” result.

More research is necessary.