So I was thinking about taste (flavor) and disgust (emotion.)
As I mentioned about a month ago, 25% of people are “supertasters,” that is, better at tasting than the other 75% of people. Supertasters experience flavors more intensely than ordinary tasters, resulting in a preference for “bland” food (food with too much flavor is “overwhelming” to them.) They also have a more difficult time getting used to new foods.
One of my work acquaintances of many years (we’ll call her Echo) is obese, constantly on a diet, and constantly eating sweets. She knows she should eat vegetables and tries to do so, but finds them bitter and unpleasant, and so the general outcome is as you expect: she doesn’t eat them.
Since I find most vegetables quite tasty, I find this attitude very strange, but I am willing to admit that I may be the one with unusual attitudes toward food.
Echo is also quite conservative.
This got me thinking about vegetarians vs. people who think vegetarians are crazy. Why (aside from novelty of the idea) should vegetarians be liberals? Why aren’t vegetarians just people who happen to really like vegetables?
What if there were something in preference for vegetables themselves that correlated with political ideology?
Certainly we can theorize that “supertaster” => “vegetables taste bitter” => “dislike of vegetables” => “thinks vegetarians are crazy.” (Some supertasters might think meat tastes bad, but anecdotal evidence doesn’t support this.) See also Wikipedia, where supertasting is clearly associated with responses to plants:
Any evolutionary advantage to supertasting is unclear. In some environments, heightened taste response, particularly to bitterness, would represent an important advantage in avoiding potentially toxic plant alkaloids. In other environments, increased response to bitterness may have limited the range of palatable foods. …
Although individual food preference for supertasters cannot be typified, documented examples for either lessened preference or consumption include:
Mushrooms? Echo was just complaining about mushrooms.
Let’s talk about disgust. Disgust is an important reaction to things that might infect or poison you, triggering reactions from scrunching up your face to vomiting (i.e., expelling the poison.) We process disgust in our amygdalas, and some people appear to have bigger or smaller amygdalas than others, with the result that the folks with bigger amygdalas feel more disgust.
Humans also route a variety of social situations through their amygdalas, resulting in the feeling of “disgust” in response to things that are not rotten food, like other people’s sexual behaviors, criminals, or particularly unattractive people. People with larger amygdalas also tend to find more human behaviors disgusting, and this disgust correlates with social conservatism.
To what extent are “taste” and “disgust” independent of each other? I don’t know; perhaps they are intimately linked into a single feedback system, where disgust and taste sensitivity cause each other, or perhaps they are relatively independent, so that a few unlucky people are both super-sensitive to taste and easily disgusted.
People who find other people’s behavior disgusting and off-putting may also be people who find flavors overwhelming, prefer bland or sweet foods over bitter ones, think vegetables are icky, vegetarians are crazy, and struggle to stay on diets.
What’s that, you say, I’ve just constructed a just-so story?
Michael Shin and William McCarthy, researchers from UCLA, have found an association between counties with higher levels of support for the 2012 Republican presidential candidate and higher levels of obesity in those counties.
Looks like the Mormons and Southern blacks are outliers.
(I don’t really like maps like this for displaying data; I would much prefer a simple graph showing orientation on one axis and obesity on the other, with each county as a datapoint.)
(Unsurprisingly, the first 49 hits I got when searching for correlations between political orientation and obesity were almost all about what other people think of fat people, not what fat people think. This is probably because researchers tend to be skinny people who want to fight “fat phobia” but aren’t actually interested in the opinions of fat people.)
Liberals are 28 percent more likely than conservatives to eat fresh fruit daily, and 17 percent more likely to eat toast or a bagel in the morning, while conservatives are 20 percent more likely to skip breakfast.
Ten percent of liberals surveyed indicated they are vegetarians, compared with 3 percent of conservatives.
Liberals are 28 percent more likely than conservatives to enjoy beer, with 60 percent of liberals indicating they like beer.
(See above where Wikipedia noted that supertasters dislike beer.) I will also note that coffee, which supertasters tend to dislike because it is too bitter, is very popular in the ultra-liberal cities of Portland and Seattle, whereas heavily sweetened iced tea is practically the official beverage of the South.
The only remaining question is whether supertasters are conservative. That may take some research.
Update: I have not found, to my disappointment, a simple study that just looks at correlation between ideology and supertasting (or nontasting.) However, I have found a couple of useful items.
Standard tests of disgust sensitivity, a questionnaire developed for this research assessing different types of moral transgressions (nonvisceral, implied-visceral, visceral) with the terms “angry” and “grossed-out,” and a taste sensitivity test of 6-n-propylthiouracil (PROP) were administered to 102 participants. [PROP is commonly used to test for “supertasters.”] Results confirmed past findings that the more sensitive to PROP a participant was the more disgusted they were by visceral, but not moral, disgust elicitors. Importantly, the findings newly revealed that taste sensitivity had no bearing on evaluations of moral transgressions, regardless of their visceral nature, when “angry” was the emotion primed. However, when “grossed-out” was primed for evaluating moral violations, the more intense PROP tasted to a participant the more “grossed-out” they were by all transgressions. Women were generally more disgust sensitive and morally condemning than men, … The present findings support the proposition that moral and visceral disgust do not share a common oral origin, but show that linguistic priming can transform a moral transgression into a viscerally repulsive event and that susceptibility to this priming varies as a function of an individual’s sensitivity to the origins of visceral disgust—bitter taste. [bold mine.]
In other words, supertasters are more easily disgusted, and with verbal priming will transfer that disgust to moral transgressions. (And easily disgusted people tend to be conservatives.)
This is an attempt at a coherent explanation for why left-handedness (and right-handedness) exist in the distributions that they do.
Handedness is a rather exceptional human trait. Most animals don’t have a dominant hand (or foot.) Horses have no dominant hooves; anteaters dig equally well with both paws; dolphins don’t favor one flipper over the other; monkeys don’t fall out of trees if they try to grab a branch with their left hands. Only humans have a really distinct tendency to use one side of their bodies over the other.
And about 90% of us use our right hands, and about 10% of us use our left hands (Wikipedia claims 10%, but The Lopsided Ape reports 12%), an observation that appears to hold pretty consistently throughout both time and culture, so long as we aren’t dealing with a culture where lefties are forced to write with their right hands.
A simple Mendel-square, one-gene explanation for handedness (a dominant allele for right-handedness and a recessive one for left-handedness, with equal allele proportions in society) would result in 75% righties and 25% lefties. Even if the proportions weren’t equal, the offspring of two lefties ought to be 100% left-handed. This is not, however, what we see. The children of two lefties have only a 25% chance or so of being left-handed themselves.
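As a quick sanity check, here is a minimal sketch of what that naive dominant/recessive model predicts. The code and names are my own, and it assumes exactly the equal allele frequencies described above:

```python
import itertools

# Naive model: one gene with allele "R" (right-handed, dominant) and
# "L" (left-handed, recessive), at equal frequency, with random mating.

def phenotype(genotype):
    """Left-handed only if both alleles are the recessive 'L'."""
    return "left" if genotype == ("L", "L") else "right"

# Population phenotype ratio: enumerate the four equally likely allele pairs.
pairs = list(itertools.product("RL", repeat=2))
lefties = sum(phenotype(p) == "left" for p in pairs) / len(pairs)
print(f"predicted population lefties: {lefties:.0%}")  # 25%, vs. ~10% observed

# Children of two left-handed (LL x LL) parents: every child inherits an L
# from each parent, so the naive model predicts 100% left-handed offspring.
children = [phenotype((a, b)) for a in "LL" for b in "LL"]
print("lefty x lefty children:", set(children))  # all left, vs. ~25% observed
```

Both predictions miss the observed numbers, which is what motivates the more complicated model below.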
So let’s try a more complicated model.
Let’s assume that there are two alleles that code for right-handedness (hereafter “R”). You get one from your mom and one from your dad.
Each of these alleles is accompanied by a second allele that codes for either nothing (hereafter “O”) or potentially switches the expression of your handedness (hereafter “S”).
Everybody in the world gets two identical R alleles, one from mom and one from dad.
Everyone also gets two S or O alleles, one from mom and one from dad. One of these S or O alleles affects one of your Rs, and the other affects the other R.
Your potential pairs, then, are:
RO/RO, RO/RS, RS/RO, or RS/RS
RO=right handed allele.
RS=50% chance of expressing for right or left dominance; RS/RS thus => 25% chance of both alleles coming out lefty.
So RO/RO, RO/RS, and RS/RO = righties (but the RO/ROs may have especially dominant right hands, and half of the RO/RS guys may have weakly dominant right hands.)
Only RS/RS produces lefties, and of those, only 25% defeat the dominance odds.
This gets us the observed rate: only 25% of children of left-handed couples are left-handed themselves.
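A small simulation of the model above (my own encoding, following the probabilities just stated) reproduces that figure:

```python
import random

random.seed(0)

# Model sketch: every person carries two R alleles; each is paired with an
# O ("do nothing") or S ("maybe switch") modifier. An R+S pair expresses
# LEFT with 50% probability; R+O always expresses RIGHT. A person is
# left-handed only if BOTH alleles come out left.

def is_lefty(modifiers):
    """modifiers is a pair like ('S', 'S'); each 'S' flips with prob 0.5."""
    return all(mod == "S" and random.random() < 0.5 for mod in modifiers)

# Two left-handed parents must both be RS/RS, so every child is RS/RS too.
trials = 100_000
lefty_children = sum(is_lefty(("S", "S")) for _ in range(trials)) / trials
print(f"children of two lefties who are left-handed: {lefty_children:.1%}")
# ~25%, matching the rate quoted above
```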
(Please note that this is still a very simplified model; Wikipedia claims that there may be more than 40 alleles involved.)
What of the general population as a whole?
Assuming random mating in a population with equal quantities of RO/RO, RO/RS, RS/RO and RS/RS, we’d end up with 25% of children RS/RS. But if only 25% of RS/RS turn out lefties, only 6.25% of children would be lefties. We’re still missing 4-6% of the population.
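Spelled out as arithmetic, the paragraph above is:

```python
# With the four genotypes equally common, the S modifier has frequency 0.5,
# so a random child is RS/RS with probability 0.5 * 0.5.
p_rs_rs = 0.5 * 0.5          # 25% of children are RS/RS
p_lefty = p_rs_rs * 0.25     # only 25% of those beat the dominance odds
print(f"predicted lefties: {p_lefty:.2%}")  # 6.25%, vs. the ~10-12% observed
```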
This implies that either: A. Wikipedia has the wrong #s for % of children of lefties who are left-handed; B. about half of lefties are RO/RS (about 1/8th of the RO/RS population); C. RS is found in twice the proportion as RO in the population; or D. my model is wrong.
Dr Chris McManus reported in his book Right Hand, Left Hand on a study he had done based on a review of scientific literature which showed parent handedness for 70,000 children. On average, the chances of two right-handed parents having a left-handed child were around 9% left-handed children, two left-handed parents around 26% and one left and one right-handed parent around 19%. …
More than 50% of left-handers do not know of any other left-hander anywhere in their living family.
This implies B, that about half of lefties are RO/RS. Having one RS combination gives you a 12.5% chance of being left-handed; having two RS combinations gives you a 25% chance.
And that… I think that works. And it means we can refine our theory–we don’t need two R alleles; we only need one. (Obviously it is more likely a whole bunch of alleles that code for a whole system, but since they act together, we can model them as one.) The R allele is then modified by a pair of alleles that comes in either O (do nothing,) or S (switch.)
One S allele gives you a 12.5% chance of being a lefty; two doubles your chances to 25%.
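Here is a sketch of the refined model as a population simulation, assuming the per-genotype probabilities just given (0% lefty with no S, 12.5% with one, 25% with two) and an S frequency of 0.5; the encoding is my own:

```python
import random

random.seed(1)

# Refined model: one effective R "system," modified by a pair of O/S
# alleles. Left-handedness probability depends only on the S count.
LEFTY_CHANCE = {0: 0.0, 1: 0.125, 2: 0.25}

def random_person():
    """Draw one person: count of S alleles, and whether they come out lefty."""
    s_count = sum(random.random() < 0.5 for _ in range(2))  # one draw per parent
    lefty = random.random() < LEFTY_CHANCE[s_count]
    return s_count, lefty

people = [random_person() for _ in range(200_000)]
lefty_s_counts = [s for s, lefty in people if lefty]

print(f"lefties in population: {len(lefty_s_counts) / len(people):.1%}")
print(f"lefties with just one S: {lefty_s_counts.count(1) / len(lefty_s_counts):.1%}")
```

The simulation lands near 12.5% lefties overall (close to the observed 10-12%) with about half of them carrying a single S, which is option B from the list above.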
Interestingly, this model suggests not only that no gene for “left handedness” exists, but that “left handedness” might not even be the allele’s goal. Despite the rarity of lefties, the S allele is found in 75% of the population (the same share as the O allele.) My suspicion is that the S allele is doing something else valuable, like making sure we don’t become too lopsided in our abilities or try to shunt all of our mental functions to one side of our brain.
We’re talking about foods, not whether you prefer Beethoven or Lil’ Wayne.
Certainly there are broad correlations between the foods people enjoy and their ethnicity/social class. If you know whether I chose fried okra, chicken feet, gefilte fish, escargot, or grasshoppers for dinner, you can make a pretty good guess about my background. (Actually, I have eaten all of these things. The grasshoppers were over-salted, but otherwise fine.) The world’s plethora of tasty (and not-so-tasty) cuisines is due primarily to regional variations in what grows well where (not a lot of chili peppers grow in Nunavut, Canada,) and cost (the rich can always afford fancier fare than the poor,) with a side dish of seemingly random cultural taboos like “don’t eat pork” or “don’t eat cows” or “don’t eat grasshoppers.”
But do people vary in their experience of taste? Does intelligence influence how you perceive your meal, driving smarter (or less-smart) people to seek out particular flavor profiles or combinations? Or could there be other psychological or neurological factors at play in people’s eating decisions?
This post was inspired by a meal my husband, an older relative and I shared recently at McDonald’s. It had been a while since we’d last patronized McDonald’s, but older relative likes their burgers, so we went and ordered some new-to-us variety of meat-on-a-bun. As my husband and I sat there, deconstructing the novel taste experience and comparing it to other burgers, the older relative gave us this look of “Jeez, the idiots are discussing the flavor of a burger! Just eat it already!”
As we dined later that evening at my nemesis, Olive Garden, I began wondering whether we actually experienced the food the same way. Perhaps there is something in people that makes them prefer bland, predictable food. Perhaps some people are better at discerning different flavors, and the people who cannot discern them end up with worse food because they can’t tell?
Unfortunately, it appears that not a lot of people have studied whether there is any sort of correlation between IQ and taste (or smell.) There’s a fair amount of research on taste (and smell,) like “do relatives of schizophrenics have impaired senses of smell?” (More on Schizophrenics and their decreased ability to smell) or “can we get fat kids to eat more vegetables?” Oh, and apparently the nature of auditory hallucinations in epileptics varies with IQ (IIRC.) But not much that directly addresses the question.
I did find two references that, somewhat in passing, noted that they found no relationship between taste and IQ, but these weren’t studies designed to test for that. For example, in A Food Study of Monotony, published in 1958 (you know I am really looking for sources when I have to go back to 1958,) researchers restricted the diets of military personnel employed at an army hospital to only 4 menus to see how quickly and badly they’d get bored of the food. They found no correlation between boredom and IQ, but people employed at an army hospital are probably pre-selected for being pretty bright (and having certain personality traits in common, including ability to stand army food.)
Interestingly, three traits did correlate with (or against) boredom:
Fatter people got bored fastest (the authors speculate that they care the most about their food,) while depressed and feminine men (all subjects in the study were men) got bored the least. Depressed people are already disinterested in food, so it is hard to get less-interested, but no explanation was given of what they meant by “femininity” or how this might affect food preferences. (Also, the hypochondriacs got bored quickly.)
Some foods inspire boredom (or even disgust) quickly, while others are virtually immune. Milk and bread, for example, can be eaten every day without complaint (though you might get bored if bread were your only food.) Potted meat, by contrast, gets old fast.
Although self-reported eating practices were not associated with educational level, intelligence, nor various indices of psychopathology, they were related to the demographic variables of gender and age: older participants reported eating more fiber in their diets than did younger ones, and women reported more avoidance of fats from meats than did men.
Self-reported eating habits may not be all that reliable, though.
Participants with autism were significantly less accurate than control participants in identifying sour tastes and marginally less accurate for bitter tastes, but they were not different in identifying sweet and salty stimuli. … Olfactory identification was significantly worse among participants with autism. … True differences exist in taste and olfactory identification in autism. Impairment in taste identification with normal detection thresholds suggests cortical, rather than brainstem dysfunction.
(Another study of the eating habits of autistic kids found that the pickier ones were rated by their parents as more severely impaired than the less picky ones, but then severe food aversions are a form of life impairment. By the way, do not tell the parents of an autistic kid, “oh, he’ll eat when he’s hungry.” They will probably respond politely, but mentally they are stabbing you.)
On brainstem vs. cortical function–it appears that we do some of our basic flavor identification way down in the most instinctual part of the brain, as Facial Expressions in Response to Taste and Smell Stimulation explores. The authors found that pretty much everyone makes the same faces in response to sweet, sour, and bitter flavors–whites and blacks, old people and newborns, retarded people and blind people, even premature infants, blind infants, and infants born missing most of their brains. All of which is another point in favor of my theory that disgust is real. (And if that is not enough science of taste for you, I recommend Place and Taste Aversion Learning, in which animals with brain lesions lost their fear of new foods.)
Genetics obviously plays a role in taste. If you are one of the 14% or so of people who think cilantro tastes like soap (and I sympathize, because cilantro definitely tastes like soap,) then you’ve already discovered this in a very practical way. Genetics also obviously determine whether you continue producing the enzyme for milk digestion after infancy (lactase persistence). According to Why are you a picky eater? Blame genes, brains, and breastmilk:
Researchers at Philadelphia’s Monell Chemical Senses Center, a scientific institute dedicated to the study of smell and taste, have found that this same gene also predicts the strength of sweet-tooth cravings among children. Kids who were more sensitive to bitterness preferred sugary foods and drinks. However, adults with the bitter receptor genes remained picky about bitter foods but did not prefer more sweets, the Monell study found. This suggests that sometimes age and experience can override genetics.
I suspect that there is actually a sound biological, evolutionary reason why kids crave sweets more than grownups, and this desire for sweets is somewhat “turned off” as we age.
Ethnobotanist Gary Paul Nabhan suggests that diet had a key role in human evolution, specifically, that human genetic diversity is predominately a product of regional differences in ancestral diets. Chemical compounds found within animals and plants varied depending on climate. These compounds induced changes in gene expression, which can vary depending on the amount within the particular food and its availability. The Agricultural Age led to further diet-based genetic diversity. Cultivation of foods led to the development of novel plants and animals that were not available in the ancestral environment. …
There are other fascinating examples of gene-diet interaction. Culturally specific recipes, semi-quantitative blending of locally available foods and herbs, and cooking directions needed in order to reduce toxins present in plants, emerged over time through a process of trial-and error and were transmitted through the ages. The effects on genes by foods can be extremely complex given the range of plant-derived compounds available within a given region. The advent of agriculture is suggested to have overridden natural selection by random changes in the environment. The results of human-driven selection can be highly unexpected. …
In sedentary herding societies, drinking water was frequently contaminated by livestock waste. The author suggests in order to avoid contaminated water, beverages made with fermented grains or fruit were drunk instead. Thus, alcohol resistance was selected for in populations that herded animals, such as Europeans. By contrast, those groups which did not practice herding, such as East Asians and Native Americans, did not need to utilize alcohol as a water substitute and are highly sensitive to the effects of alcohol.
Speaking of genetics:
Indians and Africans are much more likely than Europeans and native South Americans to have an allele that lets them eat a vegetarian diet:
The vegetarian allele evolved in populations that have eaten a plant-based diet over hundreds of generations. The adaptation allows these people to efficiently process omega-3 and omega-6 fatty acids and convert them into compounds essential for early brain development and controlling inflammation. In populations that live on plant-based diets, this genetic variation provided an advantage and was positively selected in those groups.
In Inuit populations of Greenland, the researchers uncovered that a previously identified adaptation is opposite to the one found in long-standing vegetarian populations: While the vegetarian allele has an insertion of 22 bases (a base is a building block of DNA) within the gene, this insertion was found to be deleted in the seafood allele.
Dr. Hirsh, neurological director of the Smell and Taste Research and Treatment Foundation in Chicago, stands by his book that is based on over 24 years of scientific study and tests on more than 18,000 people’s food choices and personalities.)
Byrnes assessed the group using the Arnett Inventory of Sensation Seeking (AISS), a test for the personality trait of sensation-seeking, defined as desiring novel and intense stimulation and presumed to contribute to risk preferences. Those in the group who score above the mean AISS score are considered more open to risks and new experiences, while those scoring below the mean are considered less open to those things.
The subjects were given 25 micromoles of capsaicin, the active component of chili peppers, and asked to rate how much they liked a spicy meal as the burn from the capsaicin increased in intensity. Those in the group who fell below the mean AISS rapidly disliked the meal as the burn increased. People who were above the mean AISS had a consistently high liking of the meal even as the burn increased. Those in the mean group liked the meal less as the burn increased, but not nearly as rapidly as those below the mean.
And then there are the roughly 25% of us who are “supertasters“:
A supertaster is a person who experiences the sense of taste with far greater intensity than average. Women are more likely to be supertasters, as are those from Asia, South America and Africa. The cause of this heightened response is unknown, although it is thought to be related to the presence of the TAS2R38 gene, the ability to taste PROP and PTC, and at least in part, due to an increased number of fungiform papillae.
Perhaps the global distribution of supertasters is related to the distribution of vegetarian-friendly alleles. It’s not surprising that women are more likely to be supertasters, as they have a better sense of smell than men. What may be surprising is that supertasters tend not to be foodies who delight in flavoring their foods with all sorts of new spices, but instead tend toward more restricted, bland diets. Because their sense of taste is essentially on overdrive, flavors that taste “mild” to most people taste “overwhelming” on their tongues. As a result, they tend to prefer a much more subdued palate, which is, of course, perfectly tasty to them.
“During research back in the 1980s, we discovered that people are more reluctant to try new foods of animal origin than those of plant origin,” Pelchat says. “That’s ironic in two ways. As far as taste is concerned, the range of flavors in animal meat isn’t that large compared to plants, so there isn’t as much of a difference. And, of course, people are much more likely to be poisoned by eating plants than by animals, as long as the meat is properly cooked.” …
It’s also possible that reward mechanisms in our brain can drive changes in taste. Pelchat’s team once had test subjects sample tiny bits of unfamiliar food with no substantial nutritional value, and accompanied them with pills that contained either nothing or a potent cocktail of caloric sugar and fat. Subjects had no idea what was in the pills they swallowed. They learned to like the unfamiliar flavors more quickly when they were paired with a big caloric impact—suggesting that body and brain combined can alter tastes more easily when unappetizing foods deliver big benefits.
So trying to get people to adopt new foods while losing weight may not be the best idea.
(For all that people complain about kids’ pickiness, parents are much pickier. Kids will happily eat playdoh and crayons, but one stray chicken heart in your parents’ soup and suddenly it’s “no more eating at your house.”)
Of course, you can’t talk about food without encountering meddlers who are convinced that people should eat whatever they’re convinced is the perfect diet, like these probably well-meaning folks trying to get Latinos to eat fewer snacks:
Latinos are the largest racial and ethnic minority group in the United States and bear a disproportionate burden of obesity related chronic disease. Despite national efforts to improve dietary habits and prevent obesity among Latinos, obesity rates remain high. …
there is a need for more targeted health promotion and nutrition education efforts on the risks associated with soda and energy-dense food consumption to help improve dietary habits and obesity levels in low-income Latino communities.
Never mind that Latinos are one of the healthiest groups in the country, with longer life expectancies than whites! We’d better make sure they know that their food ways are not approved of!
(Just in case it is not clear already: different people are adapted to and will be healthy on different diets. There is no magical, one-size-fits-all diet.)
4,000 Scottish children aged 3-5 years old were examined to compare the intelligence-dampening effects of fast food consumption versus “from scratch” fare prepared with only fresh ingredients.
Higher fast food consumption by the children was linked with lower intelligence, and this held even after adjusting for wealth and social status.
It’d be better if they controlled for parental IQ.
The conclusions of this study confirm previous research which shows long lasting effects on IQ from a child’s diet. An Australian study from the University of Adelaide published in August 2012 showed that toddlers who consume junk food grow less smart as they get older. In that study, 7000 children were examined at the age of 6 months, 15 months, 2 years to examine their diet.
When the children were examined again at age 8, children who were consuming the most unhealthy food had IQs up to 2 points lower than children eating a wholesome diet.
As we were discussing yesterday, I theorize that people have neural feedback loops that reward them for conforming/imitating others/obeying authorities and punish them for disobeying/not conforming.
This leads people to obey authorities or go along with groups even when they know, logically, that they shouldn’t.
There are certainly many situations in which we want people to conform even though they don’t want to, like when my kids have to go to bed or buckle their seatbelts–as I said yesterday, the feedback loop exists because it is useful.
But there are plenty of situations where we don’t want people to conform, like when trying to brainstorm new ideas.
Under what conditions will people disobey authority?
But in person, people may disobey authorities when they have some other social system to fall back on. If disobeying an authority in Society A means I lose social status in Society A, I will be more likely to disobey if I am a member in good standing in Society B.
If I can use my disobedience against Authority A as social leverage to increase my standing in Society B, then I am all the more likely to disobey. A person who can effectively stand up to an authority figure without getting punished must be, our brains reason, a powerful person, an authority in their own right.
Teenagers do this all the time, using their defiance against adults, school, teachers, and society in general to curry higher social status among other teenagers, the people they actually care about impressing.
SJWs do this, too:
I normally consider the president of Princeton an authority figure, and even though I probably disagree with him on far more political matters than these students do, I’d be highly unlikely to be rude to him in real life–especially if I were a student he could get expelled from college.
But if I had an outside audience–Society B–clapping and cheering for me behind the scenes, the urge to obey would be weaker. And if yelling at the President of Princeton could guarantee me high social status, approval, job offers, etc., then there’s a good chance I’d do it.
But then I got to thinking: Are there any circumstances under which these students would have accepted the president’s authority?
Obviously if the man had a proven track record of competently performing a particular skill the students wished to learn, they might follow his example.
If authority works via neural feedback loops, employing some form of “mirror neurons,” do these systems activate more strongly when the people we are perceiving look more like ourselves (or our internalized notion of people in our “tribe” look like, since mirrors are a recent invention)?
In other words, what would a cross-racial version of the Milgram experiment look like?
Unfortunately, it doesn’t look like anyone has tried it (and to do it properly, it’d need to be a big experiment, involving several “scientists” of different races [so that the study isn’t biased by one “scientist” just being bad at projecting authority] interacting with dozens of students of different races, which would be a rather large undertaking.) I’m also not finding any studies on cross-racial authority (I did find plenty of websites offering practical advice about different groups’ leadership styles,) though I’m sure someone has studied it.
However, I did find cross-racial experiments on empathy, which may involve the same brain systems, and so are suggestive:
Using transcranial magnetic stimulation, we explored sensorimotor empathic brain responses in black and white individuals who exhibited implicit but not explicit ingroup preference and race-specific autonomic reactivity. We found that observing the pain of ingroup models inhibited the onlookers’ corticospinal system as if they were feeling the pain. Both black and white individuals exhibited empathic reactivity also when viewing the pain of stranger, very unfamiliar, violet-hand models. By contrast, no vicarious mapping of the pain of individuals culturally marked as outgroup members on the basis of their skin color was found. Importantly, group-specific lack of empathic reactivity was higher in the onlookers who exhibited stronger implicit racial bias.
Using the event-related potential (ERP) approach, we tracked the time-course of white participants’ empathic reactions to white (own-race) and black (other-race) faces displayed in a painful condition (i.e. with a needle penetrating the skin) and in a nonpainful condition (i.e. with Q-tip touching the skin). In a 280–340 ms time-window, neural responses to the pain of own-race individuals under needle penetration conditions were amplified relative to neural responses to the pain of other-race individuals displayed under analogous conditions.
In this study, we used functional magnetic resonance imaging (fMRI) to investigate how people perceive the actions of in-group and out-group members, and how their biased view in favor of own team members manifests itself in the brain. We divided participants into two teams and had them judge the relative speeds of hand actions performed by an in-group and an out-group member in a competitive situation. Participants judged hand actions performed by in-group members as being faster than those of out-group members, even when the two actions were performed at physically identical speeds. In an additional fMRI experiment, we showed that, contrary to common belief, such skewed impressions arise from a subtle bias in perception and associated brain activity rather than decision-making processes, and that this bias develops rapidly and involuntarily as a consequence of group affiliation. Our findings suggest that the neural mechanisms that underlie human perception are shaped by social context.
None of these studies definitively shows whether in-group vs. out-group biases are an inherent feature of neurological systems, but Avenanti’s finding that people were more empathetic toward a purple-skinned person than toward a member of a racial out-group suggests that some amount of learning is involved in the process–and that rather than comparing people against one’s in-group, we may be comparing them against our out-group.
At any rate, you may get similar outcomes either way.
In cases where you want to promote group cohesion and obedience, it may be beneficial to sort people by self-identity.
In cases where you want to guard against groupthink, obedience, or conformity, it may be beneficial to mix up the groups. Intellectual diversity is great, but even ethnic diversity may help people resist defaulting to obedience, especially when they know they shouldn’t.
Using data from two panel studies on U.S. firms and an online experiment, we examine investor reactions to increases in board diversity. Contrary to conventional wisdom, we find that appointing female directors has no impact on objective measures of performance, such as ROA, but does result in a systematic decrease in market value.
(Solal argues that investors may perceive the hiring of women–even competent ones–as a sign that the company is pursuing social justice goals instead of money-making goals and dump the stock.)
Additionally, diverse companies may find it difficult to work together toward a common goal–there is a good deal of evidence that increasing diversity decreases trust and inhibits group cohesion. E.g., from “The downside of diversity”:
IT HAS BECOME increasingly popular to speak of racial and ethnic diversity as a civic strength. From multicultural festivals to pronouncements from political leaders, the message is the same: our differences make us stronger.
But a massive new study, based on detailed interviews of nearly 30,000 people across America, has concluded just the opposite. Harvard political scientist Robert Putnam — famous for “Bowling Alone,” his 2000 book on declining civic engagement — has found that the greater the diversity in a community, the fewer people vote and the less they volunteer, the less they give to charity and work on community projects. In the most diverse communities, neighbors trust one another about half as much as they do in the most homogenous settings. The study, the largest ever on civic engagement in America, found that virtually all measures of civic health are lower in more diverse settings.
As usual, I suspect there is an optimum level of diversity–depending on a group’s purpose and its members’ preferences–that helps minimize groupthink while still preserving most of the benefits of cohesion.
So I was thinking the other day about the question of why people go along with others and do things even when they believe (or know) they shouldn’t. As Tolstoy asks, why did the French army go along with the mad idea of invading Russia in 1812? Why did Milgram’s subjects obey his orders to “electrocute” people? Why do I feel emotionally distressed when refusing to do something, even when I have very good reasons to refuse?
As I mentioned ages ago, I suspect that normal people have neural circuits that reward them for imitating others and punish them for failing to imitate. Mirror neurons probably play a critical role in this process, but probably aren’t the complete story.
These feedback loops are critical for learning–infants only a few months old begin the process of learning to talk by moving their mouths and making “ba ba” noises in imitation of their parents. (Hence why it is called “babbling.”) They do not consciously say to themselves, “let me try to communicate with the big people by making their noises;” they just automatically move their faces to match the faces you make at them. It’s an instinct.
You probably do this, too. Just watch what happens when one person in a room yawns and then everyone else feels compelled to do it, too. Or if you suddenly turn and look at something behind the group of people you’re with–others will likely turn and look, too.
Autistic infants have trouble with imitation, (and according to Wikipedia, several studies have found abnormalities in their mirror neuron systems, though I suspect the matter is far from settled–among other things, I am not convinced that everyone with an ASD diagnosis actually has the same thing going on.) Nevertheless, there is probably a direct link between autistic infants’ difficulties with imitation and their difficulties learning to talk.
For adults, imitation is less critical (you can, after all, consciously decide to learn a new language,) but still important for survival. If everyone in your village drinks out of one well and avoids the other well, even if no one can explain why, it’s probably a good idea to go along and only drink out of the “good” well. Something pretty bad probably happened to the last guy who drank out of the “bad” well, otherwise the entire village wouldn’t have stopped drinking out of it. If you’re out picking berries with your friends when suddenly one of them runs by yelling “Tiger!” you don’t want to stand there and yell, “Are you sure?” You want to imitate them, and fast.
Highly non-conformist people probably have “defective” or low-functioning feedback loops. They simply feel less compulsion to imitate others–it doesn’t even occur to them to imitate others! These folks might die in interesting ways, but in the meanwhile, they’re good sources for ideas other people just wouldn’t have thought of. I suspect they are concentrated in the arts, though clearly some of them are in programming.
Normal people’s feedback loops kick in when they are not imitating others around them, making them feel embarrassed, awkward, or guilty. When they imitate others, their brains reward them, making them feel happy. This leads people to enjoy a variety of group-based activities, from football games to prayer circles to line dancing to political rallies.
At their extreme, these groups become “mobs,” committing violent acts that many of the folks involved wouldn’t commit under normal circumstances.
Highly conformist people’s feedback loops are probably over-active, making them feel awkward or uncomfortable while simply observing other people not imitating the group. This discomfort can only be relieved by getting those other people to conform. These folks tend to favor more restrictive social policies and can’t understand why other people would possibly want to do those horrible, non-conforming things.
To reiterate: this feedback system exists because it helped your ancestors survive. It is not people being “sheep;” it is a perfectly sensible approach to learning about the world and avoiding dangers. And different people have stronger or weaker feedback loops, resulting in more or less instinctual desire to go along with and imitate others.
However, there are times when you shouldn’t imitate others. Times when, in fact, everyone else is wrong.
The Milgram Experiment places the subject in a situation where their instinct to obey the experimenter (an “authority figure”) is in conflict with their rational desire not to harm others (and their instinctual empathizing with the person being “electrocuted.”)
In case you have forgotten the Milgram Experiment, it went like this: an unaware subject is brought into the lab, where he meets the “scientist” and a “student,” who are really in cahoots. The subject is told that he is going to assist with an experiment to see whether administering electric shocks to the “student” will make him learn faster. The “student” also tells the subject, in confidence, that he has a heart condition.
The real experiment is to see if the subject will shock the “student” to death at the “scientist’s” urging.
No actual shocks are administered, but the “student” is a good actor, making out that he is in terrible pain and then suddenly going silent, etc.
Before the experiment, Milgram polled various people, both students and “experts” in psychology, and pretty much everyone agreed that virtually no one would administer all of the shocks, even when pressured by the “scientist.”
In Milgram’s first set of experiments, 65 percent (26 of 40) of experiment participants administered the experiment’s final massive 450-volt shock, though many were very uncomfortable doing so; at some point, every participant paused and questioned the experiment; some said they would refund the money they were paid for participating in the experiment. Throughout the experiment, subjects displayed varying degrees of tension and stress. Subjects were sweating, trembling, stuttering, biting their lips, groaning, digging their fingernails into their skin, and some were even having nervous laughing fits or seizures. (bold mine)
I’m skeptical about the seizures, but the rest sounds about right. Resisting one’s own instinctual desire to obey–or putting the desire to obey in conflict with one’s other desires–creates great emotional discomfort.
So while on my walk today, I got to thinking about various potential implications of the hippocampal theory of time preference.
The short version if you don’t want to read yesterday’s post is that one’s degree of impulsivity / ability to plan / high or low time preference seems to be mediated by an interaction between the nucleus accumbens, which seems to be a desire center, and the hippocampus, which handles a lot of IQ-related tasks like learning new things and tracking objects through space. Humans with hippocampal damage become amnesiacs; rats with the connection between their nucleus accumbens and hippocampus severed lose their ability to delay gratification even for superior rewards, becoming slaves to instant gratification.
So, my suspicion:
Relatively strong hippocampus => inhibition of the nucleus accumbens => low time preference.
Relatively weak hippocampus => uninhibited nucleus accumbens => high time preference (aka impulsivity.)
Also: strong hippocampus => skill at high-IQ tasks.
(Other theories on the subject: Intelligent people make lots of money and so marry attractive people, resulting in a general correlation between IQ and attractiveness; there is something about eating too much or the particular foods being eaten that causes brain degeneration.)
People generally claim that overweight people lack “willpower.” Note that I am not arguing about willpower; willpower is only a tiny part of the equation.
The skinny people I know do not have willpower. They just do not have big appetites. They are not sitting there saying, “OMG, I am so hungry, but I am going to force myself not to eat right now;” they just don’t actually feel that much hunger.
The fat people I know have big appetites. They’ve always had big appetites. Some of them have documented large appetites going back to infancy. Sure, their ability to stay on a diet may be directly affected by willpower, but they’re starting from a fundamentally different hunger setpoint.
So what might be going on is just a matter of whether the hippocampus or nucleus accumbens happens to be dominant. Where the NA is dominant, the person feels hunger (and all desires) quite strongly. Where the hippocampus is dominant, the person simply doesn’t feel as much hunger (or other desires.)
That a strong hippocampus also leads to high IQ may just be, essentially, a side effect of this trade-off between the two regions.
We might expect, therefore, to see higher inhibition in smart people across a range of behaviors–take socializing, sex, and drug use. *Wanders off to Google*
So, first of all, it looks like there’s a study that claims that higher IQ people do more drugs than lower IQ people. Since the study only looks at self-reported drug use, and most people lie about their illegal drug use, I consider this study probably not very useful; also, drug use is not the same as drug addiction, and there’s a big difference between trying something once and doing it compulsively.
I am reminded here of a story about P. A. M. Dirac, one of my favorite scientists:
“An anecdote recounted in a review of the 2009 biography tells of Werner Heisenberg and Dirac sailing on an ocean liner to a conference in Japan in August 1929. “Both still in their twenties, and unmarried, they made an odd couple. Heisenberg was a ladies’ man who constantly flirted and danced, while Dirac—’an Edwardian geek’, as biographer Graham Farmelo puts it—suffered agonies if forced into any kind of socialising or small talk. ‘Why do you dance?’ Dirac asked his companion. ‘When there are nice girls, it is a pleasure,’ Heisenberg replied. Dirac pondered this notion, then blurted out: ‘But, Heisenberg, how do you know beforehand that the girls are nice?'”” (from the Wikipedia.)
Folks speculate that Dirac was autistic; obviously folks don’t speculate such things about Heisenberg.
I have previously speculated that autism may be a side effect of the recent evolution of high math IQ, and the current theory implies a potential correlation between various ASDs and inhibition.
The atypical gamma response to contextual modulation that we identified can be seen as the link between the behavioral output (atypical visual perception) and the underlying brain mechanism (an imbalance in excitatory and inhibitory neuronal processing). The impaired inhibition–excitation balance is suggested to be part of the core etiological pathway of ASD (Ecker et al., 2013). Gamma oscillations emerge from interactions between neuronal excitation and inhibition (Buzsaki and Wang, 2012), are important for neuronal communication (Fries, 2009), and have been associated with e.g., perceptual grouping mechanisms (Singer, 1999).
“It has been suggested that the restricted, stereotyped and repetitive behaviours typically found in autism are underpinned by deficits of inhibitory control. … Following sham, adults with autism relative to controls had reduced activation in key inhibitory regions of inferior frontal cortex and thalamus, but increased activation of caudate and cerebellum. However, brain activation was modulated in opposite ways by depletion in each group. Within autistic individuals depletion upregulated fronto-thalamic activations and downregulated striato-cerebellar activations toward control sham levels, completely ‘normalizing’ the fronto-cerebellar dysfunctions. The opposite pattern occurred in controls. Moreover, the severity of autism was related to the degree of differential modulation by depletion within frontal, striatal and thalamic regions. Our findings demonstrate that individuals with autism have abnormal inhibitory networks, and that serotonin has a differential, opposite, effect on them in adults with and without autism. Together these factors may partially explain the severity of autistic behaviours and/or provide a novel (tractable) treatment target.”
This may not have anything at all to do with the hippocampus-NA system, of course.
““What we found in animal models and others have found postmortem in schizophrenic patients is that the hippocampus is lacking a certain type of GABA-ergic [GABA-producing] neuron that puts the brakes on the system,” says Grace. “What we’re trying to do is fix the GABA system that’s broken and, by doing that, stabilize the system so the dopamine system responses are back to normal, so that we can actually fix what’s wrong rather than trying to patch it several steps downstream.””
Wow, I made it through two whole posts on the brain without mentioning the amygdala even once.
Time Preference isn’t sexy and exciting, like anything related to, well, sex. It isn’t controversial like IQ and gender. In fact, most of the ink spilled on the subject isn’t even found in evolutionary or evolutionary psychology texts, but over in economics papers about things like interest rates that no one but economists would want to read.
So why do I think Time Preference is so important?
Time Preference (aka future time orientation, time discounting, delay discounting, temporal discounting,) is the degree to which you value having a particular item today versus having it tomorrow. “High time preference” means you want things right now, whereas “low time preference” means you’re willing to wait.
A relatively famous test of Time Preference is to offer a child a cookie right now, but tell them they can have two cookies if they wait 10 minutes. Some children take the cookie right now, some wait ten minutes, and some try to wait ten minutes but succumb to the cookie right now about halfway through.
Obviously, many factors can influence your Time Preference–if you haven’t eaten in several days, for example, you’ll probably not only eat the cookie right away, but also start punching me until I give you the second cookie. If you don’t like cookies, you’ll have no trouble waiting for the second one, but you won’t have much use for it, either. Etc. But all these things held equal, your basic inclination toward high or low time preference is probably biological–and by “biological,” I mean, “mostly genetic.”
The scientists train rats to touch pictures with their noses in return for sugar cubes. Picture A gives them one cube right away, while picture B gives them more cubes after a delay. If the delay is too long or the reward too small, the rats just take the one cube right away. But there’s a sweet spot–apparently 4 cubes after a short wait—where the rats will figure it’s worth their while to tap picture B instead of picture A.
But if you snip the connection between the rats’ hippocampi and nucleus accumbenses, suddenly they lose all ability to wait for sugar cubes and just eat their sugar cubes right now, like a pack of golden retrievers in a room full of squeaky toys. They become completely unable to wait for the better payout of four sugar cubes, no matter how much they might want to.
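The rats’ trade-off can be sketched with a standard hyperbolic discounting model. (This is my own toy illustration, not the study’s actual analysis; the discount rate `k` and the delays are made-up numbers.)

```python
def discounted_value(amount, delay, k):
    """Hyperbolic discounting: the subjective value of a reward
    that arrives `delay` seconds from now. Higher k = more impulsive."""
    return amount / (1 + k * delay)

def chooses_delayed(small_now, large_later, delay, k):
    """True if the delayed, larger reward beats the immediate one."""
    return discounted_value(large_later, delay, k) > small_now

# An intact rat (modest k) waits for 4 cubes after a short delay...
print(chooses_delayed(1, 4, delay=5, k=0.5))    # True: 4/3.5 ≈ 1.14 > 1
# ...but not after a long one.
print(chooses_delayed(1, 4, delay=30, k=0.5))   # False: 4/16 = 0.25 < 1
# Severing the hippocampus-accumbens link acts like an enormous k:
print(chooses_delayed(1, 4, delay=5, k=100.0))  # False: any delay destroys the value
```

On this picture, the lesioned rats haven’t stopped wanting four cubes; their effective `k` has simply blown up, so anything delayed is worth approximately nothing.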
So we know that this connection between the hippocampus and the nucleus accumbens is vitally important to your Time Orientation, though I don’t know what other modifications, such as low hippocampal volume or a small nucleus accumbens, would do.
So what do the hippocampus and nucleus accumbens do?
According to the Wikipedia, the hippocampus plays an important part in inhibition, memory, and spatial orientation. People with damaged hippocampi become amnesiacs, unable to form new memories. There is a pretty direct relationship between hippocampus size and memory, as documented primarily in old people:
“There is, however, a reliable relationship between the size of the hippocampus and memory performance — meaning that not all elderly people show hippocampal shrinkage, but those who do tend to perform less well on some memory tasks. There are also reports that memory tasks tend to produce less hippocampal activation in elderly than in young subjects. Furthermore, a randomized-control study published in 2011 found that aerobic exercise could increase the size of the hippocampus in adults aged 55 to 80 and also improve spatial memory.” (wikipedia)
Amnesiacs (and Alzheimer’s patients) also get lost a lot, which seems like a perfectly natural side effect of not being able to remember where you are, except that rat experiments show something even more interesting: specific cells that light up as the rats move around, encoding data about where they are.
“Neural activity sampled from 30 to 40 randomly chosen place cells carries enough information to allow a rat’s location to be reconstructed with high confidence.” (wikipedia)
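To make that reconstruction idea concrete, here is a toy decoder. (A simplified sketch of the general approach, not the method those experiments actually used: the track length, tuning-curve widths, and cell positions are all invented, and I use a least-squares fit as a stand-in for a proper Poisson maximum-likelihood decoder.)

```python
import math

# Hypothetical 1-D track with five place cells tuned to different positions.
place_fields = [10.0, 30.0, 50.0, 70.0, 90.0]  # preferred position (cm) per cell
WIDTH = 15.0                                    # tuning-curve width (cm)

def expected_rate(cell_center, position, peak=20.0):
    """Gaussian tuning curve: firing rate a cell should show at a position."""
    return peak * math.exp(-((position - cell_center) ** 2) / (2 * WIDTH ** 2))

def decode_position(observed_rates, resolution=1.0):
    """Return the position whose predicted rates best match the observed ones."""
    best_pos, best_err = None, float("inf")
    pos = 0.0
    while pos <= 100.0:
        err = sum((expected_rate(c, pos) - r) ** 2
                  for c, r in zip(place_fields, observed_rates))
        if err < best_err:
            best_pos, best_err = pos, err
        pos += resolution
    return best_pos

# Simulate the rat sitting at 42 cm and decode from the resulting rates:
rates = [expected_rate(c, 42.0) for c in place_fields]
print(decode_position(rates))  # recovers a position near 42.0
```

With real, noisy spike counts you need more cells to pin the position down, which is the point of the quoted figure: 30–40 cells suffice.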
According to Wikipedia, the Inhibition function theory is a little older, but seems like a perfectly reasonable theory to me.
“[Inhibition function theory] derived much of its justification from two observations: first, that animals with hippocampal damage tend to be hyperactive; second, that animals with hippocampal damage often have difficulty learning to inhibit responses that they have previously been taught, especially if the response requires remaining quiet as in a passive avoidance test.”
This is, of course, exactly what the scientists found when they separated the rats’ hippocampi from their nucleus accumbenses–they lost all ability to inhibit their impulses in order to delay gratification, even for a better payout.
In other words, the hippocampus lets you learn, track the movement of objects through space (spatial reasoning), and inhibit your impulses–that is, it is directly involved in both IQ and Time Preference.
Dopaminergic input from the VTA modulate the activity of neurons within the nucleus accumbens. These neurons are activated directly or indirectly by euphoriant drugs (e.g., amphetamine, opiates, etc.) and by participating in rewarding experiences (e.g., sex, music, exercise, etc.). …
The shell of the nucleus accumbens is involved in the cognitive processing of motivational salience (wanting) as well as reward perception and positive reinforcement effects. Particularly important are the effects of drug and naturally rewarding stimuli on the NAc shell because these effects are related to addiction. Addictive drugs have a larger effect on dopamine release in the shell than in the core. The specific subset of ventral tegmental area projection neurons that synapse onto the D1-type medium spiny neurons in the shell are responsible for the immediate perception of the rewarding property of a stimulus (e.g., drug reward). …
The nucleus accumbens core is involved in the cognitive processing of motor function related to reward and reinforcement. Specifically, the core encodes new motor programs which facilitate the acquisition of a given reward in the future.
So it sounds to me like the point of the nucleus accumbens is to learn “That was awesome! Let’s do it again!” or “That was bad! Let’s not do it again!”
Together, the nucleus accumbens + hippocampus can learn “4 sugar cubes in a few seconds is way better than 1 sugar cube right now.” Apart, the nucleus accumbens just says, “Sugar cubes! Sugar cubes! Sugar cubes!” and jams the lever that says “Sugar cube right now!” and there is nothing the hippocampus can do about it.
What distinguishes humans from all other animals? Our big brains, intellects, or impressive vocabularies?
It is our ability to acquire new knowledge and use it to plan and build complex, multi-generational societies.
Ants and bees live in complex societies, but they do not plan them. Monkeys, dolphins, squirrels, and even rats can plan for the future, but only humans plan and build cities.
Even the hunter-gatherer must plan for the future; a small tendril only a few inches high is noted during the wet season, then returned to in the dry, when it is little more than a withered stem, and the water-storing root beneath it harvested. The farmer facing winter stores up grain and wood; the city engineer plans a water and sewer system large enough to handle the next hundred years’ projected growth.
All of these activities require the interaction between the hippocampus and nucleus accumbens. The nucleus accumbens tells us that water is good, grain is tasty, fire is warm, and that clean drinking water and flushable toilets are awesome. The hippocampus reminds us that the dry season is coming, and so we should save–and remember–that root until we need it. It reminds us that we will be cold and hungry in winter if we don’t save our grain and spend hours and hours chopping wood right now. It reminds us that not only is it good to organize the city so that everyone can have clean drinking water and flushable toilets right now, but that we should also make sure the system will keep working even as new people enter the city over time.
Disconnect these two, and your ability to plan goes down the drain. You eat all of your roots now, devour your seed corn, refuse to chop wood, and say, well, yes, running water would be nice, but that would require so much planning.
As I have mentioned before, I think European IQ (and probably that of a few other groups whose histories I’m just not familiar enough with to comment on) increased quite a bit in the past thousand years or so, and not just because the Catholic Church banned cousin marriage. During this time, manorialism became a big deal throughout Western Europe, and the people who exhibited good impulse control, worked hard, delayed gratification, and were able to accurately calculate the long-term effects of their actions tended to succeed (that is, have lots of children) and pass on their clever traits to their children. I suspect that selective pressure to “be a good manorial employee” was particularly strong in Germany (and possibly Japan, now that I think about it), resulting in the Germanic rigidity that makes them such good engineers.
Nothing in the manorial environment directly selected for engineering ability, higher math, large vocabularies, or really anything that we mean when we normally talk about IQ. But I do expect manorial life to select for those who could control their impulses and plan for the future, resulting in a run-away effect of increasingly clever people constructing increasingly complex societies in which people had to be increasingly good at dealing with complexity and planning to survive.
Ultimately, I see pure mathematical ability as a side effect of being able to accurately predict the effects of one’s actions and plan for the future (eg, “It will be an extra long winter, so I will need extra bushels of corn,”) and the ability to plan for the future as a side effect of being able to accurately represent the path of objects through space and remember lessons one has learned. All of these things, ultimately, are the same operations, just oriented differently through the space-time continuum.
Since your brain is, of course, built from the same DNA code as the rest of you, we would expect brain functions to have some amount of genetic heritability, which is exactly what we find:
“A meta-analysis of twin, family and adoption studies was conducted to estimate the magnitude of genetic and environmental influences on impulsivity. The best fitting model for 41 key studies (58 independent samples from 14 month old infants to adults; N=27,147) included equal proportions of variance due to genetic (0.50) and non-shared environmental (0.50) influences, with genetic effects being both additive (0.38) and non-additive (0.12). Shared environmental effects were unimportant in explaining individual differences in impulsivity. Age, sex, and study design (twin vs. adoption) were all significant moderators of the magnitude of genetic and environmental influences on impulsivity. The relative contribution of genetic effects (broad sense heritability) and unique environmental effects were also found to be important throughout development from childhood to adulthood. Total genetic effects were found to be important for all ages, but appeared to be strongest in children. Analyses also demonstrated that genetic effects appeared to be stronger in males than in females.”
“Shared environmental effects” in a study like this means “the environment you and your siblings grew up in, like your household and school.” In this case, shared effects were unimportant–that means that parenting had no effect on the impulsivity of adopted children raised together in the same household. Non-shared environmental influences are basically random–you bumped your head as a kid, your mom drank during pregnancy, you were really hungry or pissed off during the test, etc., and maybe even cultural norms.
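The meta-analysis’s variance decomposition can be written out explicitly (the numbers below come straight from the quoted abstract; the rest is just arithmetic on them):

```python
# Variance components for impulsivity, from the quoted meta-analysis.
additive_genetic      = 0.38  # A: additive genetic effects
non_additive_genetic  = 0.12  # D: dominance / non-additive genetic effects
shared_environment    = 0.00  # C: family environment -- found unimportant
nonshared_environment = 0.50  # E: unique environment (plus measurement error)

# Broad-sense heritability sums both kinds of genetic effect:
broad_heritability = additive_genetic + non_additive_genetic
print(round(broad_heritability, 2))  # 0.5 -- half the variance is genetic

# The four components partition all of the variance:
total = broad_heritability + shared_environment + nonshared_environment
print(round(total, 2))  # 1.0
```

Note that the 0.50 “non-shared environment” bucket is a residual: it absorbs measurement error along with genuinely random influences, so the genetic share is, if anything, a floor.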
So your ability to plan for the future appears to be part genetic, and part random luck.
One of the theories that undergirds a large subset of my thoughts on how brains work is the idea that Disgust is a Real Thing.
I don’t just mean a mild aversion to things that smell bad, like overturned port-a-potties or that fuzzy thing you found growing in the back of the fridge that might have been lasagna, once upon a time. Even I have such aversions.
I mean reactions like screaming and looking like you are about to vomit upon finding a chicken heart in your soup; gagging at the sight of trans people or female body hair; writhing and waving your hands while removing a slug from your porch; or the claim that talking about rats at the dinner table puts you off your meal. Or more generally, people claiming, “That’s disgusting!” or “What a creep!” about things or people that obviously aren’t even stinky.
There is a parable about a deaf person watching people dance to music he can’t hear and assuming that the people have all gone mad.
For most of my life, I assumed these reactions were just some sort of complicated schtick people put on, for totally obtuse reasons. It was only about a year ago that I realized, in a flash of insight, that this disgust is a real thing that people actually feel.
I recently expressed this idea to a friend, and they stared at me in shock. (That, or they were joking.) We both agreed that chicken hearts are a perfectly normal thing to put in soup, so at least I’m not the only one confused by this.
This breakthrough happened as a result of reading a slew of neuro-political articles that I can’t find now, and it looks like the site itself might be gone, which makes me really sad. I’ve linked to at least one of them before, which means that now my old links are dead, too. Damn. Luckily, it looks like Wired has an article covering the same or similar research: Primal Propensity for Disgust Shapes Political Positions.
“The latest such finding comes from a study of people who looked at gross images, such as a man eating earthworms. Viewers who self-identified as conservative, especially those opposing gay marriage, reacted with particularly deep disgust. … Disgust is especially interesting to researchers because it’s such a fundamental sensation, an emotional building block so primal that feelings of moral repugnance originate in neurobiological processes shared with a repugnance for rotten food.”
So when people say that some moral or political thing is, “disgusting,” I don’t think they’re being metaphorical; I think they actually, literally mean that the idea of it makes them want to vomit.
Which raises the question: Why?
Simply put, I suspect that some of us have more of our brain space devoted to processing disgust than others. I can handle lab rats–or pieces of dead lab rats–without any internal reaction, I don’t care if there are trans people in my bathroom, and I suspect my sense of smell isn’t very good. My opinions on moral issues are routed primarily through what I hope are the rational, logic-making parts of my brain.
By contrast, people with stronger disgust reactions probably have more of their brain space devoted to disgust, and so are routing more of their sensory experiences through that region, and so feel strong, physical disgust in reaction to a variety of things, like people with different cultural norms than themselves. Their moral reasoning comes from a more instinctual place.
It is tempting to claim that processing things logically is superior to routing them through the disgust regions, but sometimes things are disgusting for good, evolutionarily sound reasons. Having an instinctual aversion to rats is not such a bad thing, given that they have historically been disease vectors. Most of our instincts exist to protect and help us, after all.
Maybe the Uncanny Valley has nothing to do with avoiding sick/dead people, maybe nothing to do with anything specifically human-oriented at all, but with plain-ol’ conceptual category violations? Suppose you are trying to divide some class of reality into two discrete categories, like “plants” and “animals” or “poetry” and “prose”. Edge cases that don’t fit neatly into either category may be problematic, annoying, or otherwise troubling. Your brain tries to cram something into Category A, then a new data point comes along, and you switch to cramming it into Category B. Then more data and back to A. Then back to B. This might happen even at a subconscious level, flicking back and forth between two categories you normally assign instinctively, like human and non-human, forcing you to devote brain power to something that’s normally automatic. This is probably stressful for the brain.
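A toy way to picture that “flicking back and forth”: classify repeated noisy readings of a stimulus into two categories and count how often the verdict flips. Clear cases barely ever flip; an edge case sitting on the boundary flips constantly. (Entirely my illustration, with made-up parameters.)

```python
import random

def classification_switches(true_value, boundary=0.5, noise=0.1,
                            trials=200, seed=0):
    """Classify noisy readings of a stimulus as category A or B and
    count how often the label flips between consecutive readings."""
    rng = random.Random(seed)
    switches, previous = 0, None
    for _ in range(trials):
        reading = true_value + rng.gauss(0, noise)
        label = "A" if reading < boundary else "B"
        if previous is not None and label != previous:
            switches += 1
        previous = label
    return switches

# A clearly-category-A stimulus (0.1, far from the 0.5 boundary) barely flips...
print(classification_switches(0.1))
# ...while an uncanny-valley edge case sitting right on the boundary
# flips on roughly half of all readings.
print(classification_switches(0.5))
```

If the brain pays some cost per switch, edge cases are literally more expensive to look at, which would fit the “stressful for the brain” intuition above.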
In some cases, edge cases may be inconsequential and people may just ignore them; in some cases, though, group membership is important–people seem particularly keen on arguments about peoples’ inclusion in various human groups, hence accusations that people are “posers” or otherwise claiming membership they may not deserve.
Some people may prefer discrete categories more strongly than others, and so be more bothered by edge cases; other people may be more mentally flexible or capable of dealing with a third category labeled “edge cases”. It’s also possible that some people do not bother with discrete categories at all.
It would be interesting to test people’s preference for discrete categories, and then see if this correlates with disgust at humanoid robots or any particular political identities.
It would also be interesting to see if there are ways to equip people with different conceptual paradigms for dealing with data that better accommodate edge cases; a “Core vs. Periphery” approach may be better in some cases than discrete categories, for example.