Tribalism, for good or ill (mostly ill)

The difficulty with modern politics is that it is stupid. Stupid, cultish, and insane.

Let’s use a recent example: Esquire ran a cover article about a white male teen entitled “American Boy,” and at least a handful of people reacted with the kind of vitriol that makes alt-right conspiracy theorists point and yell “See? See? We told you so!”

For example:

[Screenshot of Jemele’s tweet]

Since when has “the cover of Esquire” been a “we”?

Just a few of the responses to Jemele’s Tweet, which has over 48 thousand likes:

So let me get this right, @esquire can’t put any other color person on their cover during the ENTIRE month of #BlackHistoryMonth  !?!?

All during black history month. They know what they’re doing. All press is Good press

I can’t even believe that! Especially during Black History Month? I mean it’s not right to begin with but it’s completely ridiculous this month! The least they could have done was cover me! I’m the biggest black sheep there ever was ask anyone! So kidding…Sry,I know, NOT FUNNY!

During Black History Month no less. Just don’t get it at all.

What the hell?!!!! Was there some type of urgency? Some clamoring from the masses, a cultural void that needed to be filled that warranted the commissioning of this article?! WtF

Seriously?!? Just the title of this article made me throw up in my mouth a little

Okay, new rule: You’re not allowed to talk about single people on Valentine’s Day, colon cancer in October, food during Ramadan, or jam during the entire month of March, because March is National Celery Month.  Also, the second week of July is Nude Recreation Week, so consider yourselves forewarned.

By the way, this is the March issue of Esquire, not February. (As far as I can tell, the last time Esquire published a February issue in the US, it featured a black man–Pharrell–on the cover.)

Ironically, I agree, strongly, with the folks who say we need to teach non-white history–the history of Africa, Asia, Oceania, and the rest of the world.

It’s not a pretty history. It involves cannibals. If they’re right that those who fail to learn about history are destined to repeat it, then we’re in for a lot of trouble.

Humans are fundamentally tribal creatures, even when they pretend to themselves that they aren’t. It’s part of our psychology; it’s part of how we understand the world and process threats. Human history is largely the history of one tribe of hairless apes bashing another tribe of hairless apes with increasingly advanced rocks. When we understand history, we realize that our current travails are more of the same old, same old, just fought with new technology.

Tribalism makes sense if you rewind the clock a hundred years or so to before the invention of the car, plane, and television. When most of your dealings were with members of your own community, and your own community was small enough that you knew a good portion of the people in it, “tribalism” was just regular life.

The tie that divides: Cross‐national evidence of the primacy of partyism:

Using evidence from Great Britain, the United States, Belgium and Spain, it is demonstrated in this article that in integrated and divided nations alike, citizens are more strongly attached to political parties than to the social groups that the parties represent. In all four nations, partisans discriminate against their opponents to a degree that exceeds discrimination against members of religious, linguistic, ethnic or regional out‐groups. This pattern holds even when social cleavages are intense and the basis for prolonged political conflict. Partisan animus is conditioned by ideological proximity; partisans are more distrusting of parties furthest from them in the ideological space. The effects of partisanship on trust are eroded when partisan and social ties collide. In closing, the article considers the reasons that give rise to the strength of ‘partyism’ in modern democracies.

In practice, partyism is mostly racialism. 90% of blacks vote Democratic; the majority of whites vote Republican. 

The problem is that these days, we don’t live in communities of a few hundred people. We don’t just interact with members of our own tribe.

The Esquire controversy is old-fashioned tribalism dressed up in modern language–really, all SJW politics is just tribalism dressed up in new words. There is nothing “social” or “justicey” about disliking an interview with a teenager; Jemele and the thousands of people agreeing with her aren’t objecting to the quality of the article or the lad’s personality, but expressing a very simple emotion: You aren’t part of my tribe, therefore I don’t like you.

But who cares about any of this? 48,000 likes is a lot of likes, but then, there are >300 million people in this country. 48k isn’t even 1% of them.

Yet I think it is important. For starters, this low-level sniping is pervasive. Whether you’re on the internet or just watch TV, people who don’t like you are everywhere.

20 years ago, I wouldn’t have had any idea whether Jemele liked Esquire’s latest cover article or not–and I wouldn’t have cared, because I don’t know her. She doesn’t live near me, doesn’t work with me, doesn’t run in any of my social circles. She could hang out with her friends, talking about how much they hate this dumb Esquire cover, and I could hang out with my friends, talking about squids and Aztec sacrifice, and never the twain would meet.

Now they do.

Every group has memes about how awesome the group is and how much other groups suck. (If they didn’t, well, they’d stop existing pretty quickly.) Jocks insult nerds; nerds talk shit about jocks. But normally we keep our opinions within our own groups, where they function to increase group cohesion and punish deviators.

This is your brain on tribalism.

Insane tribalism.

Contrary to what some sociologists claim, bringing people into contact with people whom they don’t like seems to increase conflict, not decrease it. Familiarity breeds contempt.

Being constantly exposed to other people’s ideas about how awful you are seems to have two effects on people: either they agree (become infected–pozzed, if you will) that they are awful and start trying to help the people who hate them (this might be a kind of Stockholm Syndrome); or they react negatively, become immune, and hate back.

The former I refer to as the “suicide meme.” More on this later, but in short, the suicide meme happens when you absorb the memes of people who want you dead.

To the gazelle, the lion is a monster; to the lion, the gazelle is lunch. Neither of them benefits from adopting the other’s ideas.

To the grass, of course, the gazelle is a torturer and the lion a perfect gentleman.

There is something ironic about getting lectured about the treatment of Latinos by someone who is literally named “Cortez.” (Hernán Cortés was the Spanish conquistador who conquered Mexico and destroyed the Aztec empire; he apparently also fathered a lot of children in the process.)

Quoting Cortez (the modern one):

We must have respect for… human rights and respect for the right of human mobility. Because it is a right. [Applause] Because we are standing on native land. And Latino people are descendants of native people. And we cannot be told, and criminalized, simply because for our identity or our status. Period.

There are multiple lies in this statement. “Human mobility” isn’t a right. Not across national borders. If you think it is, go try it on the North Korea border and report back on how it works. There is no country in the world that recognizes the right of non-citizens to traipse across its borders whenever they please.

Second, we are not standing on native land. This was filmed at the US Capitol. This is AMERICAN land. It is American land because Americans killed the people who used to live here.

Every single piece of land in the entire world belongs to the person who actually has the ability to physically enforce their claim to that land. China is a country today and Tibet isn’t because the PRC has physical control over Tibet and Tibet doesn’t. Italy is a country because no other country has the ability to take control of Italy’s land. Bhutan is a country because it controls the borders of Bhutan.

Third, while Latinos are descended from “native” peoples, they aren’t descended from Native Americans. They’re descended from natives of other countries that are not America. White people are also “native” peoples by this logic; they are descended from the native peoples of Europe. Asians are descended from the native peoples of Asia. Blacks are descended from the native peoples of Africa. Etc. That Latinos are descended from peoples of the North and South American continents is not meaningful in itself–Germans and Poles are both native to Europe, but that doesn’t mean Germans have some inherent right to invade Poland.

Fourth, you certainly can be criminalized for your “status” (as an illegal immigrant). In fact, immigration status is exactly what is being criminalized.

There are many other issues with this speech–like the part where AOC blames ICE for the death of a little girl they were actually trying to save (despite the fact that our border patrol has no moral obligation to spend American taxpayers’ money to save the lives of non-Americans), and her promotion of the idea that non-citizens deserve “Constitutional protections” (fact: they already have constitutional protections, under the constitutions of the countries they are citizens of; they don’t have constitutional protections in countries they are not citizens of)–but the most troubling thing about this speech is that Ocasio-Cortez is an actual member of Congress.

Ocasio-Cortez’s comments would make sense over on the Mexican side of the border–a Mexican advocating for things that benefit Mexicans is perfectly reasonable.

But for a member of the American government to advocate that Americans have no right to control their own borders and assert that the territory of America actually belongs to someone else–including non-citizens–is straight up treason.

Phase Change and Revolutions

Phase changes don’t usually happen instantly, as in videos of supercooled water freezing at a single tap, but they are sudden from the perspective of temperature. You don’t see a few ice crystals forming at 40 degrees, a few large chunks of ice at 38, the water halfway frozen at 34, and the whole thing solid at 32. No, at 34 degrees, water is liquid. Water is liquid all the way from 212 down to 33 degrees Fahrenheit, and then suddenly, without warning, it transforms at 32 (even if the freezing takes a little time). By 31.9, it’s a solid chunk.

One of the enduring mysteries of political science is “Why did no one in political science predict the fall of the Soviet Union?” One of the other enduring mysteries of political science is “Why on earth did the Soviet Union fall when it did? Why not earlier–or later?”

Political regimes don’t fall very often. We can look around the world today and see a number of repressive states–North Korea, Venezuela, Iran–that don’t look like they’re doing a very good job of taking care of their citizens, yet their governments stay firmly in power. Why don’t these regimes fall? Or will they–someday?

I propose that regime change is much like phase changes–difficult to predict because they simply cannot happen before a specific point, and they happen so rarely that we don’t have enough data to test exactly which conditions are necessary to make them occur, much less figure out whether those conditions currently exist within a foreign society.

There are probably two main things necessary for something like the fall of the Soviet Union:

First, a majority of the people with guns–the armed forces in most countries, but a lot of civilians in the US–need to stop believing in the regime.

Second, the majority that no longer believes in the regime’s legitimacy has to know that it is a majority.

Since opposing the regime will usually get you shot, no one wants to be the first guy to say that he doesn’t believe in the regime. Since opposing the regime will get you shot, even people who oppose the regime will go ahead and shoot comrades who have opposed the regime in fear that if they don’t, they will also be shot.

100% of people in a system can oppose the regime and the regime will still keep charging on, shooting dissenters, if no one knows that everyone else is also opposed to the regime.

So how does regime change actually happen?

First, you need crazy people willing to charge, like Don Quixote, at windmills and regimes. These people will usually get shot, which is why they need to be crazy. But if enough people have already decided that the regime is not particularly legitimate, there is a possibility that one of them will decide to be lenient. They will quietly decide not to shoot the revolutionary.

The fall of the Berlin Wall happened almost by accident–new regulations regarding round-trip travel had been passed in East Germany, and they were read aloud on the radio in a way that made it sound like anyone who wanted was now allowed through the checkpoints into West Berlin, effective immediately. Thousands of people showed up within hours, demanding to be let through (after all, it had been officially announced, as far as they knew). The overwhelmed border guards didn’t want to shoot that many people, so after a bit of conferring, they gave in and let everyone through.

There were plenty of cracks already in the Eastern Bloc’s hold on power, but like a tap to the side of a bottle of supercooled water, this one little mistake caused a knowledge cascade. The thousands of people who showed up at the checkpoint (and didn’t get shot) now knew that there were thousands of other people who agreed with them–and soon that knowledge spread to everyone else in East Germany and the rest of the Eastern Bloc.

The difficulty with predicting when a regime will fall is the difficulty of predicting a random tap to the bottle or a little dust for the first crystals to form around–and that’s assuming you have a state that has already lost legitimacy in the eyes of most of its citizens. If it hasn’t, that same tap does nothing–and unfortunately, states are much more complicated than bottles of water, and so involve a lot more variables than just temperature.
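The cascade dynamic described above can be sketched as a toy threshold model (in the spirit of Granovetter’s classic work on riots and thresholds). This is purely illustrative–the function name and threshold numbers are made up for the example, not drawn from any data: each citizen joins the protest only once the number of visible protesters exceeds their personal threshold, and the “crazy” Don Quixotes are those with a threshold of zero.

```python
def simulate_protest(thresholds):
    """Toy cascade model: each citizen joins once the number of visible
    protesters meets their personal threshold. Citizens with threshold 0
    (the Don Quixotes) protest no matter what anyone else does.
    Returns the final number of protesters at equilibrium."""
    protesting = sum(1 for t in thresholds if t <= 0)  # the first movers
    while True:
        # Everyone looks at the crowd and joins if it's big enough for them.
        new_count = sum(1 for t in thresholds if t <= protesting)
        if new_count == protesting:  # no one new joined: equilibrium
            return protesting
        protesting = new_count

# A population of 100 where almost everyone opposes the regime, but each
# person needs to see one more protester than the last before joining:
full_cascade = simulate_protest(list(range(100)))   # thresholds 0,1,...,99
# The same population minus the single zero-threshold "crazy":
no_spark = simulate_protest(list(range(1, 101)))
print(full_cascade, no_spark)  # prints: 100 0
```

Note how fragile the outcome is: removing one first mover–or shooting him before anyone sees him–drops the result from total revolution to nothing, which is one way to think about why regime collapse is so hard to predict from the distribution of private opinion alone.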

It’s getting late, but I think this suggests that the key for most regimes (even unofficial ones) is not to cultivate legitimacy (that’s hard if, say, the peasants are starving), but to control what people know and make sure they’re convinced that if they step out of line, they will get shot. So long as shooting is on the table, even people who don’t like the regime will go along and enforce it by shooting dissidents.

The whole point of purity spirals and outrage mobs, then, may be to signal to people thinking of defecting from ideologies within a culture that “if you cross this line, you will get [metaphorically] shot.” It doesn’t even matter whether the people being destroyed by the mob actually did anything wrong, so long as the mob is effective at destruction.

Is the News Bad for You?

Whilst traveling through the darkest depths of the unexplored heartland of America, I encountered a mysterious beast I had only glimpsed in the many years since I left home at 18: 

The News. 

What was this flag-waving, headshot-zooming, sound-effects-ridden creature, and why did it care that someone in Ohio didn’t like “Baby It’s Cold Outside”?

I really can’t stay (but baby there’s meth outside) 

Extended viewing of (or listening to) what now passes for “news” on the 24-hour cable channels strikes me as bad for one’s mental health (and possibly physical health, as well).

What is so bad about the news? 

First, it is a never-ending stream of disasters, and disasters naturally tend to make people anxious and worried. But the disasters featured on the news are rarely relevant to your own life–most of them take place on the other side of the country, if not the planet. 

In the past couple of months, you probably heard about wildfires in California, the War in Yemen, ISIS, someone shooting up a Christmas Market in Europe, Ebola in Africa, protests in France, children being gassed at the border, and of course the dire threat of secular Christmas Carols being taken off the radio in Ohio and rap music in Russia. 

Chances are good that none of these things directly affects you. 

How many news stories can you think of that actually occurred in your local community and have some relevance to your actual life?

The Guardian presents it better than I can:

News is irrelevant. Out of the approximately 10,000 news stories you have read in the last 12 months, name one that – because you consumed it – allowed you to make a better decision about a serious matter affecting your life, your career or your business. The point is: the consumption of news is irrelevant to you.

Strong words from a newspaper

At the very best, you are spending time and energy worrying about stuff that doesn’t actually affect you, while not learning about stuff–such as your neighbors’ thoughts on pest control–that actually does affect you. 

And at worst, you are making yourself ill by feeding your mind constant disaster footage: 

Witnessing images of extreme violence: a psychological study of journalists in the newsroom

User Generated Content – photos and videos submitted to newsrooms by the public – has become a prominent source of information for news organisations. Journalists working with uncensored material can frequently witness disturbing images for prolonged periods. How this might affect their psychological health is not known and it is the focus of this study. …

Regression analyses revealed that frequent (i.e. daily) exposure to violent images independently predicted higher scores on all indices of the Impact of Event Scale-revised, the BDI-II and the somatic and anxiety subscales of the GHQ-28.  …

The present study, the first of its kind, suggests that frequency rather than duration of exposure to images of graphic violence is more emotionally distressing to journalists working with User Generated Content material. 

If being exposed to the news is bad for journalists, it’s probably bad for you, too: 

The Relationship between Self-Report of Depression and Media Usage:

In this study, we tested if self-report of depression (SRD), which is not a clinically based diagnosis, was associated with increased internet, television, and social media usage by using data collected in the Media Behavior and Influence Study (MBIS) database (N = 19,776 subjects). … These analyses found that SRD rates were in the range of published rates of clinically diagnosed major depression. It found that those who tended to use more media also tended to be more depressed, and that segmentation of SRD subjects was weighted toward internet and television usage, which was not the case with non-SRD subjects, who were segmented along social media use. This study found that those who have suffered either economic or physical life setbacks are orders of magnitude more likely to be depressed, even without disproportionately high levels of media use. However, among those that have suffered major life setbacks, high media users—particularly television watchers—were even more likely to report experiencing depression, which suggests that these effects were not just due to individuals having more time for media consumption.

One woman I know got so worked up reading/watching articles and news reports about the Catholic Priest Scandals that she spent a week weeping and is now undergoing therapy for PTSD. 

Stupid? Yes. Nevertheless, people are doing this to themselves. 

Another woman I know recently announced that she thought “God was weeping” because things have gotten so bad in the world. After some questioning, she claimed that wars and third-world poverty are “worse than ever”–despite the fact that global poverty is actually at the lowest level it’s ever been, and she lived through WWII.

Does watching the news make you any better informed? 

No. 

From Pew, Public Knowledge of Current Affairs Little Changed by News and Information Revolutions

Since the invention of 24-hour cable news networks, general knowledge of political matters has gone down slightly.

If you must follow the news, do so in print or listen to PBS/NPR–these are the sources with a track record of not actively making you dumber.

Looking at those who get their news primarily through radio and television, for most, following the news more or less closely had no reliable relation to whether respondents believed clear evidence had been found that al-Qaeda and Saddam Hussein were working closely together. Fox News was the exception. Those who followed the news closely were far more likely to have this misperception. Among those who did not follow the news at all 42% had the misperception, rising progressively at higher levels of attention to 80% among those who followed the news very closely. On the other hand, those respondents who get their news primarily from print sources were less likely to have this misperception if they were following the Iraq situation more closely. Of those not following the news closely, 49% had the misperception–declining to 32% among those who followed the news very closely.

More on this phenomenon.

This analysis is harsh on Fox, but keep in mind that it is specifically looking at misperceptions related to a war championed by Republicans–we might find a similar effect for different networks if we were looking for misperceptions related to something championed by a Democratic president. 

The news makes money by convincing you to watch it–that is, it has a self-interest in being addictive, not in making your life better. The constant parade of anxiety-inducing disasters is one way they capture your attention; the nausea-inducing zooming camera pans and waving flags are another. Some news personalities are actually good at their jobs despite the distractions, but on average, the more boring stations and media do a better job of conveying actual facts, probably because they are less distracting.

The news is one-way communication: it is a voice constantly talking to you, not you talking back (well, you can talk back, but it can’t hear you.) Would you spend so much time listening, in real life, to someone who never listened to you? 

There is something insidious about a voice that talks constantly at you, that decides what is and isn’t concerning, that uses psychological manipulation to keep you listening, and that never listens to you.

None of which is to say that the news media is intentionally evil or trying to cause harm–these things are just natural side effects of the way media works–the network that convinces more people to watch makes more money than the one that doesn’t. You have a natural desire to hear about disasters, because before the invention of mass media, almost all of them were actually relevant to your life. This also need not condemn any particular news channel–these factors apply to them all.

You can always tell someone who pays too much attention to the news, because their attention shifts radically from week to week. One day, Russia–a nation with a GDP smaller than South Korea’s and a per capita GDP almost as low as Mexico’s–is a critical threat to democracy; the next week Saudi Arabia, a dictatorship well known for things like “funding 9-11,” “women must wear burkas and can’t drive,” and “starving Yemeni children,” is suddenly catapulted from “not a problem” to “defcon 12.”

A week later, all of these things are forgotten because Trump paid off a prostitute, which is clearly a pressing national problem, right up there with Monica Lewinsky’s blue dress. 

Remember, the European witch-hunt hysteria was spread via the newly-adopted printing press, which made it easy for reports of broom-riding, devil worshiping, and livestock metamorphosis to spread from town to town. The equally absurd Satanic Daycare Scare of the 1980s was also spread by the News, this time on TV and radio.

There are probably some good sides to the news–it’s probably worthwhile to be informed about the world on some level, and it’s certainly useful to know what’s going on in your local area or economic trends that affect your business. 

But be careful about letting strangers determine what you know and what you care about.

Racism OCD and Other Political Neuroses 

 

[Image. Source: Evangelion/blog thereupon]

In his post on the Chamber of Guf, Slate Star Codex discussed a slate of psychiatric conditions where the sufferer becomes obsessed with not sinning in some particular way. In homosexual OCD, for example, the sufferer becomes obsessed with fear that they are homosexual or might have homosexual thoughts despite not actually being gay; people with incest OCD become paranoid that they might have incestuous thoughts, etc. Notice that in order to be defined as OCD, the sufferers have to not actually be gay or interested in sex with their relatives–this is paranoia about a non-existent transgression. Scott also notes that homosexual OCD is less common among people who don’t think of homosexuality as a sin, but these folks have other paranoias instead.

The “angel” in this metaphor is the selection process by which the brain decides which thoughts, out of the thousands we have each day, to focus on and amplify; “Guf” is the store of all available thoughts. Quoting Scott:

I studied under a professor who was an expert in these conditions. Her theory centered around the question of why angels would select some thoughts from the Guf over others to lift into consciousness. Variables like truth-value, relevance, and interestingness play important roles. But the exact balance depends on our mood. Anxiety is a global prior in favor of extracting fear-related thoughts from the Guf. Presumably everybody’s brain dedicates a neuron or two to thoughts like “a robber could break into my house right now and shoot me”. But most people’s Selecting Angels don’t find them worth bringing into the light of consciousness. Anxiety changes the angel’s orders: have a bias towards selecting thoughts that involve fearful situations and how to prepare for them. A person with an anxiety disorder, or a recent adrenaline injection, or whatever, will absolutely start thinking about robbers, even if they consciously know it’s an irrelevant concern.

In a few unlucky people with a lot of anxiety, the angel decides that a thought provoking any strong emotion is sufficient reason to raise the thought to consciousness. Now the Gay OCD trap is sprung. One day the angel randomly scoops up the thought “I am gay” and hands it to the patient’s consciousness. The patient notices the thought “I am gay”, and falsely interprets it as evidence that they’re actually gay, causing fear and disgust and self-doubt. The angel notices this thought produced a lot of emotion and occupied consciousness for a long time – a success! That was such a good choice of thought! It must have been so relevant! It decides to stick with this strategy of using the “I am gay” thought from now on. …

Politics has largely replaced religion for how most people think of “sin,” and modern memetic structures seem extremely well designed to amplify political sin-based paranoia, as articles like “Is your dog’s Halloween costume racist?” get lots of profitable clicks and get shared widely across social media platforms, whether by fans or opponents of the article.

Both religions and political systems have an interest in promoting such concerns, since they also sell the cures–forgiveness and salvation for the religious; economic and social policies for the political. This works best if it targets a very common subset of thoughts, like sexual attraction or dislike of random strangers, because you really can’t prevent all such thoughts, no matter how hard you try.

The original Tiny House
Medieval illustration of anchorite cell

Personal OCD is bad enough; a religious sufferer obsessed with their own moralistic sin may feel compelled to retreat to a monastery or wall themselves up to avoid temptation. If a whole society becomes obsessed, though, widespread paranoia and social control may result. (Society can probably be modeled as a meta-brain.)

I propose that our society, due to its memetic structure, is undergoing OCD-inducing paranoia spirals where the voices of the most paranoid are being allowed to set political and moral directions. Using racism as an example, it works something like this:

First, we have what I’ll call the Aristotelian Mean State: an appropriate, healthy level of in-group preference that people would not normally call “racism.” This Mean State is characterized by liking and appreciating one’s own culture, generally preferring it to others, but admitting that your culture isn’t perfect and other cultures have good points, too.

Deviating too far from this mean is generally considered sinful–in one direction, we get “My culture is the best and all other cultures should die,” and too far in the other, “All other cultures are best and my culture should die.” One of these is called “racism,” the other “treason.”

When people get Racism OCD, they become paranoid that even innocuous or innocent things–like dog costumes–could be a sign of racism. In this state, people worry about even normal, healthy expressions of ethnic pride, just as a person with homosexual OCD worries about completely normal appreciation of athleticism or admiration of a friend’s accomplishments.

Our culture then amplifies such worries by channeling them through Tumblr and other social media platforms where the argument “What do you mean you’re not against racism?” does wonders to break down resistance and convince everyone that normal, healthy ethnic feelings are abnormal, pathological racism and that sin is everywhere, you must constantly interrogate yourself for sin, you must constantly learn and try harder not to be racist, etc. There is always some new area of life that a Tumblrista can discover is secretly sinful, though you never realized it before, spiraling people into new arenas of self-doubt and paranoia.

As for the rest of the internet, those not predisposed toward Racism OCD are probably predisposed toward Anti-Racism OCD. Just as people with Racism OCD see racism everywhere, folks with Anti-Racism OCD see anti-racism everywhere. These folks think that even a normal, healthy level of not wanting to massacre the outgroup is pathological treason. (This is probably synonymous with Treason OCD, but is currently in a dynamic relationship with the perception that anti-racists are everywhere.)

Since there are over 300 million people in the US alone–not to mention 7 billion in the world–you can always find some case to justify paranoia. You can find people who say they merely have a healthy appreciation for their own culture but really do have murderous attitudes toward the out-group–something the out-group, at least, has good reason to worry about. You can find people who say they have a healthy attitude toward their own group, but still act in ways that could get everyone killed. You can find explicit racists and explicit traitors, and you can find lots of people with amplified, paranoid fears of both.

These two paranoid groups, in turn, can feed off each other, each pointing at the other and screaming that everyone trying to promote “moderatism” is actually the worst sinner of the other side in disguise, and that therefore moderatism itself is evil. This feedback loop gives us things like the “It’s okay to be white” posters, which manage to make an entirely innocuous statement sound controversial, due to our conviction that people only make innocuous statements because they are trying to make the other guy sound like a paranoid jerk who disputes innocuous statements.

Racism isn’t the only sin devolving into OCD–we can also propose Rape OCD, in which people become paranoid about behaviors like flirting, kissing, or even thinking about women. There are probably other OCDs (trans OCD? food contamination OCD?), but these are the big ones that come to mind right now.

Thankfully, Scott also proposes that awareness of our own psychology may allow us to recognize and moderate ourselves:

All of these can be treated with the same medications that treat normal OCD. But there’s an additional important step of explaining exactly this theory to the patient, so that they know that not only are they not gay/a pedophile/racist, but it’s actually their strong commitment to being against homosexuality/pedophilia/racism which is making them have these thoughts. This makes the thoughts provoke less strong emotion and can itself help reduce the frequency of obsessions. Even if it doesn’t do that, it’s at least comforting for most people.

The question, then, is how do we stop our national neuroses from causing disasters?

Identity Politics and Identity Voting

Our society has managed to simultaneously discover identity politics and that identity groups tend to vote together:

[Chart: 2016 youth voting by race]

[Chart: 2016 US exit polls by race]

“We’re just like you! Make society friendlier to us!”

“Okay, but why do you all vote for the party I don’t like?”

[Chart: voting by race among self-identified conservatives]
Source: Audacious Epigone

Even when you control for ideology, ethnic voting still shows up. This graph shows only conservatives: conservative blacks are still extremely unlikely to vote for Republicans. Conservative Asians and Hispanics do, on balance, vote Republican (in this particular poll), but about 40% of them still voted for the Democrat.

Non-Jewish whites are the most loyal Republican voters, even among self-professed conservatives.

[Chart: change in eligible voters by race]

The problem with immigration is that we live in a democracy.

Republicans now regard immigration as a massive attempt to demographically swamp the electorate by bringing in new voters who’ll vote Democrat, because, intentional or not, that is the functional result of immigration.

Identity politics and awareness of identity-based voting are incompatible. “We’re just like you, we just vote for everything you hate,” is not a winning argument.

I’m reminded of the time Julian Assange naively asked why his enemies had all taken to putting ((())) around their names and got called an anti-Semite in return:

[Screenshot: Assange’s tweet]

Polite society often requires politely not noticing or not pointing out other people’s differences. A store clerk helps a customer find a “flattering dress” without mentioning the customer’s obesity. A teacher helps students catch up in school without calling them stupid. And we don’t mention that different ethnic groups have different political ideas.

“They’re just like us,” and “I don’t see race,” are both lies people tell to try to get along in large, multi-ethnic societies. Obviously ethnic and racial differences are easy to see, and different groups have different cultures with their own norms, values, and beliefs. Chinese culture is different from Ghanaian culture is different from Chilean culture is different from gay culture is different from video game culture, and so on.

The pretty little lie of democracy is the idea that people vote based on rational, well-thought out ideas about how government should be run. In reality, they vote their self-interest, and most people see their self-interest lying in solidarity with others in their ethnic group. Even when they aren’t voting pure self-interest, cultural similarities still result in voting similarities.

The insistence that people must see race was accompanied by increased demands for racially-based benefits/an end to racially-based harms–that is, the change was triggered by a perception that being more racially aware would benefit minorities. But this leads, in turn, to increased visibility of ethnic voting patterns, explicit vote-counting by ethnicity, and ethnic voting conflict.

I see three ways to resolve the conflict:

  1. Obfuscate. Pretend ethnic differences don’t exist and scream “racist” whenever someone notices them.
  2. Admit that ethnic differences are real and that everyone is voting in their own self-interest.
  3. Admit that ethnic differences are real and get rid of voting.

Option One is the Left’s strategy. These are the folks who insist that “race is a social construct” but at the same time that “white fragility” is real and that “whiteness needs to be abolished.” They’ll also threaten to send you to gulag for stating that Affirmative Action exists because blacks score worse than whites on the SAT. (True story.)

Option Two is the Alt-Right strategy. If the Pittsburgh shooter’s motive remains opaque to you, here it is: the majority of US Jews vote Democrat and support immigration policies that will continue giving Democrats a majority.

Option Three is NeoReaction, aka neocameralism. Remove voting and you remove the incentive to shoot each other over demographic cheating (perceived or otherwise).

(This blog favors Option Three, the strategy that doesn’t involve shooting each other, but we understand why others might not.)

ETA: Perhaps there ought to be an Option Four: People stop arguing so much and try harder to get along. I’m not sure exactly how this would come about, but I know there are people who believe in it.

The Endless Ratiocination of the Dysphoric Mind

Begin

My endless inquiries made it impossible for me to achieve anything. Moreover, I get to think about my own thoughts of the situation in which I find myself. I even think that I think of it, and divide myself into an infinite retrogressive sequence of ‘I’s who consider each other. I do not know at which ‘I’ to stop as the actual, and as soon as I stop, there is indeed again an ‘I’ which stops at it. I become confused and feel giddy as if I were looking down into a bottomless abyss, and my ponderings result finally in a terrible headache. –Møller, Adventures of a Danish Student

Møller’s Adventures of a Danish Student was one of Niels Bohr’s favorite books; it reflected his own difficulties with cycles of ratiocination, in which the mind protects itself against conclusions by watching itself think.

I have noticed a tendency on the left, especially among the academic-minded, to split the individual into sets of mental twins–one who is and one who feels that it is; one who does and one who observes the doing.

Take the categories of “biological sex” and “gender.” Sex is defined as the biological condition of “producing small gametes” (male) or “producing large gametes” (female) for the purpose of sexual reproduction. Thus we can talk about male and female strawberry plants, male and female molluscs, male and female chickens, male and female Homo sapiens.

(Indeed, the male-female binary is remarkably common across sexually reproducing plants and animals–it appears that the mathematics of a third sex simply don’t work out, unless you’re a mushroom. How exactly sex is created varies by species, which makes the stability of the sex-binary all the more remarkable.)

And for the first 299,945 years or so of our existence, most people were pretty happy dividing humanity into “men,” “women,” and the occasional “we’re not sure.” People didn’t understand why or how the biology worked, but it was a functional enough division.

In 1955, John Money decided we needed a new term, “gender,” to describe, as Wikipedia puts it, “the range of characteristics pertaining to, and differentiating between, masculinity and femininity.” Masculinity is further defined as “a set of attributes, behaviors, and roles associated with boys and men;” we can define “femininity” similarly.

So if we put these together, we get a circular definition: gender is a range of characteristics of the attributes of males and females. Note that attributes are already characteristics. They cannot further have characteristics that are not already inherent in themselves.

But really, people invoke “gender” to speak of a sense of self, a self that reflexively looks at itself and perceives itself as possessing traits of maleness or femaleness; the thinker who must think of himself as “male” before he can act as a male. After all, you cannot walk without first desiring to move in a direction; how can you think without first knowing what it is you want to think? It is a cognitive splitting of the behavior of the whole person into two separate, distinct entities: an acting body, possessed of biological sex, and a perceiving mind, that merely perceives and “displays” gender.

But the self that looks at itself looking at itself is not real–it cannot be, for there is only one self. You can look at yourself in the mirror, but you cannot stand outside of yourself and be simultaneously yourself; there is only one you. The alternative, a fractured consciousness, is a symptom of mental disorder and treated with chlorpromazine.

Robert Oppenheimer was once diagnosed with schizophrenia–dementia praecox, as they called it then. Whether he had it or simply confused the therapist by talking about wave/particle dualities is another matter.

Then there are the myriad variants of the claim that men and women “perform femininity” or “display masculinity” or “do gender.” They do not claim that people are feminine or act masculine–such conventional phrasing assumes the existence of a unitary self that is, perceives, and acts. Rather, they posit an inner self that possesses no inherent male or female traits, for whom masculinity and femininity are only created via the interaction of their body and external expectations. In this view, women do not buy clothes because they have some inherent desire to go shopping and buy pretty things, but because society has compelled them to do so in order to comply with external notion of “what it means to be female.” The self who produces large gametes is not the self who shops.

The biological view of human behavior states that most humans engage in a variety of behaviors because similar behaviors contributed to the evolutionary success of our ancestors. We eat because ancestors who didn’t think eating was important died. We jump back when we see something that looks like a spider because ancestors who didn’t got bitten and died. We love cute things with big eyes because they look like babies because we are descended mostly from people who loved their babies.

Sometimes we do things that we don’t enjoy but rationalize will benefit us, like work for an overbearing boss or wear a burka, but most “masculine” and “feminine” behaviors fall into the category of things people do voluntarily, like “compete at sports” or “gossip with friends.” The fact that more men than women play baseball and more women than men enjoy gossiping with friends has nothing to do with an internal self attempting to perform gender roles and everything to do with the challenges ancestral humans faced in reproducing.

But whence this tendency toward ratiocination? I can criticize it as a physical mistake, but does it reflect an underlying psychological reality? Do some people really perceive themselves as a self separate from themselves, a meta-self watching the first self acting in particular manners?

Here is a study that found that folks with more cognitive flexibility tended to be more socially liberal, though economic conservatism/liberalism didn’t particularly correlate with cognitive flexibility.

I find that if I work hard, I may achieve a state of zen, an inner tranquility in which the endless narrative of thoughts coalesces for a moment and I can just be. Zen is flying down a straight road at 80 miles an hour on a motorcycle; zen is working on a math problem that consumes all of your attention; zen is dancing until you only feel the music. The opposite of zen is lying in bed at 3 AM, staring at the ceiling, thinking of all of your failures, unable to switch off your brain and fall asleep.

Dysphoria is a state of unease. Some people have gender dysphoria; a few report temporal dysphoria. It might be better described as disconnection, a feeling of being eternally out of place. I feel a certain dysphoria every time I surface from reading some text of anthropology, walk outside, and see cars. What are these metal things? What are these straight, right-angled streets? Everything about modern society strikes me as so artificial and counter to nature that I find it deeply unsettling.

It is curious that dysphoria itself is not discussed more in the psychiatric literature. Certainly a specific form or two receives a great deal of attention, but not the general sense itself.

When things are in place, you feel tranquil and at ease; when things are out of place, you are agitated, always aware of the sense of crawling out of your own skin. People will try any number of things to turn off the dysphoria; a schizophrenic friend reports that enough alcohol will make the voices stop, at least for a while. Drink until your brain shuts up.

But this is only when things are out of place. Healthy people seek a balance between division and unity. Division of the self is necessary for self-criticism and improvement; people can say, then, “I did a bad thing, but I am not a bad person, so I will change my behavior and be better.” Metacognition allows people to reflect on their behavior without feeling that their self is fundamentally at threat, but too much metacognition leads to fragmentation and an inability to act.

People ultimately seek a balanced, unified sense of self.

It is said that not everyone has an inner voice, a meta-self commenting on the acting self, and some have more than one:

My previous blogs have observed that some people–women with bulimia nervosa, for example–have frequent multiple simultaneous experiences, but that multiple experience is not frequent in the general population. …

Consider inner speech. Subjects experienced themselves as innerly talking to themselves in 26% of all samples, but there were large individual differences: some subjects never experienced inner speech; other subjects experienced inner speech in as many as 75% of their samples. The median percentage across subjects was 20%.

It’s hard to tell what people really experience, but certainly there is a great deal of variety in people’s internal experiences. Much of thought is not easily describable. Some people hear many voices. Some cannot form mental images:

I think the best way I can describe my aphantasia is to say that I am unaware of anything in my mind except these categories: i) direct sensory input, ii) unheard words that carry thoughts, iii) unheard music, iv) a kind of invisible imagery, which I can best describe as a sensation of pictures that are in a sense too faint to see, v) emotions, and vi) thoughts which seem too fast to exist as words. … I see what is around me, unless my eyes are closed, when all is always black. I hear, taste, smell and so forth, but I don’t have the experience people describe of “hearing” a tune or a voice in their heads. Curiously, I do frequently have a tune going around in my head; all I am lacking is the direct experience of “hearing” it.

The quoted author is, despite his lack of internal imagery, quite intelligent, with a PhD in physics.

Some cannot hear themselves think at all.

I would like to know if there is any correlation between metacognition, ratiocination, and political orientations–I have so far found a little on the subject:

We find a relationship between thinking style and political orientation and that these effects are particularly concentrated on social attitudes. We also find it harder to manipulate intuitive and reflective thinking than a number of prominent studies suggest. Priming manipulations used to induce reflection and intuition in published articles repeatedly fail in our studies. We conclude that conservatives—more specifically, social conservatives—tend to be dispositionally less reflective, social liberals tend to be dispositionally more reflective, and that the relationship between reflection and intuition and political attitudes may be more resistant to easy manipulation than existing research would suggest.

And a bit more:

… Berzonsky and Sullivan (1992) cite evidence that individuals higher in reported self-reflection also exhibit more openness to experience, more liberal values, and more general tolerance for exploration. As noted earlier, conservatives tend to be less open to experience, more intolerant of ambiguity, and generally more reliant on self-certainty than liberals. That, coupled with the evidence reported by Berzonsky and Sullivan, strongly suggests conservatives engage in less introspective behaviors.

Following an interesting experiment looking at people’s online dating profiles, the authors conclude:

Results from our data support the hypothesis that individuals identifying themselves as “Ultra Conservative” exhibit less introspection in a written passage with personal content than individuals identifying themselves as “Very Liberal.” Individuals who reported a conservative political orientation often provided more descriptive and explanatory statements in their profile’s “About me and who I’m looking for” section (e.g., “I am 62 years old and live part time in Montana” and “I enjoy hiking, fine restaurants”). In contrast, individuals who reported a liberal political orientation often provided more insightful and introspective statements in their narratives (e.g., “No regrets, that’s what I believe in” and “My philosophy in life is to make complicated things simple”).

The ratiocination of the scientist’s mind can ultimately be stopped by delving into that most blessed of substances, reality (or as close to it as we can get). There is, at base, a fundamentally real thing to delve into, a thing which makes ambiguities disappear. Even a moral dilemma can be resolved with good enough data. We do not need to wander endlessly within our own thoughts; the world is here.

End

 

Stereotypes, Expertise, and Class

Tom Nichols’s book, The Death of Expertise, has a passage that inspired a tangent that I’d like to discuss separately from my main review:

“You can’t generalize like that!” Few expressions are more likely to arise in even a mildly controversial discussion. People resist generalizations–boys tend to be like this, girls tend to be like that–because we all want to believe we’re unique and can’t be pigeonholed that easily.

What most people usually mean when they object to “generalizing,” however, is not that we shouldn’t generalize, but that we shouldn’t stereotype, which is a different issue. The problem in casual discourse is that people often don’t understand the difference between stereotypes and generalizations, and this makes conversation, especially between experts and laypeople, arduous and exhausting. –Tom Nichols

Nichols brings up a good point, but is wrong about stereotypes–to generalize, most stereotypes are true, and people object to stereotypes and generalizations for the exact same reasons. Or as Psychology Today puts it:

“Stereotypes” have a bad name, and everybody hates stereotypes. But what exactly is a stereotype?

What people call “stereotypes” are what scientists call “empirical generalizations,” and they are the foundation of scientific theory. That’s what scientists do; they make generalizations. Many stereotypes are empirical generalizations with a statistical basis and thus on average tend to be true. If they are not true, they wouldn’t be stereotypes. … 

[Chart: SAT scores by race and parental income]

We only call them “stereotypes” when we don’t like the information they convey. “African Americans have more melanin, on average, than non-African Americans,” is not a controversial statement; “African Americans score worse on the SAT, on average, than non-African Americans” is controversial, even though both are empirically true.

Nichols grasps for this false distinction between “generalizations” and “stereotypes” because Nichols sees himself as a Good Person, not an Evil Racist, and only Evil Racists use stereotypes. (We know that because the liberals said so.)

Except for the uncomfortable fact that most stereotypes are basically true, otherwise people wouldn’t bother to have them. This leaves Nichols in the uncomfortable position of eternally trying to explain to people why it’s okay when he, an expert, says mean things about the Russians, but totally not okay when ordinary people say mean things about the Russians.

I can almost hear Nichols objecting, “It can’t be racist if it’s true,” to which I raise the average Somali IQ:

In the US, the cutoff for “mental retardation” or “intellectual disability” is set at an IQ of 70 or below. The average measured Somali IQ is in the low 70s. Almost half of Somalis would be, in the US, legally retarded.

The only way to dull the sting of this statement is to note that 1. of course the majority of Somalis aren’t retarded; 2. Somali migrants are heavily selected from the smarter end of the Somali population, because they’re the folks who were clever enough to escape; 3. Somali IQ is probably being depressed by terrible local conditions. Still, if you work for Google, I don’t recommend writing any memos trying to give a nuanced version of “Somalis have low average IQs.” For that matter, I don’t recommend writing that if you work anywhere except in an explicitly IQ-related job. If your coworkers are IQ experts, they might already be familiar with Somali IQs; otherwise you will get sacked immediately for being racist.
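To see where the “almost half” figure comes from, here is a minimal sketch, assuming IQ is normally distributed with the conventional standard deviation of 15 and taking a hypothetical mean of 72 as a stand-in for “low 70s” (the exact mean is an assumption for illustration, not a measured figure):

```python
from statistics import NormalDist

# Assumption for illustration: Somali IQ ~ Normal(mean=72, sd=15).
# The sd of 15 is the convention used by IQ scales; the mean of 72
# is a hypothetical stand-in for "low 70s."
somali_iq = NormalDist(mu=72, sigma=15)

# Share of the distribution falling below the US cutoff of 70:
share_below_cutoff = somali_iq.cdf(70)
print(f"{share_below_cutoff:.1%}")  # just under half
```

With a mean anywhere in the low 70s, the share falling below 70 stays in the 40-50% range, which is all the claim above requires.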

Everything is fine for experts if they promote ideas that people already believe or want to agree with. Saying that “Fast food is bad for you,” raises few hackles. Everyone knows that.

The findings of the now-discredited Implicit Association Test were widely touted because people wanted evidence that “everyone is a little bit racist,” (or at least that whites are all subconsciously racist, even the ones who say they aren’t.) The IAT was obviously bogus from the start, but confirmation bias and wanting it to be true led people to latch onto it.

When experts and common wisdom agree, all is well.

It’s when experts and lay-people disagree that problems arise. If the matter is purely scientific–What is the atomic weight of cesium?–people will usually defer. But if the matter is personal or involves deeply held religious beliefs, people resist. (We evolutionists have been dealing with this for a long time.)

Nichols gives the example, “Russians are more corrupt than Norwegians.” It’s absolutely true, unless we’re using some strange definition of “corrupt.” And Nichols, a guy who speaks Russian and has been studying Russia for decades, has the relevant expertise to make such a judgment. But normal people who don’t know anything about Russia (or Norway) get their hackles up because the statement sounds mean and contradicts their deeply held belief that all groups of people are morally and intellectually equal.

If the only difference between a stereotype and a generalization is that a stereotype offends the hearer, then “Russians are more corrupt than Norwegians” is a stereotype.

Life is hard for experts if they contradict things people deeply believe, but it’s even harder if they contradict things believed by their own social class.

Scientists studying evolution face criticism and disbelief from people who believe that humans were created by God from a ball of dirt on the sixth day of creation, but evolution is a “high class” belief and creationism is “low class,” so scientists face no loss of social standing by advocating for evolution and generally don’t even associate socially with creationists.

By contrast, a geneticist like Harvard’s David Reich, who recently admitted in the New York Times that “race” is biologically, even genetically real, is contradicting the beliefs of his own social class that “race is a social construct.” Harvard is full of people who believe creationist nonsense about the biology of men, women, and racial groups, but since these are high-class religious beliefs, Reich faces a loss of social standing by contradicting them and will have to actually deal with these people in real life.

Slate Star Codex recently posed the question “Can Things Be Both Popular and Silenced?”, a discussion of whether authors like Jordan Peterson, who has received a ton of media attention lately, sells millions of books, and is doing quite well for himself, can accurately be described as “silenced” in some way.

SSC briefly touches on social class and then moves on to other important matters, but I think social class really ties matters together. Peterson is a book-writing professional with a PhD in clinical psychology (I haven’t read any of his work), that is, an academic intellectual. Reich is a Harvard professor doing ground-breaking, amazing work. James Watson won a goddam Nobel prize. Bret Weinstein was a professor at Evergreen State. Etc. These are high-class academics in conflict with the rest of their social class, which can cause a great deal of anxiety, the loss of friends, and outright conflict, as when students at Evergreen State literally tried to hunt Professor Weinstein down with bats and tasers for the crime of not leaving campus on “no white people on campus day.” (Relevant posts on Weinstein and Evergreen; see also the James Damore incident at Google and the Christakis incident at Yale.)

A conservative person living in a conservative part of the country probably doesn’t lose much social standing for criticizing stupid things liberals believe, but someone in a liberal profession or social environment will. Even if he is actually an expert who is actually correct, he still faces hordes of ignorant people who are socially more powerful than he is and will happily punch him silent in defense of their religious beliefs. (The same is probably also true in reverse; I wouldn’t want to be openly pro-choice in a highly conservative workplace, for example.)

A lot of what gets called “silencing” is just class insecurity or conflict. Fox News rails against “the media” even though it is the media; more people watch Fox than listen to NPR, but NPR is high-class and Fox is low-class. It’s not that “the media” is liberal so much as that the upper class is liberal and the lower classes don’t like being looked down upon.

You can rail against Fox News or Infowars or whatever for being stupid, but this is a democracy, so if one side tries to be objective, correct, or have high-status experts, then the other side will rail against all of that. If one side tries to promote cute puppies, the other side will become the anti-cute-puppies party.

Experts run into trouble when their research leads them to believe things that fit with neither party, or only with the opposite social class from the one they run in. This is both very uncomfortable for the individual and hard to describe to outsiders.

Ultimately, I think class is far more important than we give it credit for.

Trying to be Smart: on bringing up extremely rare exceptions to prove forests don’t exist, only trees

When my kids don’t want to do their work (typically word problems in math,) they start coming up with all kinds of crazy scenarios to try to evade the question. “What if Susan cloned herself?” “What if Joe is actually the one driving the car, and he only saw the car pass by because he was looking at himself in a mirror?” “What if John used a wormhole to travel backwards in time and so all of the people at the table were actually Joe and so I only need to divide by one?” “What if Susan is actually a boy but her parents accidentally gave him the wrong name?” “What if ALIENS?”

After banging my head on the wall, I started asking, “Which is more likely: Sally and Susan are two different people, or Sally cloned herself, something no human has ever done before in the 300,000 years of Homo sapiens’ existence?” And sometimes they will, grudgingly, admit that their scenarios are slightly less likely than the assumptions the book is making.*

I forgive my kids, because they’re children. When adults do the same thing, I am much less sympathetic.

Folks on all sides of the political spectrum are probably guilty of this, but my inclinations/bubble lead me to encounter certain ones more often. Sex/gender is a huge one (even I have been led astray by sophistry on this subject, for which I apologize.)

Over in biology, sex is simply defined: Females produce large gametes. Males produce small gametes. It doesn’t matter how gametes are produced. It doesn’t matter what determines male or femaleness. All that matters is gamete size. There is no such thing (at least in humans) as a sex “spectrum”: reproduction requires one small gamete and one large gamete. Medium-sized gametes are not part of the process.

About 99.9% of people fit into the biological categories of “male” and “female.” An extremely small minority (<1%) have rare biological issues that interfere with gamete formation–people with Klinefelter’s, for example, are genetically XXY instead of XX or XY. People with Klinefelter’s are also infertile–unlike large gametes and small gametes, XXY isn’t part of a biological reproduction strategy. Like trisomy 21, it’s just an unfortunate accident in cell division.

In a mysterious twist, the vast majority of people have a “gender” identity that matches their biological sex. Even female athletes–women who excel at a stereotypically and highly masculine field–tend to identify as “women,” not men. Even male fashion designers tend to self-identify as men. There are a few people who identify as transgender, but in my personal experience, most of them are actually intersex in some way (e.g., a woman who has autism, a condition characterized as “extreme male brain,” may legitimately feel like she thinks more like a guy than a girl). Again, this is an extremely small percent of the population. For 99% of people you meet, normal gender assumptions apply.

So jumping into a conversation about “men” and “women” with “Well actually, ‘men’ and ‘women’ are just social constructs and gender is actually a spectrum and there are many different valid gender expressions–” is a great big NO.

Jumping into a discussion of women’s issues (like childbirth) with “Actually, men can give birth, too,” or the Women’s March with “Pussyhats are transphobic because some women have penises; vaginas don’t define what it means to be female,” is an even bigger NO, and I’m not even a fan of pussyhats.

Only biological females can give birth. That’s how the species works. When it comes to biology, leave things that you admit aren’t biology at the door. If a transgender man with a uterus gives birth to a child, he is still a biological female and we don’t need to confuse things by implying that someone gestated a fetus in his testicles. Over the millennia that humans have existed, a handful of people with some form of biological chimerism (basically, an internalized conjoined twin who never fully developed but ended up contributing an organ or two) who thought of themselves as male may have nonetheless given birth. These cases are so rare that you will probably never meet someone with them in your entire life.

Having lost a leg due to an accident (or 4 legs, in the case of a pair of conjoined twins) does not make “number of legs in humans” a spectrum ranging from 0-4. Humans have 2 legs; a few people have unfortunate accidents. Saying so doesn’t imply that people with 0 legs are somehow less human. They just had an accident.

In a conversation I read recently, Person A asserted that if two blue-eyed parents had a brown-eyed baby, the mother would be suspected of infidelity. A whole bunch of people immediately jumped on Person A, claiming he was scientifically ignorant and hadn’t paid attention in school–sadly, these overconfident people are actually the ones who don’t understand genetics, because blue eyes are recessive and thus two blue-eyed people can’t make a brown-eyed biological child. A few people, however, asserted that Person A was scientifically illiterate because there is an extremely rare brown-eyed gene that two blue-eyed people can carry, resulting in a brown-eyed child.

But this is not scientific illiteracy. The recessive brown-eyed gene is extremely rare, and both parents would have to carry it. Infidelity, by contrast, is much more common–not common, but more common than two parents both carrying recessive brown-eyed genes. Insisting that Person A is scientifically illiterate because of an extremely rare exception to the rule is ignoring statistics: statistically, the child is more likely to be non-biological than to have an extremely rare variant. Likewise, statistically, people’s gender and sex are far more likely to match than not.
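The statistical point can be made concrete with a toy comparison. Every number below is a rough illustrative assumption, not a measured rate–the point is only the relative size of the two explanations:

```python
# Toy comparison: given a brown-eyed child of two blue-eyed parents,
# which explanation is more likely? All rates are rough assumptions
# for illustration, not real measured values.

p_rare_variant = 1e-5   # assumed chance a couple carries a rare brown-eye
                        # variant AND passes it to the child
p_nonpaternity = 0.02   # assumed population non-paternity rate

# If the biological father is someone else, assume (roughly) a 50% chance
# he passes on a dominant brown-eye allele.
p_brown_given_nonpaternity = 0.5

p_explained_by_variant = p_rare_variant
p_explained_by_infidelity = p_nonpaternity * p_brown_given_nonpaternity

ratio = p_explained_by_infidelity / p_explained_by_variant
print(f"Infidelity explanation is ~{ratio:.0f}x more likely")
```

Even if you slide these made-up numbers around by an order of magnitude, the common explanation still dwarfs the rare one–which is the whole argument.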

Let’s look at immigration, another topic near and dear to everyone’s hearts. After Trump’s comments about Haiti came out (and let’s be honest, Haiti’s capital, Port-au-Prince, is one of the world’s largest cities without a functioning sewer system, so “shithole” is actually true), people began popping up with statements like “I’d rather a Ugandan immigrant who believes in American values than a socialist Norwegian.”

I, too, would rather have a Ugandan with American values than a socialist Norwegian. However, what percentage of Ugandans actually hold American values? Just a wild guess, but I suspect most Ugandans have Ugandan values. Most Ugandans probably think Ugandan culture is pretty nice and that Ugandan norms and values are the right ones to have–if they thought otherwise, they’d hold different values, and we’d be calling those Ugandan values instead.

Updated values chart!

While we’re at it, I suspect most Chinese people have Chinese values, most Australians have Australian values, most Brazilians hold Brazilian values, and most people from Vatican City have Catholic values.

I don’t support blindly taking people from any country, because some people are violent criminals just trying to escape conviction. But some countries are clearly closer to each other, culturally, than others, and thus have a larger pool of people who hold each other’s values.

(Even when people hold very different values, some values conflict more than others.)

To be clear: I’ve been picking on one side, but I’m sure both sides do this.

What’s the point? None of this is very complicated. Most people can figure out if a person they have just met is male or female instantly and without fail. It takes a very smart person to get confused by a few extremely rare exceptions into thinking that the broad categories don’t functionally exist.

Sometimes this obfuscation is compulsive–the person just wants to show how smart they are, or everyone around them is saying it, so they start repeating it. But most people seem perfectly capable of using probabilities in everyday life (“Sometimes the stoplight is glitched, but usually it isn’t, so I’ll assume the stoplight is functioning properly and obey it”). If someone suddenly seems incapable of distinguishing between extremely rare and extremely common events in the political realm, they are doing so on purpose or suffering severe cognitive dissonance.

 

*Oddly, I solved the problem by giving the kids harder problems. It appears that when their brains are actively engaged with trying to solve the problem, they don’t have time/energy left to come up with alternatives. When the material is too easy (or, perhaps, way too hard) they start trying to get creative to make things more interesting.

 

Logan Paul and the Algorithms of Outrage

Leaving aside the issues of “Did Logan Paul actually do anything wrong?” and “Is changing YouTube’s policies actually in Game Theorist’s interests?” Game Theorist makes a good point: while YouTube might want to say, for PR reasons, that it is doing something about big, bad, controversial videos like Logan Paul’s, it also makes money off those same videos. YouTube–like many other parts of the internet–is primarily click driven. (Few of us are paying money for programs on YouTube Red.) YouTube wants views, and controversy drives views.

That doesn’t mean YouTube wants just any content–a reputation for hosting a bunch of pornography would damage channels aimed at small children, because their parents would click elsewhere. But aside from the actual corpse, Logan’s video wasn’t the sort of thing that would drive away young viewers–they’d get bored of the non-cartoons talking to the camera long before the suicide even came up.

Logan Paul actually managed to hit a very sweet spot: controversial enough to draw in visitors (tons of them) but not so controversial that he’d drive away other visitors.

In case you’ve forgotten the controversy in a fog of other controversies, LP’s video about accidentally finding a suicide in the Suicide Forest was initially well-received, racking up thousands of likes and views before someone got offended and started up the outrage machine. Once the outrage machine got going, public sentiment turned on a dime and LP was suddenly the subject of a full two or three days of Twitter hate. The hate, of course, got YouTube more views. LP took down the video and posted an apology–which generated more attention. Major media outlets were now covering the story. Even Tablet managed to quickly come up with an article: Want a New Years Resolution? Don’t be Like Logan Paul.

And it worked. I passed up Tablet’s regular article on Trump and Bagels and Culture, but I clicked on that article about Logan Paul because I wanted to know what on earth Tablet had to say about LP, a YouTuber whom, 24 hours prior, I had never heard of.

And the more respectable (or at least highly-trafficked) news outlets picked up the story, the higher Logan’s videos rose on the YouTube charts. And as more people watched more of LP’s other videos, they found more things to be offended at. For example, once he ran through the streets of Japan holding a fish. A FISH, I tell you. He waved this fish at people and was generally very annoying.

I don’t like LP’s style of humor, but I’m not getting worked up over a guy waving a fish around.

So understand this: you are in an outrage machine. The purpose of the outrage machine is to drive traffic, which generates clicks, which generate ad revenue. There are probably whole websites (HuffPo, CNN) that derive a significant percentage of their profits from hate-clicks–that is, from intentionally posting incendiary garbage not because they believe it or think it is just or true or appeals to their base, but because they can get people to click on it in sheer shock or outrage.

Your emotions–your “emotional labor,” as the SJWs call it–are being turned into someone else’s dollars.

And the result is a country that is increasingly polarized. Increasingly outraged. Increasingly exhausted.

Step back for a moment. Take a deep breath. Get some fresh air. Ask yourself, “Does this really matter? Am I actually helping anyone? Will I remember this in a week?”

I’d blame the SJWs for the outrage machine–and really, they are good at running it–but I think it started with CNN and 24-hour news. You have to do something to fill that time. Then came Fox News, which was like CNN but more controversial, in order to lure viewers away from the more established channel. Now we have the interplay of Facebook, Twitter, HuffPo, online newspapers, YouTube, etc.–driven largely by automated algorithms designed to maximize clicks–even hate-clicks.
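The incentive problem can be sketched in a few lines. This is a hypothetical toy, not YouTube’s (or anyone’s) actual ranking algorithm, and the weights are made up–but it shows the core issue: a model that scores raw engagement cannot tell a hate-click from an approving one:

```python
# Toy ranking model: score videos purely by engagement volume.
# A hypothetical sketch -- real recommender systems are far more complex --
# but it illustrates the incentive: outrage interactions count the same
# as (or more than) any other kind.

def engagement_score(views, likes, dislikes, comments, watch_minutes):
    """All interactions raise the score; the model never asks WHY people clicked."""
    return views + 2 * likes + 2 * dislikes + 3 * comments + 0.5 * watch_minutes

# Two videos with identical views and watch time; one sparks a pile-on.
calm = engagement_score(views=10_000, likes=500, dislikes=20,
                        comments=80, watch_minutes=40_000)
outrage = engagement_score(views=10_000, likes=500, dislikes=4_000,
                           comments=6_000, watch_minutes=40_000)

assert outrage > calm  # the controversial video wins the ranking
```

Under any scoring rule that treats angry engagement as engagement, controversy is rewarded automatically–no one at the company has to want outrage for the system to promote it.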

The Logan Paul controversy is just one example out of thousands, but let’s take a moment and think about whether it really mattered. Some guy whose job description is “makes videos of his life and posts them on YouTube” was already shooting a video about his camping trip when he happened upon a dead body. He filmed the body, called the police, canceled his camping trip, downed a few cups of sake while talking about how shaken he was, and ended the video with a plea that people seek help and not commit suicide.

In between these events was laughter–I interpret it as nervous laughter in an obviously distressed person. Other people interpret this as mocking. Even if you think LP was mocking the deceased, I think you should be more concerned that Japan has a “Suicide Forest” in the first place.

Let’s look at a similar case: when three-year-old Alan Kurdi drowned, the photograph of his dead body appeared on websites and newspapers around the world–earning thousands of dollars for the photographers and news agencies. Politicians then used little Alan’s death to push particular political agendas–Hillary Clinton even talked about Alan Kurdi’s death in one of the 2016 election debates. Alan Kurdi’s death was extremely profitable for everyone making money off the photograph, yet no one got offended over this.

Why is it acceptable for photographers and media agencies to make money off a three-year-old boy who drowned because his father was a negligent fuck who didn’t put a life vest on him*, but not acceptable for Logan Paul to make money off a guy who chose to kill himself and leave his body hanging in public, where any random person could find it?

Elian Gonzalez, sobbing, torn at gunpoint from his relatives. (This photo won the 2001 Pulitzer Prize for Breaking News Photography.)

Let’s take a more explicitly political case. Remember when Bill Clinton and Janet Reno sent 130 heavily armed INS agents to the home of child refugee Elian Gonzalez’s relatives** so they could kick him out of the US and send him back to Cuba?

Now imagine Donald Trump sending SWAT teams after sobbing children. How would people react?

The outrage machine functions because people think it is good. It convinces people that it is casting light on terrible problems that need correcting. People are getting offended at things they wouldn’t have noticed if the outrage machine hadn’t told them to. You think you are serving justice; in reality, you are mad at a man for filming a dead guy and running around Japan with a fish. Jackass did worse, and it was on MTV for two years. Game Theorist wants more consequences for people like Logan Paul, but he doesn’t realize that anyone can get offended at just about anything. His own videos have graphic descriptions of small children being murdered (in videogame contexts, like Five Nights at Freddy’s, or “What would happen if the babies in Mario Kart were involved in real car crashes at racing speeds?”). I don’t find this “family friendly.” Sometimes I (*gasp*) turn off his videos as a result. Does that mean I want a Twitter mob to come destroy his livelihood? No. It means a Twitter mob could destroy his livelihood.

For that matter, as Game Theorist himself notes, the algorithm itself rewards and amplifies outrage–meaning that people are incentivized to manufacture completely false outrage against innocent people. Punishing one group of people harder because the algorithm encourages bad behavior in other people is cruel and does not solve the problem. Changing the algorithm would solve the problem–but the algorithm is what makes YouTube money.

In reality, the outrage machine is pulling the country apart–and I don’t know about you, but I live here. My stuff is here; my loved ones are here.

The outrage machine must stop.

*I remember once riding in an airplane with my father. As the flight crew explained that in the case of a sudden loss of cabin pressure, you should secure your own mask before assisting your neighbors, his response was a very vocal “Hell no, I’m saving my kid first.” Maybe not the best idea, but the sentiment is sound.

**When the boat Elian Gonzalez and his family were riding in capsized, his mother and her boyfriend put him in an inner tube, saving his life even though they drowned.

Apparently Most People Live in A Strange Time Warp Where Neither Past nor Future Actually Exist

Forget the Piraha. It appears that most Americans are only vaguely aware of these things called “past” and “future”:

Source: CNN poll conducted by SSRS

A majority of people now report that George W. Bush, whom they once thought was a colossal failure of a president, whose approval ratings bottomed out at 33% when he left office, was actually good. By what measure? He broke the economy, destabilized the Middle East, spent trillions of dollars, and got thousands of Americans and Iraqis killed.

Apparently the logic here is “Sure, Bush might have murdered Iraqi children and tortured prisoners, but at least he didn’t call Haiti a shithole.” We Americans have standards, you know.

He’s just a huggable guy.

I’d be more forgiving if Bush’s good numbers all came from 18-year-olds who were 10 when he left office and so weren’t actually paying attention at the time. I’d also be more forgiving if Bush had some really stupid scandals, like Bill Clinton–I can understand why someone might have given Clinton a bad rating in the midst of the Monica Lewinsky scandal, but looking back a decade later, might reflect that Monica didn’t matter much and that, as presidents go, Clinton was fine.

But if you thought invading Iraq was a bad idea back in 2008 then you ought to STILL think it is a bad idea right now.

Note: If you thought it was a good idea at the time, then it’s sensible to think it is still a good idea.

This post isn’t really about Bush. It’s about our human inability to perceive the flow of time and accurately remember the past and prepare for the future.

I recently texted a fellow mom: Would your kid like to come play with my kid? She texted back: My kid is down for a nap.

AND?

What about when the nap is over? I didn’t specify a time in the original text; tomorrow or next week would have been fine.

I don’t think these folks are trying to avoid me. They’re just really bad at scheduling.

People are especially bad at projecting current trends into the future. In a conversation with a liberal friend, he dismissed the idea that there could be any problems with demographic trends or immigration with, “That won’t happen for a hundred years. I’ll be dead then. I don’t care.”

An anthropologist working with the Bushmen noticed that they had to walk a long way each day between the watering hole, where the only water was, and the nut trees, where the food was. “Why don’t you just plant a nut tree near the watering hole?” asked the anthropologist.

“Why bother?” replied a Bushman. “By the time the tree was grown, I’d be dead.”

Of course, the tree would probably only take a decade to start producing, which is within even a Bushman’s lifetime, but even if it didn’t, plenty of people build up wealth, businesses, or otherwise make provisions to provide for their children–or grandchildren–after their deaths.

Likewise, current demographic trends in the West will have major effects within our lifetimes. Between the 1990 and 2010 censuses (twenty years), the number of Hispanics in the US more than doubled, from 22.4 million to 50.5 million. As a percentage of the overall population, they went from 9% to 16%–making them America’s largest minority group, as blacks constitute only 12.6%.
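As a sanity check on those census figures (22.4 million to 50.5 million over twenty years), the implied growth rate and doubling time are easy to compute:

```python
import math

# Hispanic population from the US census figures cited above.
pop_1990 = 22.4e6
pop_2010 = 50.5e6
years = 20

# Implied compound annual growth rate over that period.
r = (pop_2010 / pop_1990) ** (1 / years) - 1   # ~4.1% per year

# At that rate, how long until the population doubles again?
doubling_time = math.log(2) / math.log(1 + r)  # ~17 years

print(f"growth rate: {r:.1%}, doubling time: {doubling_time:.0f} years")
```

Whether the trend continues at that rate is of course an assumption; the arithmetic only shows that a twenty-year demographic shift of this size doubles again well within a single lifetime.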

If you’re a Boomer, then Hispanics were only 2-3% of the country during your childhood.

The idea that demographic changes will take a hundred years and therefore don’t matter makes as much sense as saying a tree that takes ten years to grow won’t produce within your lifetime and therefore isn’t worth planting.

Society can implement long-term plans–dams are built with hundred-year storms and floods in mind; building codes are written with hundred-year earthquake risks in mind–but most people seem to exist in a strange time warp in which neither the past nor future really exist. What they do know about the past is oddly compressed–anything from a decade to a century ago is mushed into a vague sense of “before now.” Take this article from the Atlantic on how Michael Brown (born in 1996) was shot in 2014 because of the FHA’s redlining policies back in 1943.

I feel like I’m beating a dead horse at this point, but one of the world’s most successful ethnic groups was getting herded into gas chambers in 1943. Somehow the Jews managed to go from being worked to death in the mines below Buchenwald (slave labor dug the tunnels where von Braun’s rockets were developed) to not getting shot by the police on the streets of Ferguson in 2014, 71 years later. It’s a mystery.

And in another absurd case, “Artist reverses gender roles in 50s ads to ‘give men a taste of their own sexist poison’,” because clearly advertisements from over half a century ago are a pressing issue, relevant to the opinions of modern men.

I’m focusing here on political matters because they make the news, but I suspect this is a true psychological trait for most people–the past blurs fuzzily together, and the future is only vaguely knowable.

Politically, there is a tendency to simultaneously assume the past–which continued until last Tuesday–was a long, dark, morass of bigotry and unpleasantness, and that the current state of enlightened beauty will of course continue into the indefinite future without any unpleasant expenditures of effort.

In reality, our species is, more or less, 300,000 years old. Agriculture is only 10,000 years old.

100 years ago, the last great bubonic plague epidemic (Yersinia pestis) was still going on. 10 million people died, including 119 Californians. 75 years ago, millions of people were dying in WWII. Sixty years ago, polio was still crippling children (my father caught it, suffering permanent nerve damage).

In the 1800s, Germany’s infant mortality rate was 50%; in 1950, Europe’s rate was over 10%; today, infant mortality in the developed world is below 0.5%; globally, it’s 4.3%. The death of a child has gone from a universal hardship to an almost unknown suffering.

100 years ago, only one city in the US–Jersey City–routinely disinfected its drinking water. (Before disinfection and sewers, drinking water was routinely contaminated with deadly bacteria like cholera.) I’m still looking for data on the spread of running water, but chances are good your grandparents did not have an indoor toilet when they were children. (I have folks in my extended family who still have trouble when the water table drops and their well dries up.)

Hunger, famines, disease, death… I could continue enumerating, but my point is simple: the prosperity we enjoy is not only unprecedented in the course of human history, but it hasn’t even existed for one full human lifetime.

Rome was once an empire. In the year one hundred, the Eternal City had over 1,500,000 citizens. By 500, it had fewer than 50,000. It would not recover for over a thousand years.

Everything we have can be wiped away in another human lifetime if we refuse to admit that the future exists.