Liberal Christian denominations (i.e., Mainline Protestants) are caught in a paradox: even though they have increasingly defined themselves as open to everyone, their membership rolls keep shrinking. It’s as if the more people they let in, the fewer people show up.
[insert Groucho Marx cartoon about not wanting to belong to the set of all clubs that would have him.]
Mainline Protestant churches have been hit the hardest. The Evangelical Lutheran Church in America (ELCA) in Minnesota has lost almost 200,000 members since 2000 and about 150 churches. A third of the remaining 1,050 churches have fewer than 50 members. The United Methodist Church, the second largest Protestant denomination in Minnesota, has shuttered 65 churches since 2000.
Catholic membership statewide has held steady, but the number of churches fell from 720 in 2000 to 639 last year, according to official Catholic directories.
Note the timeframe: we’re not talking about change over the course of a century. The Presbyterian Church in Minnesota has lost 42% of its members since 2000.
Meanwhile, membership is basically holding steady at conservative denominations that practically define themselves by whom they don’t let in. Evangelicals and fundamentalists are not hemorrhaging nearly as badly as their more welcoming brethren.
Among Mainline Protestants, the only denomination that’s basically holding steady is the American Baptist Church, which has gained black souls as it has lost white ones.
The African Methodist Episcopal Church has more than doubled in size.
Interestingly, a conservative spin-off of the Presbyterian church is doing fine, and the notorious Southern Baptists are doing fine. [source for denomination data.]
The Amish, who are practically their own ethnic group due to only marrying other Amish, have been nearly doubling their population every 20 years, and that’s even with a significant number of children leaving each generation. Of course, the Amish have plenty of children.
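As a quick sanity check on that figure, “nearly doubling every 20 years” implies a sustained annual growth rate of roughly 3.5%; the arithmetic is only the numbers quoted above:

```python
# A population that doubles every `doubling_time_years` years grows by a
# factor of 2 ** (1 / doubling_time_years) each year.
doubling_time_years = 20
annual_growth = 2 ** (1 / doubling_time_years) - 1

print(f"{annual_growth:.1%} per year")  # about 3.5% per year
```

Sustained over generations, even that modest rate compounds dramatically, which is the point of the comparison with Mainline fertility.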
Of course, one of the biggest factors in the decline of liberal denominations is fertility–the Amish have a lot more kids than Mainline Protestants.
But why have the Mainlines, with their open and tolerant ideologies and welcoming attitude toward nearly everyone, not attracted more members as society in general has moved leftward on many issues? If you have read Dumbing of Age for as long as I have, then you are well aware of the main character Joyce’s rejection, over the issue of homosexuality, of the particular brand of conservative Christianity in which she was raised and homeschooled, and her subsequent search for a more liberal church (which has so far involved freaking out at an Episcopalian service because it smacked of papistry.)
Why are Presbyterians failing to attract the Joyces of the world?
I propose this is because functionally religious identity is about group identity, and a group identity that hinges on “openness to outsiders” is not a functional group identity.
Now you might be saying, “Wait, I thought religious identity had to do with what you think about God, or ethics, or how the world was created. People give some sort of rational thought to their beliefs, and then pick the church that best suits them.”
No. I don’t think anyone ever said, “Hey, the religion where you can’t eat pigs sounds much more rational than the religion where you can’t eat cows.” Nor did anyone logically decide that the religion with animal sacrifice sounded more sensible than the one where the feces of priests are holy, or the one where alien ghosts are causing all of your problems. (Basically, every religion that isn’t whatever you happen to practice is full of totally illogical beliefs.)
This is why conversations between atheists and theists are so boring. Atheists try to explain that religion doesn’t make sense, and theists try to explain that religion is about faith, not logic.
The nation of Pakistan is 96.4% Muslim, and it didn’t get that way because everyone in Pakistan spontaneously decided when they were about 16 years old that they all agreed that Islam was the only true religion. Israel is 74.7% Jewish, not because all of the Jews logically examined all of the world’s religions and then spontaneously agreed that Judaism was the best one. No; most of the world’s Muslims are Muslim because their parents were Muslim. Most of the world’s Jews were born to other Jews. Most Christians were born to Christians, and so on.
Multi-religious states exist, but within those states, people tend to marry within their own religion or abandon religion altogether, for religion is ethnicity.
3,000 years ago, this would have been an unexceptional statement. The People of the Crocodile God worshiped crocodiles and were certain that those folks over there who worshiped the Snake God were up to no good. Note that they didn’t deny the existence of the Snake God; they just didn’t worship it.
Our ancestral memetic environment was very different from our modern one because most people couldn’t travel far and mass media didn’t exist. As a result, people tended to only interact with their own group; outsiders were demonized and war was frequent. To be part of a tribe was to worship the tribe’s totems or ancestral deities. In an uncertain world where wind and rain, life and death were mysteries in the hands of capricious deities, to not worship the tribal gods was akin to saying you did not care whether your brothers lived or died.
Indeed, the big issue Rome had with Christians and Jews was less that they worshiped some strange god with weird food rules and transubstantiation–the empire had a pretty inclusive attitude of adopting new deities as it encountered them–than that Christians and Jews refused to adopt the empire’s deities into their pantheon. More to the point, they refused to sacrifice to the Roman gods, which the Romans believed would bring the wrath of the gods upon them and showed very poor civic spirit. As Tertullian complained in the second century:
They think the Christians the cause of every public disaster, of every affliction with which the people are visited. If the Tiber rises as high as the city walls, if the Nile does not send its waters up over the fields, if the heavens give no rain, if there is an earthquake, if there is famine or pestilence, straightway the cry is, “Away with the Christians to the lions!”
Monotheism of course triumphed over paganism by taking over the empire itself. The conquering of pagans and thus their gods happened on a small scale within Judea, then on a large scale with Rome and Mecca. The big religions now expanded past pure ethnic lines, but still functioned for ordinary people as ethnic identities due to the lack of long-distance travel–Christians, for example, were members of “Christendom,” which stood in contrast with the pagan, barbarian, and non-Christian hordes–places which, of course, the average Christian never saw.
But modern technology has drastically changed our memetic environment. Today you can hop in a car or plane and within hours be hundreds or thousands of miles away–distances your ancestors would have taken months to walk. You can pick up your phone and talk to a friend on the other side of the planet, or read headlines detailing the spread of disease in a foreign country. (I have written extensively about this change in the memes category.)
In the ancestral memetic environment, almost everyone you talked to and got information from was either your immediate family or lived in your community. As a result, memes that promote the survival of you, your family, your community, and your genes tend to dominate. Memes that promote the survival of strangers don’t do as well.
In our modern memetic environment, most of the people you talk to and get information from are strangers. You get movie recommendations from strangers on Rotten Tomatoes; you learn about new business ideas from the reporters at Forbes or Wired or The Wall Street Journal; you get parenting advice from a nanny on TV and medical advice from WebMD. You no longer raise barns or herd goats with your brothers, cousins, and extended family, but work in a cubicle farm with a hundred people who probably aren’t even 5th cousins.
As a result, the modern memetic environment favors the horizontal (rather than vertical, ie from parent to child,) meme transfer. This environment favors the spread of memes that prioritize the interests of strangers, simply because so many of the people you are talking to and interacting with are strangers.
The liberal churches–in particular, the Mainline Protestants–have worked hard to signal openness to others, because this is how horizontal morality works. (The group identity of people who define themselves as open to others is thus defined against “people who aren’t open to others.”) But if religion itself is about group identity, then a group identity of “let’s be open to others and not have a strong group identity” is going to leave people unenthusiastic about attending liberal churches.
Group identity used to be more intuitive for people, again, because they mostly interacted with members of their own group. Modern religious identity for most Christians is no longer explicitly ethnic (not if you want a place in polite society,) so the “outgroup” has switched to gay people, who are such a small percentage of the population (2-3%) that they’re effectively a symbolic issue for most parishioners. Unlike those dastardly followers of the Snake God, homosexuals have never raised their own army, invaded a neighboring tribe’s territory, massacred all of the women and carried off the men.
(This is, in my opinion, a very silly rock to build one’s church on. Certainly churches for the first 1,900 years of Christianity didn’t make this a major, defining point of what makes them different from their competitors. Jesus himself didn’t say a whole lot about gay people.)
And getting back to fertility, people with stronger group identities–such as people whose religions tell them it is good to have a group identity that excludes those [evil outgroup people]–tend to have more children, who are the literal future of the church.
Summary version: Religion is about group identity, but the modern memetic environment, ie liberalism, is anti-group identity. Churches that try to set themselves up in opposition to group identity therefore fail. But since ethnic identity is no longer in fashion, conservative religious groups now define themselves in opposition to homosexuals, a somewhat symbolic opposition considering that homosexuals have never constituted a military threat to anyone’s ethnic group.
I am on vacation, and so have only been able to take notes on the posts I want to write for the past week. Here is the outline I jotted down in the car:
When Capitalism Devours Democracy
Ken Starr, Mueller, the media, and endless for-profit, anti-nation investigations into the president. (Actually, Tom Nichols’s discussion of the evolution of talk radio and cable news and their deleterious effects on political discourse is one of the better parts of his book, The Death of Expertise.)
The overly complex legal code + endless investigation + the media + advertising dollars => undermining government function.
Watergate, White Water, Monica, Russiagate, etc.
Can you imagine the national reaction if someone tried to investigate George Washington the same way? It would have been seen not as “anti-George Washington,” but as fundamentally anti-American, an attempt to subvert democracy itself and interfere with the proper functioning of the nation.
Note the complexity of the modern legal, economic, and tax systems, which simultaneously make it very hard for anyone doing much of anything to comply with every single law (have you ever jaywalked? Accidentally miscounted a deduction on your taxes?) and ensure that, with enough searching, if you want to pin something bad on someone, you probably can.
Even though you believe in your heart that you have done nothing wrong, you have no idea whether you might be admitting that you did something that is against the law. There are tens of thousands of criminal statutes on the books in America today. Most of them you have never heard of, and many of them involve conduct that nobody would imagine could ever be a crime.
(Unless you’ve been pulled over for speeding. Then obviously you pull out your driver’s license and talk like a normal human.)
In short, the media discovered, with Nixon and Watergate (at least within the past century or so,) that constant presidential scandals could be good for ratings, and certain folks in the government discovered with Bill Clinton and Monica Lewinsky that if you go digging for long enough, eventually you can find some kind of dirt to pin on someone–even if it’s completely irrelevant, idiotic dirt that has nothing to do with the president’s ability to govern.
This creates the incentive for the Media to constantly push the drumbeat narrative of “presidential scandal!” which leads to people truly believing that there is much more scandal than there really is.
Theory: Monica, Benghazi, Russiagate, and maybe even Watergate were all basically trumped-up hogwash played for ratings dollars. (Well, clearly someone broke into the Watergate hotel.)
The sheer complexity of the modern legal system, which allows this to happen, also incentivizes each party to push for constant investigations of the other party’s presidents. In essence, both sides are moving toward mutual defect-defect, with the media egging them on.
And We the People are the suckers.
I feel like there are concepts here for which we need better words.
Make no mistake: Nichols is annoyingly arrogant. He draws a rather stark line between “experts” (who know things) and everyone else (who should humbly limit themselves to voting between options defined for them by the experts.) He implores people to better educate themselves in order to be better voters, but has little patience for autodidacts and bloggers like myself who are actually trying.
But arrogance alone doesn’t make someone wrong.
Nichols’s first thesis is simple: most people are too stupid or ignorant to second-guess experts or even contribute meaningfully to modern policy discussions. How can people who can’t find Ukraine on a map or think we should bomb the fictional city of Agrabah contribute in any meaningful way to a discussion of international policy?
It was one thing, in 1776, to think the average American could vote meaningfully on the issues of the day–a right they took by force, by shooting anyone who told them they couldn’t. Life was less complicated in 1776, and the average person could master most of the skills they needed to survive (indeed, pioneers on the edge of the frontier had to be mostly self-sufficient in order to survive.) Life was hard–most people engaged in long hours of heavy labor plowing fields, chopping wood, harvesting crops, and hauling necessities–but could be mastered by people who hadn’t graduated from elementary school.
But the modern industrial (or post-industrial) world is much more complicated than the one our ancestors grew up in. Today we have cars (maybe even self-driving cars), electrical grids and sewer systems, atomic bombs and fast food. The speed of communication and transportation has made it possible to chat with people on the other side of the earth and show up on their doorstep a day later. The amount of specialized, technical knowledge necessary to keep modern society running would astonish the average caveman–even with 15+ years of schooling, the average person can no longer build a house, nor even produce basic necessities like clothes or food. Most of us can’t even make a pencil.
Even experts who are actually knowledgeable about their particular area may be completely ignorant of fields outside of their expertise. Nichols speaks Russian, which makes him an expert in certain Russian-related matters, but he probably knows nothing about optimal high-speed rail networks. And herein lies the problem:
The American attachment to intellectual self-reliance described by Tocqueville survived for nearly a century before falling under a series of assaults from both within and without. Technology, universal secondary education, the proliferation of specialized expertise, and the emergence of the United States as a global power in the mid-twentieth century all undermined the idea… that the average American was adequately equipped either for the challenges of daily life or for running the affairs of a large country.
… the political scientist Richard Hofstadter wrote that “the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and competently perform for himself.”
… Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.”
In other words, society is now too complex and people too stupid for democracy.
Nichols’s second thesis is that people used to trust experts, which let democracy function, but today they are less trusting. He offers no evidence other than his general conviction that this change has happened.
He does, however, detail the ways he thinks that 1. people have been given inflated egos about their own intelligence, and 2. our information-delivery system has degenerated into misinformational goo, resulting in the trust problems he believes we are having. These are interesting arguments and worth examining.
A bit of summary:
Indeed, maybe the death of expertise is a sign of progress. Educated professionals, after all, no longer have a stranglehold on knowledge. The secrets of life are no longer hidden in giant marble mausoleums… in the past, there was less tension between experts and laypeople, but only because citizens were simply unable to challenge experts in any substantive way. …
Participation in political, intellectual, and scientific life until the early twentieth century was far more circumscribed, with debates about science, philosophy, and public policy all conducted by a small circle of educated males with pen and ink. Those were not exactly the Good Old Days, and they weren’t that long ago. The time when most people didn’t finish high school, when very few went to college, and only a tiny fraction of the population entered professions is still within living memory of many Americans.
Aside from Nichols’s insistence that he believes modern American notions about gender and racial equality, I get the impression that he wouldn’t mind the Good Old Days of genteel pen-and-ink discussions between intellectuals. However, I question his claim that participation in political life was far more circumscribed–after all, people voted, and politicians liked getting people to vote for them. People everywhere, even illiterate peasants on the frontier or up in the mountains, like to gather and debate about God, politics, and the meaning of life. The question is less “Did they discuss it?” and more “Did their discussions have any effect on politics?” Certainly we can point to abolition, women’s suffrage, prohibition, and the Revolution itself as heavily grass-roots movements.
But continuing with Nichols’s argument:
Social changes only in the past half century finally broke down old barriers of race, class, and sex not only between Americans in general but also between uneducated citizens and elite experts in particular. A wider circle of debate meant more knowledge but also more social friction. Universal education, the greater empowerment of women and minorities, the growth of a middle class, and increased social mobility all threw a minority of experts and a majority of citizens into direct contact, after nearly two centuries in which they rarely had to interact with each other.
And yet the result has not been a greater respect for knowledge, but the growth of an irrational conviction among Americans that everyone is as smart as everyone else.
Nichols is distracting himself with the reflexive racial argument; the important change he is highlighting isn’t social but technical.
I’d like to quote a short exchange from Our Southern Highlanders, an anthropologic-style text written about Appalachia about a century ago:
The mountain clergy, as a general rule, are hostile to “book larnin’,” for “there ain’t no Holy Ghost in it.” One of them who had spent three months at a theological school told President Frost, “Yes, the seminary is a good place ter go and git rested up, but ’tain’t worth while fer me ter go thar no more ’s long as I’ve got good wind.”
It used to amuse me to explain how I knew that the earth was a sphere; but one day, when I was busy, a tiresome old preacher put the everlasting question to me: “Do you believe the earth is round?” An impish perversity seized me and I answered, “No—all blamed humbug!” “Amen!” cried my delighted catechist, “I knowed in reason you had more sense.”
But back to Nichols, who really likes the concept of expertise:
One reason claims of expertise grate on people in a democracy is that specialization is necessarily exclusive. When we study a certain area of knowledge or spend our lives in a particular occupation, we not only forgo expertise in other jobs or subjects, but also trust that other people in the community know what they’re doing in their area as surely as we do in our own. As much as we might want to go up to the cockpit after the engine flames out to give the pilots some helpful tips, we assume–in part, because we have to–that they’re better able to cope with the problem than we are. Otherwise, our highly evolved society breaks down into islands of incoherence, where we spend our time in poorly informed second-guessing instead of trusting each other.
This would be a good point to look at data on overall trust levels, friendship, civic engagement, etc (It’s down. It’s all down.) and maybe some explanations for these changes.
Nichols talks briefly about the accreditation and verification process for producing “experts,” which he rather likes. There is an interesting discussion in the economics literature on things like the economics of trust and information (how do websites signal that they are trustworthy enough that you will give them your credit card number and expect to receive items you ordered a few days later?) which could apply here, too.
Nichols then explores a variety of cognitive biases, such as superstitions, phobias, and conspiracy theories:
Conspiracy theories are also a way for people to give meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty either of an uncaring universe or an incomprehensible deity. …
The only way out of this dilemma is to imagine a world in which our troubles are the fault of powerful people who had it within their power to avert such misery. …
Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them… become especially seductive “in any society that has suffered an epic, collectively felt trauma. In the aftermath, millions of people find themselves casting about for an answer to the ancient question of why bad things happen to good people.” …
Today, conspiracy theories are reactions mostly to the economic and social dislocations of globalization…This is not a trivial obstacle when it comes to the problems of expert engagement with the public: nearly 30 percent of Americans, for example, think “a secretive elite with a globalist agenda is conspiring to eventually rule the world” …
Obviously stupid. A not-secret elite with a globalist agenda already rules the world.
and 15 percent think media or government add secret mind-controlling technology to TV broadcasts. (Another 15 percent aren’t sure about the TV issue.)
It’s called “advertising” and it wants you to buy a Ford.
Anyway, the problem with conspiracy theories is they are unfalsifiable; no amount of evidence will ever convince a conspiracy theorist that he is wrong, for all evidence is just further proof of how nefariously “they” are constructing the conspiracy.
Then Nichols gets into some interesting matter on the difference between stereotypes and generalizations, which segues nicely into a tangent I’d like to discuss, but it probably deserves its own post. To summarize:
Sometimes experts know things that contradict other people’s political (or religious) beliefs… If an “expert” finding or field accords with established liberal values–e.g., the implicit association test found that “everyone is a little bit racist,” which liberals already believed–then there is an easy mesh between what the academics believe and the rest of their social class.
If their findings contradict conservative/low-class values–e.g., when professors assert that evolution is true and “those low-class Bible-thumpers in Oklahoma are wrong”–sure, they might have a lot of people who disagree with them, but those people aren’t part of their own social class/the upper class, and so not a problem. If anything, high-class folks love such findings, because it gives them a chance to talk about how much better they are than those low-class people (though such class conflict is obviously poisonous in a democracy where those low-class people can still vote to Fuck You and Your Global Warming, Too.)
But if the findings contradict high-class/liberal politics, then the experts have a real problem. E.g., if that same evolution professor turns around and says, “By the way, race is definitely biologically real, and there are statistical differences in average IQ between the races,” now he’s contradicting the political values of his own class/the upper class, and that becomes a social issue and he is likely to get Watsoned.
Jordan Peterson isn’t unpopular or “silenced” so much as he is disliked by upper class folks and liked by “losers” and low class folks, despite the fact that he is basically an intellectual guy and isn’t peddling a low-class product. Likewise, Fox News is just as much part of The Media as NPR, (if anything, it’s much more of the Media) but NPR is higher class than Fox, and Fox doesn’t like feeling like its opinions are being judged along this class axis.
For better or for worse (mostly worse) class politics and political/religious beliefs strongly affect our opinions of “experts,” especially those who say things we disagree with.
But back to Nichols: Dunning-Kruger effect, fake cultural literacy, and too many people at college. Nichols is a professor and has seen college students up close and personal, and has a low opinion of most of them. The massive expansion of higher education has not resulted in a better-educated, smarter populace, he argues, but a populace armed with expensive certificates that show they sat around a college for 4 years without learning much of anything. Unfortunately, beyond a certain level, there isn’t a lot that more school can do to increase people’s basic aptitudes.
Colleges get money by attracting students, which incentivizes them to hand out degrees like candy–in other words, students are being lied to about their abilities, and college degrees are fast becoming participation trophies for the not very bright.
Nichols has little sympathy for modern students:
Today, by contrast, students explode over imagined slights that are not even remotely in the same category as fighting for civil rights or being sent to war. Students now build majestic Everests from the smallest molehills, and they descend into hysteria over pranks and hoaxes. In the midst of it all, the students are learning that emotions and volume can always defeat reason and substance, thus building about themselves fortresses that no future teacher, expert, or intellectual will ever be able to breach.
At Yale in 2015, for example, a house master’s wife had the temerity to tell minority students to ignore Halloween costumes they thought offensive. This provoked a campus-wide temper tantrum that included professors being shouted down by screaming students. “In your position as master,” one student howled in a professor’s face, “it is your job to create a place of comfort and home for the students… Do you understand that?!”
Quietly, the professor said, “No, I don’t agree with that,” and the student unloaded on him:
“Then why the [expletive] did you accept the position?! Who the [expletive] hired you?! You should step down! If that is what you think about being a master you should step down! It is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home here. You are not doing that!” [emphasis added]
Yale, instead of disciplining students in violation of their own norms of academic discourse, apologized to the tantrum throwers. The house master eventually resigned from his residential post…
To faculty everywhere, the lesson was obvious: the campus of a top university is not a place for intellectual exploration. It is a luxury home, rented for four to six years, nine months at a time, by children of the elite who may shout at faculty as if they’re berating clumsy maids in a colonial mansion.
The incident Nichols cites (and similar ones elsewhere,) are not just matters of college students being dumb or entitled, but explicitly racial conflicts. The demand for “safe spaces” is easy to ridicule on the grounds that students are emotional babies, but this misses the point: students are carving out territory for themselves on explicitly racial lines, often by violence.
Nichols, though, either does not notice the racial aspect of modern campus conflicts or does not want to admit publicly to doing so.
Nichols moves on to blame TV, especially CNN, talk radio, and the internet for dumbing down the quality of discourse by overwhelming us with a deluge of more information than we can possibly process.
Referring back to Auerswald and The Code Economy, if automation creates a bifurcation in industries, replacing a moderately-priced, moderately available product with a stream of cheap, low-quality product on the one hand and a trickle of expensive, high-quality products on the other, good-quality journalism has been replaced with a flood of low-quality crap. The high-quality end is still working itself out.
Accessing the Internet can actually make people dumber than if they had never engaged a subject at all. The very act of searching for information makes people think they’ve learned something, when in fact they’re more likely to be immersed in yet more data they do not understand. …
When a group of experimental psychologists at Yale investigated how people use the internet, they found that “people who search for information on the Web emerge from the process with an inflated sense of how much they know–even regarding topics that are unrelated to the ones they Googled.” …
How can exposure to so much information fail to produce at least some kind of increased baseline of knowledge, if only by electronic osmosis? How can people read so much yet retain so little? The answer is simple: few people are actually reading what they find.
As a University College London (UCL) study found, people don’t actually read the articles they encounter during a search on the Internet. Instead, they glance at the top line or the first few sentences and then move on. Internet users, the researchers noted, “are not reading online in the traditional sense; indeed, there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
The internet’s demand for instant updates and for whatever headlines generate the most clicks (and thus advertising revenue) has upset the balance of speed vs. expertise in the newsroom. Reporters no longer have any incentive to spend long hours carefully writing a well-researched story when such stories pay less than clickbait headlines about racist pet costumes and celebrity tweets.
I realize it seems churlish to complain about the feast of news and information brought to us by the Information Age, but I’m going to complain anyway. Changes in journalism, like the increased access to the Internet and to college education, have unexpectedly corrosive effects on the relationship between laypeople and experts. Instead of making people better informed, much of what passes for news in the twenty-first century often leaves laypeople–and sometimes experts–even more confused and ornery.
Experts face a vexing challenge: there’s more news available, and yet people seem less informed, a trend that goes back at least a quarter century. Paradoxically, it is a problem that is worsening rather than dissipating. …
As long ago as 1990, for example, a study conducted by the Pew Trust warned that disengagement from important public questions was actually worse among people under thirty, the group that should have been most receptive to then-emerging sources of information like cable television and electronic media. This was a distinct change in American civic culture, as the Pew study noted:
“Over most of the past five decades younger members of the public have been at least as well informed as older people. In 1990, that is no longer the case. … “
Those respondents are now themselves middle-aged, and their children are faring no better.
If you were 30 in 1990, you were born in 1960, to parents who were between the ages of 20 and 40 years old, that is, born between 1920 and 1940.
Fertility for the 1920-1940 cohort was strongly dysgenic. So was the 1940-50 cohort. The 1900-1919 cohort at least had the Flynn Effect on their side, but later cohorts just look like an advertisement for idiocracy.
Nichols ends with a plea that voters respect experts (and that experts, in turn, be humble and polite to voters.) After all, modern society is too complicated for any of us to be experts on everything. If we don’t pay attention to expert advice, he warns, modern society is bound to end in ignorant goo.
The logical inconsistency is that Nichols believes in democracy at all–he thinks democracy can be saved if ignorant people vote within a range of options as defined by experts like himself, eg, “What vaccine options are best?” rather than “Should we have vaccines at all?”
The problem, then, is that whoever controls the experts (or controls which expert opinions people hear) controls the limits of policy debates. This leads to people arguing over experts, which leads right back where we are today. As long as there are politics, “expertise” will be politicized, eg:
Look at any court case in which both sides bring in their own “expert” witnesses. Both experts testify to the effect that their side is correct. Then the jury is left to vote on which side had more believable experts. This is voting under something like the best-case scenario, and the fact that the voters are dumb, don’t understand what the experts are saying, and in many cases are obviously being misled is still a huge problem.
If politics is the problem, then perhaps getting rid of politics is the solution. Just have a bunch of Singapores run by Lee Kuan Yews, let folks like Nichols advise them, and let the common people “vote with their feet” by moving to the best states.
The problem with this solution is that “exit” doesn’t exist in the modern world in any meaningful way, and there are significant reasons why ordinary people oppose open borders.
Conclusion: 3/5 stars. It’s not a terrible book, and Nichols has plenty of good points, but “Americans are dumb” isn’t exactly fresh territory and much has already been written on the subject.
People think memetic viruses are just going to ask politely about infecting you, like the Jehovah’s Witnesses: “Hello, can I talk to you today about the importance of WWIII with Russia?”
No. Mind-viruses are not polite. They USE you. They use your empathy and compassion to make you feel like a shit person for rejecting them. They throw dying children in your face and demand that you start a war to save them.
They hijack your sense of yourself as a good person.
I call this the empathy trap.
Why did this take Stone Cold’s breath away? Why is it shocking?
It’s a basically true statement– the 3/5ths compromise originated in 1783 and was still around in 1789, when the 2nd Amendment was proposed–but so are “California became the 31st American state when I was deemed 3/5ths of a person,” “Napoleon invaded Russia when I was deemed 3/5ths of a person” and “The New York Times was founded, the safety elevator was invented, Massachusetts passed the nation’s first child employment laws, the first telegrams were sent, and Jane Eyre was published when I was deemed 3/5ths of a person.”
A lot happened between 1783 and 1861.
As unpleasant as the 3/5ths compromise is to think back on, we should remember that it was not passed because proponents thought black people only counted as “3/5ths of a person,” but because they didn’t want slave owners using census counts of non-voting slaves to get more votes for their states in the federal government. The 3/5ths compromise actually reduced the power of the slave-owning states relative to the non-slave owning states, in exchange for a break on taxes.
So this isn’t shocking because it’s factually true (I can come up with a whole list of equally true but unshocking statements) nor because the 3/5ths compromise was evil.
Perhaps it is shocking because it points out how old the 2nd Amendment is? But there are many other equally old–or older–things we find completely mundane. Mozart was writing operas in the 1790s; US copyright law began in the 1790s; Edward Jenner developed his smallpox vaccine in 1796; Benjamin Franklin invented the “swim fin” or flippers back in 1717. I don’t think anyone’s throwing out their flippers just because the concept is older than the entire country.
No; it’s shocking because “I was deemed 3/5ths of a person” appeals immediately to your sense of empathy.
Do you respond, “That doesn’t matter”?
“What do you mean, it doesn’t matter that I was considered only 3/5ths of a person? That matters a lot to me.”
“Oh, no, of course, I didn’t mean that it doesn’t matter like that, of course I understand that matters to you–”
Now you’re totally off-topic.
In order to see that this is a non sequitur, you first have to step back from the emotion. Push it aside, if you must. Yes, slavery was evil, but what does it have to do with the 2nd Amendment? Nothing. Reject the frame.
Mitochondrial memes are passed down from your parents and other trusted members of your family and community. You don’t typically have to be convinced of them; children tend to just believe their parents. That’s why you believed all of that business about Santa Claus. Meme viruses, by contrast, come from the wider community, typically strangers. Meme viruses have to convince you to adopt them, which can be quite a bit harder. This is why so many people follow their parents’ religion, and so few people convert to new religions as adults. Most religious transmission is basically mitochondrial–even if the Jehovah’s Witnesses show up at your doorstep fairly often.
To spread faster and more effectively, therefore, meme viruses have to convince you to lower your defenses and let them spread. They convince you that believing and spreading them is part of being a good person. They demand that if you really care about issue X, then you must also care about issues W, Y, and Z. “If you want to fight racism, you also have to go vegan, because all systems of oppression are intersectionally linked,” argues the vegan. “If you love Jesus, you must support capitalism because those godless commies hate Jesus.” Jesus probably also supported socialism and veganism, depending on whom you ask. “This photo of Kim Kardashian balancing a wine glass on her ass is problematic because once someone took a picture of a black woman in the same pose and that was racist.” “Al Qaeda launched an attack on 9-11, therefore we need to topple Saddam Hussein.” “A Serbian anarchist shot some Austro-Hungarian archduke, therefore we need to have WWI.” “Assad used chemical weapons, therefore the US needs to go to war with Russia.”
Once you are sensitive to this method of framing, you’ll notice it fairly often.
There is a commonly-believed strategic model of terrorism which we could describe as follows: terrorists are people who are ideologically motivated to pursue specific unvarying political goals; to do so, they join together in long-lasting organizations and after the failure of ordinary political tactics, rationally decide to efficiently & competently engage in violent attacks on (usually) civilian targets to get as much attention as possible and publicity for their movement, and inspire fear & terror in the civilian population, which will pressure its leaders to solve the problem one way or another, providing support for the terrorists’ favored laws and/or their negotiations with involved governments, which then often succeed in gaining many of the original goals, and the organization dissolves.
Unfortunately, this model is, in almost every respect, empirically false.
It’s a great essay, so go read the whole thing before we continue. Don’t worry; I’ll wait.
Now, since I know half of you didn’t actually read the essay, I’ll summarize: terrorists are really bad at accomplishing their “objectives.” By any measure, they are really bad at it. Simply doing nothing would, in most cases, further their political goals more effectively.
This is in part because terrorists tend not to conquer and hold land, and in part because terrorism tends to piss off its targets, making them less likely to give in to the terrorists’ demands. Consider 9-11: sure, the buildings fell down, but did it result in America conceding to any of Al-Qaeda’s demands?
The article quotes Abrahms 2012:
Jones and Libicki (2008) then examined a larger sample, the universe of known terrorist groups between 1968 and 2006. Of the 648 groups identified in the RAND-MIPT Terrorism Incident database, only 4% obtained their strategic demands. … Chenoweth and Stephan (2008, 2011) provide additional empirical evidence that meting out pain hurts non-state actors at the bargaining table. … These statistical findings are reinforced with structured in-case comparisons highlighting that escalating from nonviolent methods of protest such as petitions, sit-ins, and strikes to deadly attacks tends to dissuade government compromise. … Other statistical research (Abrahms, 2012; Fortna, 2011) demonstrates that when terrorist attacks are combined with such discriminate violence, the bargaining outcome is not additive; on the contrary, the pain to the population significantly decreases the odds of government concessions.
(Aside: Remember, right-wing violence doesn’t work. It’s stupid and you will fail at accomplishing anything.)
Another “mystery” about terrorism is that it actually doesn’t happen very often. It’s not that hard to drive a truck into a crowd or attack people with a machete. Armies are expensive; coughing on grocery store produce is cheap.
If terrorism is 1. ineffective and 2. not even used that often, why do terrorist groups exist at all?
Terrorists might just be dumb, stupid people who try to deal with their problems by blowing them up, but there’s no evidence to this effect–terrorists are not less intelligent than the average person in their societies, anyway. People who are merely dumb and violent tend to get into fights with their neighbors, not take airplanes hostage.
Gwern suggests a different possibility: People join terrorist organizations because they want to be friends with the other terrorists. They’re like social clubs, but instead of bowling, you talk about how going on jihad would be totally awesome.
Things people crave: Meaning. Community. Brotherhood.
Terrorist organizations provide these to their members, most of whom don’t actually blow themselves up.
Friendships cultivated in the jihad, just as those forged in combat in general, seem more intense and are endowed with special significance. Their actions taken on behalf of God and the umma are experienced as sacred. This added element increases the value of friendships within the clique and the jihad in general and diminishes the value of outside friendships.
Enough about terrorists; let’s talk about Americans:
“Jihad” is currently part of the Islamic cultural script–that is, sometimes Muslims see some form of “jihad” as morally acceptable. (They are not unique in committing terrorism, though–Marxist terrorists have created trouble throughout Latin America, for instance, and the Tamil Tigers of Sri Lanka were one of the world’s deadliest groups.)
Thankfully, though, few major groups in the US see jihad or terrorist violence as acceptable, but… we have our exceptions.
For example, after a Jewish professor, Bret Weinstein, declined to stay home on a “Day of Absence” intended to force whites away from Evergreen State College, WA, violent protests erupted. Bands of students armed with bats and tasers roamed the campus, searching for Weinstein; the poor professor was forced to flee and eventually resign.
During a Berkeley protest on August 27, 2017, an estimated one hundred antifa protesters joined a crowd of 2,000–4,000 counter-protesters to attack a reported “handful” of alt-right demonstrators and Trump supporters who showed up for a “Say No to Marxism” rally that had been cancelled by organizers due to security concerns. Some antifa activists beat and kicked unarmed demonstrators and threatened to smash the cameras of anyone who filmed them.
Antifa, like terrorist groups, typically attracts folks who are single and have recently left home–young people who have just lost the community they were raised in and are in search of a new one.
The article recounts an amusing incident when a terrorist organization wanted to disband a cell, but struggled to convince its members to abandon their commitment to sacrificing themselves on behalf of jihad. Finally they hit upon a solution: they organized social get-togethers with women, then incentivized the men to get married, get jobs, and have babies. Soon all of the men were settled and raising children, too busy and invested in their new families to risk sacrificing it all for jihad. The cell dissolved.
Even Boko Haram was founded in response to the difficulties young men in Nigeria face in affording brides:
Our recent study found that marriage markets and inflationary brideprice are a powerful driver of participation in violence and drive recruitment into armed groups. Armed groups often arrange low-cost marriages for their members, help members afford brideprice, or provide extra-legal opportunities to acquire the capital necessary to take a wife. In Nigeria, in the years in which Boko Haram gained influence under founder Mohammed Yusuf, “items required for [a] successful [marriage] celebration kept changing in tune with inflation over the years.” A resident of the Railroad neighborhood of Maiduguri, where Yusuf established his mosque, recalled that in just a few years, Yusuf had facilitated more than 500 weddings. The group also provided support for young men to become “okada drivers,” who gained popularity for their affordable motorbike taxi services and who often used their profits to afford marriage. Thus, Boko Haram’s early recruits were often attracted by the group’s facilitation of marriage. Even in the aftermath of Yusuf’s assassination by the Nigerian state and the rise of Abubakar Shekau, the group has continued to exploit obstacles to marriage to attract supporters. The women and girls that are abducted by the group, estimated to number more than 6,000, are frequently married off to members of the group.
Antifa of course aren’t the only people in the US who commit violence; the interesting fact here is their organization. As far as I know, Dylann Roof killed more people than Antifa has, but Roof acted alone.
I suggest, therefore, that the principal thing driving Antifa (and similar organizations) isn’t a rational pursuit of their stated objectives (did driving Milo out of Berkeley actually protect any illegal immigrants from deportation?) but the same social factors that drive Muslims to join terrorist groups: camaraderie, brotherhood, and the feeling that they are leading meaningful, moral lives by sacrificing themselves for their chosen cause.
Right-wingers do this, too (the military is an obvious source of “meaning” and “brotherhood” in many people’s lives).
And the pool of unmarried people to recruit into extremist organizations is only growing in America.
But we don’t have to look to organizations that commit violence to find this pattern. Why change one’s avatar to a rainbow pattern to celebrate gay marriage, or overlay a French flag after the 2015 Paris attacks?
Why spend hours “fighting racism” by “deconstructing whiteness” online when you could do far more to help black people by handing out sandwiches at your local homeless shelter? (The homeless would also appreciate a hot lasagna.) What percentage of people who protest Islamophobia have actually bothered to befriend some Muslims and express support toward them?
The obvious answer is that these activities enhance the actor’s social standing among their friends and online compatriots. Congratulations received for turning your profile picture different colors: objective achieved. Actions that would actually help the targeted group require more effort and return less adulation, since they have to be done in real life.
Liberal groups seem to be better at social organizing–thus I’ve had an easier time coming up with liberal examples of this phenomenon. Conservative political organizations, at least in the US, seem to be smaller and offer less in the way of social benefits (this may be in part because conservatives are more likely to be married, employed, and have children, and because conservatives are more likely to channel such energies into their churches,) but they also do their share of social signaling that doesn’t achieve its claimed goal. “White pride” organizations, for example, generally do little to improve whites’ public image.
But is this an aberration? Or are things operating as designed? What’s the point of friendship and social standing in the first place?
Interestingly, in Jane Goodall’s account of chimps in the Gombe, we see parallels to the origins of human social structures and friendships. Only male chimps consistently have what we would call “friendships;” females instead tend to live in groups with their children. Male friends benefit from each other’s assistance in hunting and controlling access to other food, like the coveted bananas. A single strong male may dominate a troop of chimps, but a coalition can bring him to a bloody end. Persistent dominance of a chimp troop (and thus dominance of food) is thus easier for males who have a strong coalition on their side–that is, friends.
From these things therefore it is clear that the city-state is a natural growth, and that man is by nature a political animal, and a man that is by nature and not merely by fortune citiless is either low in the scale of humanity or above it … inasmuch as he is solitary, like an isolated piece at draughts.
And why man is a political animal in a greater measure than any bee or any gregarious animal is clear. For nature, as we declare, does nothing without purpose; and man alone of the animals possesses speech. … speech is designed to indicate the advantageous and the harmful, and therefore also the right and the wrong; for it is the special property of man in distinction from the other animals that he alone has perception of good and bad and right and wrong and the other moral qualities, and it is partnership in these things that makes a household and a city-state.
Most people desire to be members in good standing in their communities:
Thus also the city-state is prior in nature to the household and to each of us individually.  For the whole must necessarily be prior to the part; since when the whole body is destroyed, foot or hand will not exist except in an equivocal sense… the state is also prior by nature to the individual; for if each individual when separate is not self-sufficient, he must be related to the whole state as other parts are to their whole, while a man who is incapable of entering into partnership, or who is so self-sufficing that he has no need to do so, is no part of a state, so that he must be either a lower animal or a god.
Therefore the impulse to form a partnership of this kind is present in all men by nature… –Aristotle, Politics, Book 1
The spread of the internet has changed both who we’re talking to (the people in our communities) and how we engage with them, resulting in, I hypothesize, a memetic environment that increasingly favors horizontally (rather than vertically) transmitted memes. (If you are not familiar with this theory, I wrote about it here, here, here, here, here, here, here, and here.) Vertically spread memes tend to come from your parents and are survival-oriented; horizontal memes come from your friends and are social. A change in the memetic environment, therefore, has the potential to change the landscape of social, moral, and political ideas people frequently encounter–and has allowed us to engage in nearly costless, endless social signaling.
The result of that, it appears, is political polarization:
According to Pew:
A decade ago, the public was less ideologically consistent than it is today. In 2004, only about one-in-ten Americans were uniformly liberal or conservative across most values. Today, the share who are ideologically consistent has doubled: 21% express either consistently liberal or conservative opinions across a range of issues – the size and scope of government, the environment, foreign policy and many others.
The new survey finds that as ideological consistency has become more common, it has become increasingly aligned with partisanship. Looking at 10 political values questions tracked since 1994, more Democrats now give uniformly liberal responses, and more Republicans give uniformly conservative responses than at any point in the last 20 years.
This, of course, makes it harder for people to find common ground for compromises.
So if we want a saner, less histrionic political culture, the first step may be encouraging people to settle down, get married, and have children, then work on building communities that let people feel a sense of meaning in their real lives.
Still, I think letting your friends convince you that blowing yourself up is a good idea is pretty dumb.
Note: “Memes” on this blog is used as it is in the field of memetics, representing units of ideas that are passed from person to person, not in the sense of “funny cat pictures on the internet.”
“Mitochondrial memes” are memes that are passed vertically from parent to child, like “it’s important to eat your dinner before dessert” or “brush your teeth twice a day or your teeth will rot out.”
“Meme viruses” (I try to avoid the confusing phrase, “viral memes,”) are memes that are transmitted horizontally through society, like chain letters and TV news.
I’ve spent a fair amount of time warning about some of the potential negative results of meme viruses, but today I’d like to discuss one of their greatest strengths: you can transmit them to other people without using them yourself.
Let’s start with genetics. It is very easy to quickly evolve in a particular direction if a variety of relevant traits already exist in a population. For example, humans already vary in height, so if you wanted to, say, make everyone on Earth shorter, you would just have to stop all of the tall people from reproducing. The short people would create the next generation, and it would be short.
But getting adult human height below three feet requires not just existing, normal human height variation, but exploiting random mutations. These are rare, and the people who have them normally incur huge reductions in fitness, as they often have problems with bone growth, intelligence, and giving birth.
Most random mutations simply result in an organism’s death. Very few are useful, and those that are have to beat out all of the other local genetic combinations to actually stick around.
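The difference between selecting on existing variation and waiting on new mutations can be sketched with a toy simulation (all numbers here are invented purely for illustration):

```python
import random

random.seed(0)

def next_generation(pop):
    """Truncation selection: only individuals shorter than the current
    mean reproduce; each offspring inherits a random parent's height
    plus a little noise."""
    cutoff = sum(pop) / len(pop)
    parents = [h for h in pop if h < cutoff]
    return [random.choice(parents) + random.gauss(0, 1) for _ in pop]

# A population with normal height variation (mean ~170 cm, sd ~7 cm).
pop = [random.gauss(170, 7) for _ in range(10_000)]
start_mean = sum(pop) / len(pop)

# Five generations in which no above-average-height person reproduces.
for _ in range(5):
    pop = next_generation(pop)

end_mean = sum(pop) / len(pop)
print(round(start_mean, 1), "->", round(end_mean, 1))
```

Selection on standing variation drops the mean by several centimeters in just a few generations; pushing far below the existing range (say, to three feet) is something this model has no way to do, because it can only shuffle variation that is already present.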
Suppose you happen to be born with a very lucky genetic trait: a rare mutation that lets you survive more easily in an arctic environment.
But you were born in Sudan.
Your genetic trait could be really useful if you could somehow give it away to someone in Siberia, but no, you are stuck in Sudan and you are really hot all of the time and then you die of heatstroke.
With the evolution of complex thought, humans (nearly alone among animals) developed the ability to go beyond mere genetic abilities, instincts, and impulses, and impart stores of knowledge to the next generation. Humanity has been accumulating mitochondrial memes for millions of years, ever since the first human showed another human how to wield fire and create stone tools. (Note: the use of fire and stone tools predates the emergence of Homo sapiens by a long while, but not the Homo genus.)
But mitochondrial memes, to get passed on, need to offer some immediate benefit to their holders. Humans are smart enough–and the utility of information unpredictable enough–that we can hold some not obviously useful or absurd ideas, but the bulk of our efforts have to go toward information that helps us survive.
(By definition, mitochondrial memes aren’t written down; they have to be remembered.)
If an idea doesn’t offer some benefit to its holder, it is likely to be quickly forgotten–even if it could be very useful to someone else.
Suppose one day you happen to have a brilliant new idea for how to keep warm in a very cold environment–but you live in Sudan. If you can’t tell your idea to anyone who lives somewhere cold, your idea will never be useful. It will die with you.
But introduce writing, and ideas of no use to their holder can be recorded and transmitted to people who can use them. For example, in 1502, Leonardo da Vinci designed a 720-foot (220 m) bridge for the Ottoman Sultan Beyazid II of Constantinople. The sultan never built Leonardo’s bridge, but in 2001, a bridge based on his design was finally built in Norway. Leonardo’s ideas for flying machines, while also not immediately useful, inspired generations of future engineers.
Viral memes don’t have to be immediately useful to stick around. They can be written down, tucked into a book, and picked up again a hundred years later and a thousand miles away by someone who can use them. A person living in Sudan can invent a better way to stay warm, write it down, and send it to someone in Siberia–and someone in Siberia can invent a better way to stay cool, write it down, and send it back.
Many modern scientific and technological advances are based on the contributions of not one or two or ten inventors, but thousands, each contributing their unpredictable part to the overall whole. Electricity, for example, was a mere curiosity when Thales of Miletus wrote about effects of rubbing amber to produce static electricity (the word “electricity” is actually derived from the Greek for “amber”;) between 1600 and 1800, scientists began studying electricity in a more systematic way, but it still wasn’t useful. It was only with the invention of the telegraph from many different electrical parts and systems, (first working model, 1816; first telegram sent in the US, 1838;) that electricity became useful. With the invention of electric lights and the electrical grids necessary to power them (1870s and 80s,) electricity moved into people’s homes.
The advent of meme viruses has thus given humanity two gifts: 1. People can use technology like books and the internet to store more information than we can naturally, like external hard-drives for our brains; and 2. we can preserve and transmit ideas that aren’t immediately useful to ourselves to people who can use them.
Homo sapiens is about 200-300,000 years old, depending on exactly where you draw the line between us and our immediate ancestors. Printing (and eventually mass literacy) only got going about 550 years ago, with the development of the Gutenberg press. TV, radio, movies, and the internet only became widespread within the past century, and internet in the past 25 years.
In other words, for 99.99% of human history, “mass media” didn’t exist.
How did illiterate peasants learn about the world, if not from books, TV, or Youtube videos? Naturally, from each other: parents passed knowledge to children; tribal elders taught their wisdom to other members of their tribes; teenagers were apprenticed to masters who already knew a trade, etc.
A hundred years ago, if you wanted to know how to build a wagon, raise a barn, or plant corn, you generally had to find someone who knew how to do so and ask them. Today, you ask the internet.
Getting all of your information from people you know is limiting, but it has two advantages: you can easily judge whether the source of your information is reliable, (you’re not going to take farming advice from your Uncle Bob whose crops always fail,) and most of the people giving you information have your best interests at heart.
The internet’s strength is that it lets us talk to people from outside our own communities; its weakness is that this makes it much easier for people (say, Nigerian princes with extra bank accounts,) to get away with lying. They also have no particular interest one way or another in your survival–unlike your parents.
In a mitochondrial memetic environment (that is, an environment where you get most of your information from relatives,) memes that could kill you tend to get selected against: parents who encourage their children to eat poison tend not to have grandchildren. From an evolutionary perspective, deadly memes are selected against in a mitochondrial environment; memes will evolve to support your survival.
By contrast, in a viral meme environment, (that is, an environment where ideas can easily pass from person to person without anyone having to give birth,) your personal survival is not all that important to the idea’s success.
So one of the risks of viral memes is getting scammed: memetically, infected by an idea that sounds good but actually benefits someone else at your expense.
In the mitochondrial environment, we expect people to be basically cautious; in the viral, less cautious.
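The contrast in selection pressure can be made concrete with a minimal model (the fitness cost and transmission rate below are arbitrary, chosen only to illustrate the difference between the two environments):

```python
def vertical(p, cost, gens):
    """A meme passed only parent -> child: carriers have relative
    fitness (1 - cost), so the meme's frequency p decays over time."""
    for _ in range(gens):
        p = p * (1 - cost) / (p * (1 - cost) + (1 - p))
    return p

def horizontal(p, cost, beta, gens):
    """Same fitness cost, but each generation carriers also convert
    a fraction beta of the non-carriers they contact."""
    for _ in range(gens):
        p = p * (1 - cost) / (p * (1 - cost) + (1 - p))  # selection
        p = p + beta * p * (1 - p)                        # contagion
    return p

# A meme that cuts its carriers' fitness by 10%:
print(round(vertical(0.5, 0.1, 50), 3))          # nearly extinct
print(round(horizontal(0.5, 0.1, 0.2, 50), 3))   # still everywhere
```

Under purely vertical transmission the costly meme is selected almost out of the population within fifty generations; give the same meme even a modest horizontal transmission rate and it persists at high frequency despite harming its carriers.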
Suppose we have two different groups (Group A and Group B) interacting. 25% of Group B is violent criminals, versus 5% of Group A. Folks in group A would quite logically want to avoid Group B. But 75% of Group B is not violent criminals, and would logically not want to be lumped in with criminals. (For that matter, neither do the 25% who are.)
In an ideal world, we could easily sort out violent criminals from the rest of the population, allowing the innocent people to freely associate. In the real world, we have to make judgment calls. Lean a bit toward the side of caution, and you exclude more criminals, but also more innocents; lean the opposite direction and innocent people have an easier time finding jobs and houses, but more people get killed by criminals.
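The tradeoff can be put in numbers. A minimal sketch using the made-up figures from above (Group B: 25% violent criminals) shows what each judgment call costs:

```python
def exclusion_outcome(group_size, criminal_rate, exclude_fraction):
    """Exclude a random fraction of a group: count how many criminals
    you keep out and how many innocents you wrongly shut out."""
    excluded = group_size * exclude_fraction
    return {
        "criminals_excluded": excluded * criminal_rate,
        "innocents_excluded": excluded * (1 - criminal_rate),
    }

# Group B: 1,000 people, of whom 25% are violent criminals.
cautious = exclusion_outcome(1000, 0.25, 1.0)   # exclude everyone
trusting = exclusion_outcome(1000, 0.25, 0.0)   # exclude no one

print(cautious)  # 250 criminals kept out -- at the cost of 750 innocents
print(trusting)  # no innocents harmed -- but all 250 criminals admitted
```

Because exclusion here is random within the group, any intermediate policy trades innocents excluded against criminals admitted in a fixed 3:1 ratio; only better information, i.e., sorting individuals rather than groups, improves both numbers at once.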
Let’s put it less abstractly: suppose you are walking down a dimly-lit street at night and see a suspicious looking person coming toward you. It costs you almost nothing to cross the street to avoid them, while not crossing the street could cost you your life. The person you avoided, if they are innocent, incurs only the expense of potentially having their feelings hurt; if they are a criminal, they have lost a victim.
Companies also want to avoid criminals, which makes it hard for ex-cons to get jobs (an issue if we want folks who are no longer in prison to have an opportunity to earn an honest living instead of going on welfare.) Unfortunately, efforts to improve employment chances for ex-cons by preventing employers from asking directly about criminal history (“ban the box” policies) have resulted in employers using rougher heuristics to exclude felons, like simply not hiring young African American males. Since most companies have far more qualified job applicants than available jobs, the cost to them of excluding young African American males is fairly low–while the cost to African Americans is fairly high.
One of the interesting things about the past 200 years is the West’s historically unprecedented shift from racial apartheid/segregation and actual race-based slavery to full legal (if not always de facto) racial integration.
One of the causes of this shift was doubtless the transition from traditional production modes like farming and horticulture to the modern, industrial economy. Subsistence farming didn’t require a whole lot of employees. Medieval peasants didn’t change occupations very often: most folks ended up working in the same professions as their parents, grandparents, and great-grandparents (usually farming,) probably even on the same estate.
It was only with industrialization that people and their professions began uncoupling; a person could now hold multiple different jobs, in different fields, over the span of years.
Of course, there were beginnings of this before the 1800s–just as people read books before the 1800s–but accelerating technological development accelerated the trends.
But while capitalists want to hire the best possible workers for the lowest possible wages, this doesn’t get us all the way to the complete change we’ve witnessed in racial mores. After all, companies don’t want to hire criminals, either, and any population that produces a lot of criminals tends not to produce a whole lot of really competent workers.
However, the rise of mass communication has allowed us to listen to and empathize with far more people than ever before. When Martin Luther King marched on Washington and asked to be judged by the content of his character rather than the color of his skin, his request only reached national audiences because of modern media, because we now live in a society of meme viruses. And it worked: integration happened.
Also, crime went up dramatically: [insert crime rate chart]
While we’re at it: [insert second chart]
Integration triggered a massive increase in crime, which only stopped because… well, we’re not sure, but a corresponding massive increase in the incarceration rate (and sentences) has probably stopped a lot of criminals from committing additional crimes.
Most of these homicides were black on black, but plenty of the victims were white, even as they sold their devalued homes and fled the violence. (Housing integration appears to have struck America’s “ethnic” neighborhoods of Italians, Irish, and Jews particularly hard, destroying coherent communities and, I assume, voting blocs.)
From the white perspective, integration was tremendously costly: people died. Segregation might not have been fair, and it might have killed black people, but it certainly prevented the murder of whites. Still, segregation, as discussed, has some costs for whites, too: you are more limited in all of your transactions, both economic and personal. You can’t sell your house to just anyone you want. Can’t hire anyone you want. Can’t fall in love with anyone you want.
But obviously segregation is far more harmful to African Americans.
Despite all of the trouble integration has caused for whites, the majority claim to believe in it–even though their feet tell a different story. This at least superficial change in attitudes, I believe, was triggered by the nature of the viral memetic environment.
Within the mitochondrial meme environment, you listen to people who care about your survival and they pass on ideas intended to help you survive. They don’t typically pass on ideas that sacrifice your survival for the sake of others, at least not for long. Your parents will tell you that if you see someone suspicious, you should cross the street and get away.
In the viral environment, you interact far more with people who have their own interests in mind, not yours, and these folks would be perfectly happy for you to sacrifice your survival for their sake. The good folks at Penn State would like you to know that locking your car door when a black person passes by is a “microaggression:”
Former President Obama once said in his speech that he was followed when he was shopping in a store, heard the doors of cars locked as he was walking by, and a woman showed extremely nervousness as he got on an elevator with him (Obama, 2013). Those are examples of nonverbal microaggressions. It is disturbing to learn that those behaviors are often automatic that express “put-downs” of individuals in marginalized groups (Pierce et al., 1977). What if Obama were White, would he receive those unfair treatments?
(If Obama were white, like Hillary Clinton, he probably wouldn’t have been elected president.)
For some reason, black people shoplifting, carjacking, or purse-snatching are never described as “microaggressions;” a black person whose feelings are hurt has been microaggressed, but a white person afraid of being robbed or murdered has not been.
This post was actually inspired by an intra-leftist debate:
Shortly after the highly successful African-star-studded movie Black Panther debuted, certain folks, like Faisal Kutty, started complaining that the film is “Islamophobic” because of a scene where girls are rescued from a Boko Haram-like organization.
Never mind that Boko Haram is a real organization, that it actually kidnaps girls, that it has killed more people than ISIS, and that the people it murders are Africans. Even other Black African Muslims think Boko Haram is shit. (Though obviously BH has its supporters.)
Here we have two different groups of people with different interests: one, Muslims with no particular ties to Africa who don’t want people to associate them with Boko Haram, and two, Black Muslims who don’t want to get killed by folks like Boko Haram.
It is exceedingly disingenuous for folks like Faisal Kutty to criticize as immoral an accurate portrayal of a group that is actually slaughtering thousands of people just because he might accidentally be harmed by association. More attention on Boko Haram could save lives; less attention could result in more deaths–the dead just wouldn’t be Kutty, who is safe in Canada.
Without mass media, I don’t think this kind of appeal works: survival memes dominate and people take danger very seriously. “Some stranger in Canada might be inconvenienced over this” loses to “these people slaughter children.” With mass media, the viral environment lets appeals to set aside your own self-interest and ignore danger in favor of “fairness” and “equality” flourish among everyone in the conversation.
So far this post has focused primarily on the interests of innocent people, but criminals have interests, too–and criminals would like you to make it easier for them to commit crime.
Simon Mol (6 November 1973 in Buea, Cameroon – 10 October 2008) was the pen name of Simon Moleke Njie, a Cameroon-born journalist, writer and anti-racist political activist. In 1999 he sought political asylum in Poland; it was granted in 2000, and he moved to Warsaw, where he became a well-known anti-racist campaigner. …
In 2005 he organized a conference with Black ambassadors in Poland to protest the claims in an article in Wiedza i Życie by Adam Leszczyński about AIDS problems in Africa, which quoted research stating that a majority of African women were unable to persuade their HIV positive husbands to wear condoms, and so later caught HIV themselves. Mol accused Leszczyński of prejudice because of this publication. …
Honorary member of the British International Pen Club Centre.
In 2006 Mol received the prestigious award “Oxfam Novib/PEN Award for Freedom of Expression”.
In February 2006, further to his partner’s request for him to take an HIV test, Mol declined and published a post on his blog explaining why not:
Character assassination isn’t a new phenomenon. However, it appears here the game respects no rules. It wouldn’t be superfluous to state that there is an ingrained, harsh and disturbing dislike for Africans here. The accusation of being HIV positive is the latest weapon that as an African your enemy can raise against you. This ideologically inspired weapon, is strengthened by the day with disturbing literature about Africa from supposed-experts on Africa, some of whom openly boast of traveling across Africa in two weeks and return home to write volumes. What some of these hastily compiled volumes have succeeded in breeding, is a social and psychological conviction that every African walking the street here is supposedly HIV positive, and woe betide anyone who dares to unravel the myth being put in place.
On the 3rd of January 2007 Mol was taken into custody by the Polish police and charged with infecting his sexual partners with HIV. …
According to the Rzeczpospolita newspaper, he was diagnosed with HIV back in 1999 while living in a refugee shelter, but Polish law does not force an HIV carrier to reveal his or her disease status.
According to the police inspector who was investigating his case, a witness stated that Mol refused to wear condoms during sex. An anonymous witness in one case said that when a girl demanded he wear one, he accused her of racism, claiming that she assumed he must be infected with HIV because he was Black. After sexual intercourse he used to tell his female partners that his sperm was sacred.
In an unusual move, his photo with an epidemiological warning, was ordered to be publicly displayed by the then Minister of Justice Zbigniew Ziobro. MediaWatch, a body that monitors alleged racism, quickly denounced this decision, asserting that it was a breach of ethics with racist implications, as the picture had been published before any court verdict. They saw it as evidence of institutional racism in Poland, also calling for international condemnation. …
After police published Mol’s photo and an alert before the start of court proceedings, Warsaw HIV testing centers were “invaded by young women”. A few said that they knew Mol. Some of the HIV tests have been positive. According to the police inspector who had been monitoring the tests and the case: “Some women very quickly started to suffer drug-resistant tonsillitis and fungal infections. They looked wasted, some lost as many as 15 kilograms and were deeply traumatized, impeding us taking the witness statements. 18 additional likely victims have been identified thereby”. Genetic tests of the virus from the infectees and Simon proved that it was specific to Cameroon.
In other words, Simon Mol was a sociopath who used the accusation of “racism” to murder dozens of women.
Criminals–of any race–are not nice people. They will absolutely use anything at their disposal to make it easier to commit crime. In the past, they posed as police officers, asked for help finding their lost dog, or just rang your doorbell. Today they can get intersectional feminists and international human rights organizations to argue on their behalf that locking your door or insisting on condoms is the real crime.
As ANI (Asian News International) reports on Twitter (h/t Rohit):
For those of you reading this in the future, after the 15 minutes of manufactured furor have subsided, #MarchForOurLives is an anti-gun/pro-gun control movement in the US. Gun laws in India are notably much stricter than gun laws in the US, and yet–
The thing that looks like a mushroom is the internal part of a uterus; you can see the rest of the drawing faintly around it. As noted, this is completely backwards from the reality in India, where it is nearly impossible to buy a gun but abortions are extremely common and completely legal. So where did the marchers in Mumbai get this sign?
Well, it’s a meme, found on Twitter, Instagram, t-shirts, and of course signs at pussyhat rallies in the US. It’s not even true in the US, but at least it kind of makes sense given our frequent debates over both guns and abortions. Certainly there are some people in the US who think abortions should be completely illegal. India, by contrast, is a nation where slowing the growth rate to prevent famine is a high priority and abortions are quite legal.
I am reminded of that time Michelle Obama tweeted #BringBackOurGirls in support of Nigerians kidnapped by Boko Haram:
This is the signature of a mind-virus: it makes you repeat things that make no sense in context. It makes you spread the virus even though it does not make logical sense for you, personally, to spread it. Michelle Obama is married to a man who controlled, at the time, the world’s largest military, including an enormous stockpile of nuclear weapons, and yet she was tweeting ineffective hashtags to be part of the #movement.
Likewise, the state of gun (and abortion) laws in India is nothing like their state in the US, yet Indians are getting sucked into spreading our viral memes.
Horizontal meme transfer–like social media–promotes the spread of memetic viruses.
The material-grievances theory and the cultural-resentments theory can fit together because, in both cases, they tell us that people voted for Trump out of a perceived self-interest, which was to improve their faltering economic and material conditions, or else to affirm their cultural standing vis-à-vis the non-whites and the bicoastal elites. Their votes were, from this standpoint, rationally cast. … which ultimately would suggest that 2016’s election was at least a semi-normal event, even if Trump has his oddities. But here is my reservation.
I do not think the election was normal. I think it was the strangest election in American history in at least one major particular, which has to do with the qualifications and demeanor of the winning candidate. American presidents over the centuries have always cultivated, after all, a style, which has been pretty much the style of George Washington, sartorially updated. … Now, it is possible that, over the centuries, appearances and reality have, on occasion, parted ways, and one or another president, in the privacy of his personal quarters, or in whispered instructions to his henchmen, has been, in fact, a lout, a demagogue, a thug, and a stinking cesspool of corruption. And yet, until just now, nobody running for the presidency, none of the serious candidates, would have wanted to look like that, and this was for a simple reason. The American project requires a rigorously republican culture, without which a democratic society cannot exist—a culture of honesty, logic, science, and open-minded debate, which requires, in turn, tolerance and mutual respect. Democracy demands decorum. And since the president is supposed to be democracy’s leader, the candidates for the office have always done their best to, at least, put on a good act.
The author (Paul Berman) then proposes Theory III: Broad Cultural Collapse:
A Theory 3 ought to emphasize still another non-economic and non-industrial factor, apart from marriage, family structure, theology, bad doctors, evil pharmaceutical companies, and racist ideology. This is a broad cultural collapse. It is a collapse, at minimum, of civic knowledge—a collapse in the ability to identify political reality, a collapse in the ability to recall the nature of democracy and the American ideal. An intellectual collapse, ultimately. And the sign of this collapse is an inability to recognize that Donald Trump has the look of a foreign object within the American presidential tradition.
Berman is insightful until he blames cultural collapse on the educational system (those dastardly teachers just decided not to teach about George Washington, I guess.)
We can’t blame education. Very few people had many years of formal education of any sort back in 1776 or 1810–even in 1900, far fewer people completed high school than do today. The idea that high school civics class was more effectively teaching future voters what to look for in a president in 1815 than today therefore seems unlikely.
If anything, in my (admittedly limited, parental) interactions with the local schools, education seems to lag national sentiment. For example, the local schools still cover Columbus Day in a pro-Columbus manner (and I don’t even live in a particularly conservative area) and have special Veterans’ Day events. School curricula are, I think, fairly influenced by the desires of the Texas schools, because Texas is a big state that buys a lot of textbooks.
I know plenty of Boomers who voted for Trump, so if we’re looking at a change in school curricula, we’re looking at a shift that happened half a century ago (or more,) but only recently manifested.
That said, I definitely feel something coursing through society that I could call “Cultural Collapse.” I just don’t think the schools are to blame.
Yesterday I happened across a children’s book, from the 1920s, about famous musicians. Interwoven with the biographies of Beethoven and Mozart were political comments about kings and queens, European social structure, and how these musicians of course saw through all of this royalty business and wanted to make music for the common people. It was an articulated ideology of democracy.
Sure, people today still think democracy is important, but the framing (and phrasing) is different. The book we recently read of mathematicians’ biographies didn’t stop to tell us how highly the mathematicians thought of the idea of common people voting (rather, when it bothered with ideology, it focused on increasing representation of women in mathematics and emphasizing the historical obstacles they faced.)
According to the Mounk-Foa early-warning system, signs of democratic deconsolidation in the United States and many other liberal democracies are now similar to those in Venezuela before its crisis.
Across numerous countries, including Australia, Britain, the Netherlands, New Zealand, Sweden and the United States, the percentage of people who say it is “essential” to live in a democracy has plummeted, and it is especially low among younger generations. …
Support for autocratic alternatives is rising, too. Drawing on data from the European and World Values Surveys, the researchers found that the share of Americans who say that army rule would be a “good” or “very good” thing had risen to 1 in 6 in 2014, compared with 1 in 16 in 1995.
That trend is particularly strong among young people. For instance, in a previously published paper, the researchers calculated that 43 percent of older Americans believed it was illegitimate for the military to take over if the government were incompetent or failing to do its job, but only 19 percent of millennials agreed. The same generational divide showed up in Europe, where 53 percent of older people thought a military takeover would be illegitimate, while only 36 percent of millennials agreed.
Note, though, that this is not a local phenomenon–any explanation that explains why support for democracy is down in the US needs to also explain why it’s down in Sweden, Australia, Britain, and the Netherlands (and maybe why it wasn’t so popular there in the first place.)
Here are a few different theories besides failing schools:
Less common culture, due to integration and immigration
More international culture, due to the internet, TV, and similar technologies
Put yourself in your grandfather or great-grandfather’s shoes, growing up in the 1910s or 20s. Cars were not yet common; chances were if he wanted to go somewhere, he walked or rode a horse. Telephones and radios were still rare. TV barely existed.
If you wanted to talk to someone, you walked over to them and talked. If you wanted to talk to someone from another town, either you or they had to travel, often by horse or wagon. For long-distance news, you had newspapers and a few telegraph wires.
News traveled slowly. People traveled slowly (most people didn’t ride trains regularly.) Most of the people you talked to were folks who lived nearby, in your own community. Everyone not from your community was some kind of outsider.
During World War II, for example, three German submariners escaped from Camp Crossville, Tennessee. Their flight took them to an Appalachian cabin, where they stopped for a drink of water. The mountain granny told them to “git.” When they ignored her, she promptly shot them dead. The sheriff came and scolded her for shooting helpless prisoners. Granny burst into tears, and said that she would not have done it if she had known they were Germans. The exasperated sheriff asked her what in “tarnation” she thought she was shooting at. “Why,” she replied, “I thought they was Yankees!”
And then your grandfather got shipped out to get shot at somewhere in Europe or the Pacific.
Today, technology has completely transformed our lives. When we want to talk to someone or hear their opinion, we can just pick up the phone, visit facebook, or flip on the TV. We have daily commutes that would have taken our ancestors a week to walk. People expect to travel thousands of miles for college and jobs.
The effect is a curious inversion: In a world where you can talk to anyone, why talk to your neighbors? Personally, I spend more time talking to people in Britain than the folks next door, (and I like my neighbors.)
Now, this blog was practically founded on the idea that this technological shift in the way ideas (memes) are transmitted has a profound effect on the kinds of ideas that are transmitted. When ideas must be propagated between relatives and neighbors, these ideas are likely to promote your own material well-being (as you must survive well enough to continue propagating the idea for it to go on existing,) whereas when ideas can be easily transmitted between strangers who don’t even live near each other, the ideas need not promote personal survival–they just need to sound good. (I went into more detail on this idea back in Viruses Want you to Spread Them, Mitochondrial Memes, and The Progressive Virus.)
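That selection argument can be sketched as a toy simulation (my own illustrative model with made-up numbers, not anything from the original posts): a meme that costs its host some survival or fertility dies out when it can only pass from parent to child, but can still spread when it also jumps “horizontally” between strangers.

```python
# Toy model of meme transmission (illustrative assumptions). A meme
# imposes a survival/fertility cost c on its carriers; p is the fraction
# of the population carrying the meme.

def vertical(p, c, generations):
    """Mitochondrial case: the meme only passes parent-to-child,
    so carriers' lower fitness selects it out."""
    for _ in range(generations):
        p = p * (1 - c) / (p * (1 - c) + (1 - p))
    return p

def horizontal(p, c, beta, generations):
    """Viral case: same selection against carriers, but the meme also
    jumps between unrelated hosts at rate beta each generation."""
    for _ in range(generations):
        p = p * (1 - c) / (p * (1 - c) + (1 - p))  # selection against carriers
        p = min(1.0, p + beta * p * (1 - p))        # plus contagion
    return p

start, cost = 0.10, 0.05
print(vertical(start, cost, generations=100))              # dwindles toward zero
print(horizontal(start, cost, beta=0.2, generations=100))  # spreads anyway
```

The point is qualitative: the same costly meme has opposite fates under the two transmission modes, which is the mitochondrial-vs.-viral distinction in a nutshell.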
How do these technological shifts affect how we form communities?
In a groundbreaking book based on vast data, Putnam shows how we have become increasingly disconnected from family, friends, neighbors, and our democratic structures– and how we may reconnect.
Putnam warns that our stock of social capital – the very fabric of our connections with each other – has plummeted, impoverishing our lives and communities.
Putnam draws on evidence including nearly 500,000 interviews over the last quarter century to show that we sign fewer petitions, belong to fewer organizations that meet, know our neighbors less, meet with friends less frequently, and even socialize with our families less often. We’re even bowling alone. More Americans are bowling than ever before, but they are not bowling in leagues. Putnam shows how changes in work, family structure, age, suburban life, television, computers, women’s roles and other factors have contributed to this decline.
The National Science Foundation (NSF) reported in its General Social Survey (GSS) that unprecedented numbers of Americans are lonely. Published in the American Sociological Review (ASR) and authored by Miller McPherson, Lynn Smith-Lovin, and Matthew Brashears, sociologists at Duke and the University of Arizona, the study featured 1,500 face-to-face interviews where more than a quarter of the respondents — one in four — said that they have no one with whom they can talk about their personal troubles or triumphs. If family members are not counted, the number doubles to more than half of Americans who have no one outside their immediate family with whom they can share confidences. Sadly, the researchers noted increases in “social isolation” and “a very significant decrease in social connection to close friends and family.”
Rarely has news from an academic paper struck such a responsive nerve with the general public. These dramatic statistics from ASR parallel similar trends reported by the Beverly LaHaye Institute — that over the 40 years from 1960 to 2000 the Census Bureau had expanded its analysis of what had been a minor category. The Census Bureau categorizes the term “unrelated individuals” to designate someone who does not live in a “family group.” Sadly, we’ve seen the percentage of persons living as “unrelated individuals” almost triple, increasing from 6 to 16 percent of all people during the last 40 years. A huge majority of those classified as “unrelated individuals” (about 70 percent) lived alone.
Long-run data from the US, where the General Social Survey (GSS) has been gathering information about trust attitudes since 1972, suggests that people trust each other less today than 40 years ago. This decline in interpersonal trust in the US has been coupled with a long-run reduction in public trust in government – according to estimates compiled by the Pew Research Center since 1958, today trust in the government in the US is at historically low levels.
Interpersonal trust attitudes correlate strongly with religious affiliation and upbringing. Some studies have shown that this strong positive relationship remains after controlling for several survey-respondent characteristics.[1] This, in turn, has led researchers to use religion as a proxy for trust, in order to estimate the extent to which economic outcomes depend on trust attitudes. Estimates from these and other studies using an instrumental-variable approach suggest that trust has a causal impact on economic outcomes.[2] This suggests that the remarkable cross-country heterogeneity in trust that we observe today can explain a significant part of the historical differences in cross-country income levels.
Measures of trust from attitudinal survey questions remain the most common source of data on trust. Yet academic studies have shown that these measures of trust are generally weak predictors of actual trusting behaviour. Interestingly, however, questions about trusting attitudes do seem to predict trustworthiness. In other words, people who say they trust other people tend to be trustworthy themselves.[3]
Our technological shifts haven’t just affected ideas and conversations–with people able to travel thousands of miles in an afternoon, they’ve also affected the composition of communities. The US in 1920 was almost 90% white and 10% black, (with that black population concentrated in the segregated South). All other races together totaled only a couple percent. Today, the US is <65% white, 13% black, 16% Hispanic, 6% Asian and Native American, and 9% “other” or multi-racial.
Similar changes have happened in Europe, both with the creation of the Free Movement Zone and the discovery that the Mediterranean isn’t that hard to cross, though the composition of the newcomers obviously differs.
Diversity may have its benefits, but one of the things it isn’t is a common culture.
With all of these changes, do I really feel that there is anything particularly special about my local community and its norms over those of my British friends?
What about Disney?
Well, Disney’s most profitable product hasn’t exactly been pro-democracy, though I doubt a few princess movies can actually budge people’s political compasses or push their votes toward Trump (or Hillary.) But what about the general content of children’s stories? It sure seems like there are a lot fewer stories focused on characters from American history than in the days when Davy Crockett was the biggest thing on TV.
Of course this loops back into technological changes, as American TV and movies are enjoyed by an increasingly non-American audience and media content is driven by advertisers’ desire to reach specific audiences (eg, the “rural purge” in TV programming, when popular TV shows aimed at more rural or older audiences were cancelled in favor of programs featuring urban characters, which advertisers believed would appeal to younger viewers with more cash to spend.)
If cultural collapse is happening, it’s not because we lack for civics classes, but because civics classes alone cannot create a civic culture where there is none.
Sayyid Qutb lived from 1906 to 1966. He was an Egyptian writer, thinker, and leader of the Muslim Brotherhood. He was executed in 1966 for plotting to assassinate the Egyptian president, Nasser.
The Muslim Brotherhood was founded back in 1928 by Islamic scholar Hassan al-Banna. Its goal is to instill the Quran and the Sunnah as the “sole reference point for … ordering the life of the Muslim family, individual, community … and state”; mottos include “Believers are but Brothers”, “Islam is the Solution”, and “Allah is our objective; the Qur’an is the Constitution; the Prophet is our leader; jihad is our way; death for the sake of Allah is our wish”.
The MB’s philosophy is pan-Islamist and it wields power in several countries:
323/354 seats in the Sudanese National Assembly,
74/132 seats in the Palestinian Legislature,
69/217 seats in the Tunisian assembly,
39/249 seats in the Afghan House,
46/301 seats in Yemen,
16/146 seats in Mauritania,
40/560 seats in Indonesia,
2/40 seats in Bahrain,
and 4/325 and 1/128 in Iraq and Lebanon, respectively.
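For a rough sense of scale, the seat counts listed above work out to very different shares of each legislature (a quick back-of-the-envelope calculation on the numbers as given; the figures themselves are not independently verified here):

```python
# Seat counts as listed above: (seats held, total seats) per legislature.
seats = {
    "Sudan": (323, 354),
    "Palestine": (74, 132),
    "Tunisia": (69, 217),
    "Afghanistan": (39, 249),
    "Yemen": (46, 301),
    "Mauritania": (16, 146),
    "Indonesia": (40, 560),
    "Bahrain": (2, 40),
    "Iraq": (4, 325),
    "Lebanon": (1, 128),
}

# Print each holding as a percentage of the chamber.
for country, (held, total) in seats.items():
    print(f"{country}: {held / total:.0%}")
```

The spread runs from an outright supermajority in Sudan down to token representation in Iraq and Lebanon.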
In 2012, the MB-sponsored party won Egypt’s elections (following the January Revolution in 2011,) but the Brotherhood has had some trouble in Egypt since then.
The MB also does charity work, runs hospitals, etc., and is clearly using democratic means to assemble power.
According to Wikipedia:
As Islamic Modernist beliefs were co-opted by secularist rulers and official `ulama, the Brotherhood has become traditionalist and conservative, “being the only available outlet for those whose religious and cultural sensibilities had been outraged by the impact of Westernisation”. Al-Banna believed the Quran and Sunnah constitute a perfect way of life and social and political organization that God has set out for man. Islamic governments must be based on this system and eventually unified in a Caliphate. The Muslim Brotherhood’s goal, as stated by its founder al-Banna was to drive out British colonial and other Western influences, reclaim Islam’s manifest destiny—an empire, stretching from Spain to Indonesia. The Brotherhood preaches that Islam will bring social justice, the eradication of poverty, corruption and sinful behavior, and political freedom (to the extent allowed by the laws of Islam).
Back to Qutb:
In the early 1940s, he encountered the work of Nobel Prize-winning French eugenicist Alexis Carrel, who would have a seminal and lasting influence on his criticism of Western civilization, as “instead of liberating man, as the post-Enlightenment narrative claimed, he believed that Western modernity enmeshed people in spiritually numbing networks of control and discipline, and that rather than build caring communities, it cultivated attitudes of selfish individualism. Qutb regarded Carrel as a rare sort of Western thinker, one who understood that his civilization “depreciated humanity” by honouring the “machine” over the “spirit and soul” (al-nafs wa al-ruh). He saw Carrel’s critique, coming as it did from within the enemy camp, as providing his discourse with an added measure of legitimacy.”
“As a brown person in Greeley, Colorado in the late 1940’s studying English he came across much prejudice. He was appalled by what he perceived as loose sexual openness of American men and women (a far cry from his home of Musha, Asyut). This American experience was for him a fine-tuning of his Islamic identity.”…
Qutb concluded that major aspects of American life were primitive and “shocking”, a people who were “numb to faith in religion, faith in art, and faith in spiritual values altogether”. His experience in the U.S. is believed to have formed in part the impetus for his rejection of Western values and his move towards Islamism upon returning to Egypt.
The man has a point. American art has a lot of Jackson Pollock and Andy Warhol schtick.
In 1952, the Egyptian monarchy–which was pro-Western–was overthrown by the nationalist Free Officers, among them Nasser. At first Nasser and Qutb worked together, but there was something of a power struggle: Qutb didn’t approve of Nasser organizing the new Egypt along essentially secular rather than Islamic lines, at which point Qutb tried to have Nasser assassinated, and Nasser had Qutb arrested, tortured, and eventually hanged.
Aside from the fact that Qutb was Egyptian and Muslim, he and the alt-right have a fair amount in common. (Read his Wikipedia page if you don’t see what I mean.) Both share the basic critique that the West is immoral and degenerate, with bad art and bad manners; that capitalism has created a “spiritually numbing” network of control (your boss, office dress codes, the HOA, paperwork); and that a return to spirituality (not rejecting science, but enhancing it) can fix these things.
My impression–Muslim monarchs tend to be secular modernists. They see the tech other countries have (especially bombs) and want it. They see the GDPs other countries have, and want that, too. They’re not that interested in religion (which would limit their behavior) and not that interested in nationalism (as they tend to rule over a variety of different “nations.”) Many monarchs are (or were) quite friendly to the West. The King of Jordan and Shah of Iran come immediately to mind.
(I once met the Director of the CIA. He had a photograph of the King of Jordan in his office. Motioning to the photo, he told me the King was one of America’s friends.)
But modernization isn’t easy. People who have hundreds or thousands of years’ experience living a particular lifestyle are suddenly told to go live a different lifestyle, and aren’t sure how to react. The traditional lifestyle gave people meaning, but the modern lifestyle gives people TV and low infant mortality.
That’s the situation we’re all facing, really.
So what’s a society to do? Sometimes they keep their kings. Sometimes they overthrow them. Then what? You can go nationalist–like Nasser. Communist–like South Yemen. (Though I’m not sure Yemen had a king.) Or Islamic, like Iran. (As far as I can tell, the Iranian revolution had a significant communist element, but the Islamic faction won out.) The Iranian revolution is in no danger of spreading, though, because Iranians practice Shia Islam, a variety that’s a minority almost everywhere else in the world.
I hear the Saudis and certain other monarchs have stayed in power so far by using their oil money to keep everyone comfortable (staving off the stresses of modernization) and enforcing Islamic law (keeping the social system familiar.) We’ll see how long this lasts.
So one of the oddities of the Middle East is that while other parts of the world have become more liberal, it appears to have become less so. You can find many before-and-after pictures of places like Iran, where women used to mingle with men, unveiled, in Western-style dress. (In fact, the veil was banned in Iran in 1936.) War-torn Afghanistan is an even sadder case.
“After the end of the Second World War, Zahir Shah recognised the need for the modernisation of Afghanistan and recruited a number of foreign advisers to assist with the process. During this period Afghanistan’s first modern university was founded.… despite the factionalism and political infighting a new constitution was introduced during 1964 which made Afghanistan a modern democratic state by introducing free elections, a parliament, civil rights, women’s rights and universal suffrage.”
While he was in Italy (undergoing eye surgery and treatment for lumbago,) his cousin executed a coup and instituted a republican government. As we all know, Afghanistan has gone nowhere but up since then.
Zahir Shah returned to Afghanistan in 2002, after the US drove out the Taliban, where he received the title “Father of the Nation” but did not resume duties as monarch. He died in 2007.
His eldest daughter (Princess of Afghanistan?) is Bilqis Begum–Bilqis is the Queen of Sheba’s Islamic name–but she doesn’t have a Wikipedia page. The heir apparent is Ahmad Shah Khan, if you’re looking for someone to crown.
Back to the Muslim Brotherhood.
One of the big differences between elites and commoners is that commoners tend to be far more conservative than elites. Elites think a world in which they can jet off to Italy for medical treatment sounds awesome, while commoners think this is going to put the local village medic out of a job. Or as the world learned last November, America’s upper and lower classes have very different ideas about borders, globalization, and who should be president.
Similarly, the Muslim Brotherhood seems perfectly happy to use democratic means to come to power where it can.
(The MB apparently does a lot of charity work, which is part of why it is popular.)
The relationship between the MB and Saudi Arabia is interesting. After Egypt cracked down on the MB, thousands of members went to Saudi Arabia. SA needed teachers, and many of the MB were teachers, so it seemed mutually beneficial. The MB thus took over the Saudi educational system, and probably large chunks of their bureaucracy.
Relations soured between SA and the MB due to SA’s decision to let the US base troops there for its war against Iraq, and due to the MB’s involvement in the Arab Spring and active role in Egypt’s democracy–Saudi monarchs aren’t too keen on democracy. In 2014, SA declared the MB a “terrorist organization.”
Lots of people say the MB is a terrorist org, but I’m not sure how that distinguishes them from a whole bunch of other groups in the Middle East. I can’t tell what links the MB has (if any) to ISIS. (While both groups have similar-sounding goals, it’s entirely possible for two different groups to both want to establish an Islamic Caliphate.)
The MB reminds me of the Protestant Reformation, with its emphasis on returning to the Bible as the sole source of religious wisdom, the establishment of Puritan theocracies, and a couple hundred years of Catholic/Protestant warfare. I blame the Protestant Reformation on the spread of the printing press in Europe, without which the whole idea of reading the Bible for yourself would have been nonsense. I wager something similar happened recently in the Middle East, with cheap copies of the Quran and other religious (and political) texts becoming widely available.