By any objective analysis, life in modern America is pretty darn good. You probably didn’t die in childbirth and neither did half of your children. You haven’t died of smallpox or polio. You probably haven’t lived through a famine or war. Cookies and meat are cheap, houses are big, most of us do rather little physical labor, and we can carry the collected works of Shakespeare, Tolstoy, and Wikipedians in our pockets. We have novocaine for tooth surgery. If you avoid drugs and don’t eat too much, there’s a very good chance you’ll survive into your eighties.
In the past, people grew up in small towns or rural areas near small towns, knew most of the people in their neighborhoods, went to school, got jobs, and got married. They moved if they needed more land or saw opportunities in the gold fields, but most stayed put.
We know this because we can read about it in history books.
One of the results was strong continuity of people in a particular place, and strong continuity of people allowed the development of those “civic associations” people are always going on about. Kids joined clubs at school, clubs at church, then transitioned into adult clubs when they graduated. At every age, there were clubs, and clubs organized and ran events for the community.
Of course club membership was mediated by physical location–if you live in a town you will be in more clubs than if you live in the country and have to drive an hour to get there–but in general, life revolved around clubs (and church, which we can generously call another kind of club, with its own sub-clubs.)
In such an environment, it is easy to see how someone could meet their sweetheart at 16, become a functioning member of society at 18, get a job, put a down payment on a house, get married by 20 or 22 and start having children.
Today, people go to college.
Forget your high school sweetheart: you’re never going to see her again.
After college, people typically move again, because the job they’ve spent 4 years training for often isn’t in the same city as their college.
So forget all of your college friends: chances are you’ll never see any of them again, either.
Now you’re living in a strange city, full of strangers. You know no one. You are part of no clubs. No civic organizations. You feel no connection to anyone.
“Isn’t diversity great?” someone crows over kebabs, and you think “Hey, at least those Muslims over there have each other to talk to.” Soon you find yourself envying the Hispanics. They have a community. You have a bar.
People make do. They socialize after work. They reconnect with old friends on Facebook and discover that their old friends are smug and annoying because Facebook is a filter that turns people smug and annoying.
But you can’t repair all of the broken connections.
Meanwhile, all of those small, rural towns have lost their young adults. Many of them have no realistic future for young people who stay; everyone who can leave, does. All that’s left behind are children, old people, and the few folks who didn’t quite make it into college.
The cities bloat with people who feel no connection to each other and small towns wither and die.
Never mind the ‘war on drugs’ or laying all blame with pharmas, this epidemic exists because millions live in a world without hope, certainty and structure…
The number one killer of Americans under the age of 50 isn’t cancer, or suicide, or road traffic accidents. It’s drug overdoses. They have quadrupled since 1999. More than 52,000 Americans died from drug overdoses last year. Even in the UK, where illegal drug use is on the decline, overdose deaths are peaking, having grown by 10% from 2015 to 2016 alone. …
Opioids, whatever their source, bond with receptors all over our bodies. Opioid receptors evolved to protect us from panic, anxiety and pain – a considerate move by the oft-callous forces of evolution. …
The overdose epidemic compels us to face one of the darkest corners of modern human experience head on, to stop wasting time blaming the players and start looking directly at the source of the problem. What does it feel like to be a youngish human growing up in the early 21st century? Why are we so stressed out that our internal supply of opioids isn’t enough? …
You get opioids from your own brain stem when you get a hug. Mother’s milk is rich with opioids, which says a lot about the chemical foundation of mother-child attachment. When rats get an extra dose of opioids, they increase their play with each other, even tickle each other. And when rodents are allowed to socialise freely (rather than remain in isolated steel cages) they voluntarily avoid the opiate-laden bottle hanging from the bars of their cage. They’ve already got enough. …
So what does it say about our lifestyle if our natural supply isn’t sufficient and so we risk our lives to get more? It says we are stressed, isolated and untrusting.
(Note: college itself is enjoyable and teaches people valuable skills. This thread is not opposed to “learning things,” just to an economic system that separates people from their loved ones.)
At least, this looks like a problem to me, especially when I’m trying to make conversation at the local moms group.
There are many potential reasons the data looks like this (including inaccuracy, though my lived experience says it is accurate.) Our culture encourages people to limit their fertility, and smart women are especially so encouraged. Smart people are also better at long-term planning and doing things like “reading the instructions on the birth control.”
But it seems likely that there is another factor, an arrow of causation pointing in the other direction: smart people tend to stay in school for longer, and people dislike having children while they are still in school. While you are in school, you are in some sense still a child, and we have a notion that children shouldn’t beget children.
Isaac Newton. Never married. Probably a virgin.
People who drop out of school and start having children at 16 tend not to be very smart and also tend to have plenty of children during their child-creating years. People who pursue post-docs into their thirties tend to be very smart–and many of them are virgins.
Now, I don’t know about you, but I kind of like having smart people around, especially the kinds of people who invent refrigerators and make supply chains work so I can enjoy eating food, even though I live in a city, far from any farms. I don’t want to live in a world where IQ is crashing and we can no longer maintain complex technological systems.
We need to completely re-think this system where the smarter you are, the longer you are expected to stay in school, accruing debt and not having children.
Proposal one: Accelerated college for bright students. Let any student who can do college-level work begin earning college credit for it, even if they are still in high (or middle) school. There are plenty of bright students out there who could be completing their degrees by 18.
The entire framework of schooling probably ought to be sped up in a variety of ways, especially for bright students. The current framework often reflects the order in which various discoveries were made, rather than the age at which students are capable of learning the material. For example, negative numbers are apparently not introduced in the math curriculum until 6th grade, even though, in my experience, even kindergarteners are perfectly capable of understanding the concept of “debt.” If I promise to give you one apple tomorrow, then I have “negative one apple.” There is no need to hide the concept of negatives for 6 years.
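To make the point concrete, here is the whole “lesson” as a few lines of Python–a toy illustration of debt-as-negative-numbers, not anything from an actual curriculum:

```python
# Toy illustration: "debt" arithmetic a kindergartener can follow.
apples = 0        # I start with no apples
apples -= 1       # I promise you one apple tomorrow: "negative one apple"
print(apples)     # -1
apples += 1       # tomorrow I hand the apple over; the debt is settled
print(apples)     # 0
```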
Proposal two: More apprenticeship.
In addition to being costly and time-consuming, a college degree doesn’t even guarantee that your chosen field will still be hiring when you graduate. (I know people with STEM degrees who graduated right as the dot-com bubble burst. Ouch.) We essentially want our educational system to turn out people who are highly skilled at highly specialized trades, yet capable of turning around and becoming highly skilled at another highly specialized trade on a dime if the first doesn’t work out. This leads to chemists returning to university for law degrees and physicists going back for medical degrees. We want students to have both “broad educations,” so they can get hired anywhere, and “deep educations,” so they’ll actually be good at their jobs.
Imagine, instead, a system where high school students are allowed to take a two-year course in preparation for a particular field, at the end of which high performers are accepted into an apprenticeship program where they continue learning on the job. At worst, these students would have a degree, income, and job experience by the age of 20, even if they decided they now wanted to switch professions or pursue an independent education.
Proposal three: Make childbearing normal for adult students.
There’s no reason college students can’t get married and have children (aside from, obviously, their lack of jobs and income.) College is not more time-consuming or physically taxing than a regular job, and college campuses tend to be pretty pleasant places. Studying while pregnant isn’t any more difficult than working while pregnant.
Grad students, in particular, are old and mature enough to get married and start families, and society should encourage them to do so.
Proposal four: Stop denigrating child-rearing, especially for intelligent women.
Children are a lot of work, but they’re also fun. I love being with my kids. They are my family and an endless source of happiness.
What people want and value, they will generally strive to obtain.
Make no mistake: Nichols is annoyingly arrogant. He draws a rather stark line between “experts” (who know things) and everyone else (who should humbly limit themselves to voting between options defined for them by the experts.) He implores people to better educate themselves in order to be better voters, but has little patience for autodidacts and bloggers like myself who are actually trying.
But arrogance alone doesn’t make someone wrong.
Nichols’s first thesis is simple: most people are too stupid or ignorant to second-guess experts or even contribute meaningfully to modern policy discussions. How can people who can’t find Ukraine on a map or think we should bomb the fictional city of Agrabah contribute in any meaningful way to a discussion of international policy?
It was one thing, in 1776, to think the average American could vote meaningfully on the issues of the day–a right they took by force, by shooting anyone who told them they couldn’t. Life was less complicated in 1776, and the average person could master most of the skills they needed to survive (indeed, pioneers on the edge of the frontier had to be mostly self-sufficient in order to survive.) Life was hard–most people engaged in long hours of heavy labor plowing fields, chopping wood, harvesting crops, and hauling necessities–but could be mastered by people who hadn’t graduated from elementary school.
But the modern industrial (or post-industrial) world is much more complicated than the one our ancestors grew up in. Today we have cars (maybe even self-driving cars), electrical grids and sewer systems, atomic bombs and fast food. The speed of communication and transportation has made it possible to chat with people on the other side of the earth and show up on their doorstep a day later. The amount of specialized, technical knowledge necessary to keep modern society running would astonish the average caveman–even with 15+ years of schooling, the average person can no longer build a house, nor even produce basic necessities like clothes or food. Most of us can’t even make a pencil.
Even experts who are actually knowledgeable about their particular area may be completely ignorant of fields outside of their expertise. Nichols speaks Russian, which makes him an expert in certain Russian-related matters, but he probably knows nothing about optimal high-speed rail networks. And herein lies the problem:
The American attachment to intellectual self-reliance described by Tocqueville survived for nearly a century before falling under a series of assaults from both within and without. Technology, universal secondary education, the proliferation of specialized expertise, and the emergence of the United States as a global power in the mid-twentieth century all undermined the idea… that the average American was adequately equipped either for the challenges of daily life or for running the affairs of a large country.
… the historian Richard Hofstadter wrote that “the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and competently perform for himself.”
… Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.”
In other words, society is now too complex and people too stupid for democracy.
Nichols’s second thesis is that people used to trust experts, which let democracy function, but today they are less trusting. He offers no evidence other than his general conviction that this change has happened.
He does, however, detail the ways in which he thinks that 1. people have been given inflated egos about their own intelligence, and 2. our information-delivery system has degenerated into misinformational goo, resulting in the trust problems he believes we are having. These are interesting arguments and worth examining.
A bit of summary:
Indeed, maybe the death of expertise is a sign of progress. Educated professionals, after all, no longer have a stranglehold on knowledge. The secrets of life are no longer hidden in giant marble mausoleums… in the past, there was less stress between experts and laypeople, but only because citizens were simply unable to challenge experts in any substantive way. …
Participation in political, intellectual, and scientific life until the early twentieth century was far more circumscribed, with debates about science, philosophy, and public policy all conducted by a small circle of educated males with pen and ink. Those were not exactly the Good Old Days, and they weren’t that long ago. The time when most people didn’t finish high school, when very few went to college, and only a tiny fraction of the population entered professions is still within living memory of many Americans.
Aside from Nichols’s insistence that he believes in modern American notions about gender and racial equality, I get the impression that he wouldn’t mind the Good Old Days of genteel pen-and-ink discussions between intellectuals. However, I question his claim that participation in political life was far more circumscribed–after all, people voted, and politicians liked getting people to vote for them. People anywhere, even illiterate peasants on the frontier or up in the mountains, like to gather and debate about God, politics, and the meaning of life. The question is less “Did they discuss it?” and more “Did their discussions have any effect on politics?” Certainly we can point to abolition, women’s suffrage, prohibition, and the Revolution itself as heavily grass-roots movements.
But continuing with Nichols’s argument:
Social changes only in the past half century finally broke down old barriers of race, class, and sex not only between Americans in general but also between uneducated citizens and elite experts in particular. A wider circle of debate meant more knowledge but more social friction. Universal education, the greater empowerment of women and minorities, the growth of a middle class, and increased social mobility all threw a minority of experts and the majority of citizens into direct contact, after nearly two centuries in which they rarely had to interact with each other.
And yet the result has not been a greater respect for knowledge, but the growth of an irrational conviction among Americans that everyone is as smart as everyone else.
Nichols is distracting himself with the reflexive racial argument; the important change he is highlighting isn’t social but technical.
I’d like to quote a short exchange from Our Southern Highlanders, an anthropological-style text written about Appalachia about a century ago:
The mountain clergy, as a general rule, are hostile to “book larnin’,” for “there ain’t no Holy Ghost in it.” One of them who had spent three months at a theological school told President Frost, “Yes, the seminary is a good place ter go and git rested up, but ’tain’t worth while fer me ter go thar no more ’s long as I’ve got good wind.”
It used to amuse me to explain how I knew that the earth was a sphere; but one day, when I was busy, a tiresome old preacher put the everlasting question to me: “Do you believe the earth is round?” An impish perversity seized me and I answered, “No—all blamed humbug!” “Amen!” cried my delighted catechist, “I knowed in reason you had more sense.”
But back to Nichols, who really likes the concept of expertise:
One reason claims of expertise grate on people in a democracy is that specialization is necessarily exclusive. When we study a certain area of knowledge or spend our lives in a particular occupation, we not only forego expertise in other jobs or subjects, but also trust that other people in the community know what they’re doing in their area as surely as we do in our own. As much as we might want to go up to the cockpit after the engine flames out to give the pilots some helpful tips, we assume–in part, because we have to–that they’re better able to cope with the problem than we are. Otherwise, our highly evolved society breaks down into islands of incoherence, where we spend our time in poorly informed second-guessing instead of trusting each other.
This would be a good point to look at data on overall trust levels, friendship, civic engagement, etc. (it’s down; it’s all down), and maybe at some explanations for these changes.
Nichols talks briefly about the accreditation and verification process for producing “experts,” which he rather likes. There is an interesting discussion in the economics literature on things like the economics of trust and information (how do websites signal that they are trustworthy enough that you will give them your credit card number and expect to receive items you ordered a few days later?) which could apply here, too.
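To make the signaling idea concrete: one mechanism the modern web actually relies on is third-party vouching, where the site presents a certificate signed by an authority the buyer’s browser already trusts. A minimal sketch in Python (the hostname is just a placeholder, and this illustrates the signaling mechanism, not the economics literature itself):

```python
import socket
import ssl

# A site signals trustworthiness by presenting a certificate vouched
# for by an authority the visitor already trusts (a "trusted third
# party," in the economics-of-information sense).
hostname = "example.com"  # placeholder site
context = ssl.create_default_context()  # loads the trusted CA roots
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()  # handshake fails if the cert is untrusted
print("issued to:", dict(rdn[0] for rdn in cert["subject"]))
print("vouched for by:", dict(rdn[0] for rdn in cert["issuer"]))
```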
Nichols then explores a variety of cognitive biases, such as superstitions, phobias, and conspiracy theories:
Conspiracy theories are also a way for people to give meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty either of an uncaring universe or an incomprehensible deity. …
The only way out of this dilemma is to imagine a world in which our troubles are the fault of powerful people who had it within their power to avert such misery. …
Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them… become especially seductive “in any society that has suffered an epic, collectively felt trauma. In the aftermath, millions of people find themselves casting about for an answer to the ancient question of why bad things happen to good people.” …
Today, conspiracy theories are a reaction mostly to the economic and social dislocations of globalization… This is not a trivial obstacle when it comes to the problems of expert engagement with the public: nearly 30 percent of Americans, for example, think “a secretive elite with a globalist agenda is conspiring to eventually rule the world” …
Obviously stupid. A not-secret elite with a globalist agenda already rules the world.
and 15 percent think media or government add secret mind-controlling technology to TV broadcasts. (Another 15 percent aren’t sure about the TV issue.)
It’s called “advertising” and it wants you to buy a Ford.
Anyway, the problem with conspiracy theories is they are unfalsifiable; no amount of evidence will ever convince a conspiracy theorist that he is wrong, for all evidence is just further proof of how nefariously “they” are constructing the conspiracy.
Then Nichols gets into some interesting matter on the difference between stereotypes and generalizations, which segues nicely into a tangent I’d like to discuss, but it probably deserves its own post. To summarize:
Sometimes experts know things that contradict other people’s political (or religious) beliefs… If an “expert” finding or field accords with established liberal values (e.g., the implicit association test found that “everyone is a little bit racist,” which liberals already believed), then there is an easy mesh between what the academics believe and the rest of their social class.
If their findings contradict conservative/low-class values (e.g., when professors assert that evolution is true and “those low-class Bible-thumpers in Oklahoma are wrong”), sure, they might have a lot of people who disagree with them, but those people aren’t part of their own social class/the upper class, and so not a problem. If anything, high-class folks love such findings, because they give them a chance to talk about how much better they are than those low-class people (though such class conflict is obviously poisonous in a democracy where those low-class people can still vote to Fuck You and Your Global Warming, Too.)
But if the findings contradict high-class/liberal politics (e.g., if that same evolution professor turns around and says, “By the way, race is definitely biologically real, and there are statistical differences in average IQ between the races”), now he’s contradicting the political values of his own class/the upper class, and that becomes a social issue and he is likely to get Watsoned.
For years folks at Fox News (and talk radio) have lambasted “the media” even though they are part of the media; SSC recently discussed “can something be both popular and silenced?”
Jordan Peterson isn’t unpopular or “silenced” so much as he is disliked by upper class folks and liked by “losers” and low class folks, despite the fact that he is basically an intellectual guy and isn’t peddling a low-class product. Likewise, Fox News is just as much part of The Media as NPR, (if anything, it’s much more of the Media) but NPR is higher class than Fox, and Fox doesn’t like feeling like its opinions are being judged along this class axis.
For better or for worse (mostly worse) class politics and political/religious beliefs strongly affect our opinions of “experts,” especially those who say things we disagree with.
But back to Nichols: Dunning-Kruger effect, fake cultural literacy, and too many people at college. Nichols is a professor and has seen college students up close and personal, and has a low opinion of most of them. The massive expansion of higher education has not resulted in a better-educated, smarter populace, he argues, but a populace armed with expensive certificates that show they sat around a college for 4 years without learning much of anything. Unfortunately, beyond a certain level, there isn’t a lot that more school can do to increase people’s basic aptitudes.
Colleges get money by attracting students, which incentivizes them to hand out degrees like candy–in other words, students are being lied to about their abilities, and college degrees are fast becoming participation trophies for the not very bright.
Nichols has little sympathy for modern students:
Today, by contrast, students explode over imagined slights that are not even remotely in the same category as fighting for civil rights or being sent to war. Students now build majestic Everests from the smallest molehills, and they descend into hysteria over pranks and hoaxes. In the midst of it all, the students are learning that emotions and volume can always defeat reason and substance, thus building about themselves fortresses that no future teacher, expert, or intellectual will ever be able to breach.
At Yale in 2015, for example, a house master’s wife had the temerity to tell minority students to ignore Halloween costumes they thought offensive. This provoked a campus-wide temper tantrum that included professors being shouted down by screaming students. “In your position as master,” one student howled in a professor’s face, “it is your job to create a place of comfort and home for the students… Do you understand that?!”
Quietly, the professor said, “No, I don’t agree with that,” and the student unloaded on him:
“Then why the [expletive] did you accept the position?! Who the [expletive] hired you?! You should step down! If that is what you think about being a master you should step down! It is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home here. You are not doing that!” [emphasis added]
Yale, instead of disciplining students in violation of their own norms of academic discourse, apologized to the tantrum throwers. The house master eventually resigned from his residential post…
To faculty everywhere, the lesson was obvious: the campus of a top university is not a place for intellectual exploration. It is a luxury home, rented for four to six years, nine months at a time, by children of the elite who may shout at faculty as if they’re berating clumsy maids in a colonial mansion.
The incident Nichols cites (and similar ones elsewhere) is not just a matter of college students being dumb or entitled; these are explicitly racial conflicts. The demand for “safe spaces” is easy to ridicule on the grounds that students are emotional babies, but this misses the point: students are carving out territory for themselves along explicitly racial lines, often by violence.
Nichols, though, either does not notice the racial aspect of modern campus conflicts or does not want to admit publicly to doing so.
Nichols moves on to blame TV, especially CNN, talk radio, and the internet for dumbing down the quality of discourse by overwhelming us with a deluge of more information than we can possibly process.
Referring back to Auerswald and The Code Economy: if automation creates a bifurcation in industries, replacing a moderately-priced, moderately available product with a stream of cheap, low-quality products on the one hand and a trickle of expensive, high-quality products on the other, then good-quality journalism has been replaced with a flood of low-quality crap. The high-quality end is still working itself out.
Nichols opines:
Accessing the Internet can actually make people dumber than if they had never engaged a subject at all. The very act of searching for information makes people think they’ve learned something, when in fact they’re more likely to be immersed in yet more data they do not understand. …
When a group of experimental psychologists at Yale investigated how people use the internet, they found that “people who search for information on the Web emerge from the process with an inflated sense of how much they know–even regarding topics that are unrelated to the ones they Googled.” …
How can exposure to so much information fail to produce at least some kind of increased baseline of knowledge, if only by electronic osmosis? How can people read so much yet retain so little? The answer is simple: few people are actually reading what they find.
As a University College London (UCL) study found, people don’t actually read the articles they encounter during a search on the Internet. Instead, they glance at the top line or the first few sentences and then move on. Internet users, the researchers noted, “are not reading online in the traditional sense; indeed, there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
The internet’s demand for instant updates, for whatever headlines generate the most clicks (and thus advertising revenue), has upset the balance of speed vs. expertise in the newsroom. Reporters no longer have any incentive to spend long hours carefully writing a well-researched story when such stories pay less than clickbait headlines about racist pet costumes and celebrity tweets.
I realize it seems churlish to complain about the feast of news and information brought to us by the Information Age, but I’m going to complain anyway. Changes in journalism, like the increased access to the Internet and to college education, have unexpectedly corrosive effects on the relationship between laypeople and experts. Instead of making people better informed, much of what passes for news in the twenty-first century often leaves laypeople–and sometimes experts–even more confused and ornery.
Experts face a vexing challenge: there’s more news available, and yet people seem less informed, a trend that goes back at least a quarter century. Paradoxically, it is a problem that is worsening rather than dissipating. …
As long ago as 1990, for example, a study conducted by the Pew Trust warned that disengagement from important public questions was actually worse among people under thirty, the group that should have been most receptive to then-emerging sources of information like cable television and electronic media. This was a distinct change in American civic culture, as the Pew study noted:
“Over most of the past five decades younger members of the public have been at least as well informed as older people. In 1990, that is no longer the case. … “
Those respondents are now themselves middle-aged, and their children are faring no better.
If you were 30 in 1990, you were born in 1960, to parents who were between the ages of 20 and 40 years old, that is, born between 1920 and 1940.
Source: Audacious Epigone
Fertility for the 1920-1940 cohort was strongly dysgenic. So was the 1940-50 cohort. The 1900-1919 cohort at least had the Flynn Effect on their side, but later cohorts just look like an advertisement for idiocracy.
Nichols ends with a plea that voters respect experts (and that experts, in turn, be humble and polite to voters.) After all, modern society is too complicated for any of us to be experts on everything. If we don’t pay attention to expert advice, he warns, modern society is bound to end in ignorant goo.
The logical inconsistency is that Nichols believes in democracy at all–he thinks democracy can be saved if ignorant people vote within a range of options defined by experts like himself, e.g., “What vaccine options are best?” rather than “Should we have vaccines at all?”
The problem, then, is that whoever controls the experts (or controls which expert opinions people hear) controls the limits of policy debates. This leads to people arguing over experts, which leads right back to where we are today. As long as there are politics, “expertise” will be politicized, e.g.:
Look at any court case in which both sides bring in their own “expert” witnesses. Both experts testify to the effect that their side is correct. Then the jury is left to vote on which side had more believable experts. This is close to a best-case scenario for voting, and the fact that the voters are dumb, don’t understand what the experts are saying, and in many cases are obviously being misled is still a huge problem.
If politics is the problem, then perhaps getting rid of politics is the solution. Just have a bunch of Singapores run by Lee Kuan Yews, let folks like Nichols advise them, and let the common people “vote with their feet” by moving to the best states.
The problem with this solution is that “exit” doesn’t exist in the modern world in any meaningful way, and there are significant reasons why ordinary people oppose open borders.
Conclusion: 3/5 stars. It’s not a terrible book, and Nichols has plenty of good points, but “Americans are dumb” isn’t exactly fresh territory and much has already been written on the subject.
The other day on Twitter, Nick B. Steves challenged me to find data supporting or refuting his assertion that Nerds vs. Jocks is a false stereotype, invented around 1975. Of course, we HBDers have a saying–“all stereotypes are true,” even the ones about us–but let’s investigate Nick’s claim and see where it leads us.
(NOTE: If you have relevant data, I’d love to see it.)
Unfortunately, terms like “nerd,” “jock,” and “chad” are not all that well defined. Certainly if we define “jock” as “athletic but not smart” and nerd as “smart but not athletic,” then these are clearly separate categories. But what if there’s a much bigger group of people who are smart and athletic?
Or what if we are defining “nerd” and “jock” too narrowly? Wikipedia defines nerd as, “a person seen as overly intellectual, obsessive, or lacking social skills.” I recall a study–which I cannot find right now–which found that nerds had, overall, lower-than-average IQs, but that study included people who were obsessive about things like comic books, not just people who majored in STEM. Similarly, should we define “jock” only as people who are good at sports, or do passionate sports fans count?
For the sake of this post, I will define “nerd” as “people with high math/science abilities” and “jock” as “people with high athletic abilities,” leaving the matter of social skills undefined. (People who merely like video games or watch sports, therefore, do not count.)
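Put as code, these working definitions look something like the sketch below (the cutoff is arbitrary and purely for illustration, not from any dataset):

```python
# "Nerd" = high math/science ability; "jock" = high athletic ability.
# Nothing in these definitions makes the categories mutually exclusive.
def classify(math_ability: float, athletic_ability: float,
             cutoff: float = 90.0) -> str:
    nerd = math_ability >= cutoff
    jock = athletic_ability >= cutoff
    if nerd and jock:
        return "both"      # the group the strict dichotomy ignores
    if nerd:
        return "nerd"
    if jock:
        return "jock"
    return "neither"

print(classify(95, 93))  # both
print(classify(95, 50))  # nerd
print(classify(50, 95))  # jock
```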
Nick is correct on one count: according to Wikipedia, although the word “nerd” has been around since 1951, it was popularized during the 70s by the sitcom Happy Days. However, Wikipedia also notes that:
An alternate spelling,[10] as nurd or gnurd, also began to appear in the mid-1960s or early 1970s.[11] Author Philip K. Dick claimed to have coined the nurd spelling in 1973, but its first recorded use appeared in a 1965 student publication at Rensselaer Polytechnic Institute.[12][13] Oral tradition there holds that the word is derived from knurd (drunk spelled backward), which was used to describe people who studied rather than partied. The term gnurd (spelled with the “g”) was in use at the Massachusetts Institute of Technology by 1965.[14] The term nurd was also in use at the Massachusetts Institute of Technology as early as 1971 but was used in the context for the proper name of a fictional character in a satirical “news” article.[15]
suggesting that the word was already common among nerds themselves before it was picked up by TV.
But we can trace the nerd-jock dichotomy back before the terms were coined: back in 1921, Lewis Terman, a researcher at Stanford University, began a long-term study of exceptionally high-IQ children, the Genetic Studies of Genius aka the Terman Study of the Gifted:
Terman’s goal was to disprove the then-current belief that gifted children were sickly, socially inept, and not well-rounded.
This belief was especially popular in a little nation known as Germany, where it inspired both taking schoolchildren on long hikes in the woods to keep them fit and the mass extermination of Jews, who were believed to be muddying the German genepool with their weak, sickly, high-IQ genes (and nefariously trying to marry strong, healthy Germans in order to replenish their own defective stock.) It didn’t help that German Jews were both high-IQ and beset by a number of illnesses (probably related to high rates of consanguinity,) but then again, the Gypsies are beset by even more debilitating illnesses, yet no one blames this on all of the fresh air and exercise afforded by their highly mobile lifestyles.
(Just to be thorough, though, the Nazis also exterminated the Gypsies and Hans Asperger’s subjects, despite Asperger’s insistence that they were very clever children who could probably be of great use to the German war effort via code breaking and the like.)
The results of Terman’s study are strongly in Nick’s favor. According to Psychology Today’s account:
His final group of “Termites” averaged a whopping IQ of 151. Following up his group 35 years later, his gifted group at mid-life definitely seemed to conform to his expectations. They were taller, healthier, physically better developed, and socially adept (dispelling the myth at the time of high-IQ awkward nerds).
…the first volume of the study reported data on the children’s family,[17] educational progress,[18] special abilities,[19] interests,[20] play,[21] and personality.[22] He also examined the children’s racial and ethnic heritage.[23] Terman was a proponent of eugenics, although not as radical as many of his contemporary social Darwinists, and believed that intelligence testing could be used as a positive tool to shape society.[3]
Based on data collected in 1921–22, Terman concluded that gifted children suffered no more health problems than normal for their age, save a little more myopia than average. He also found that the children were usually social, were well-adjusted, did better in school, and were even taller than average.[24] A follow-up performed in 1923–1924 found that the children had maintained their high IQs and were still above average overall as a group.
Of course, we can go back even further than Terman–in the early 1800s, allergies like hay fever were associated with the nobility, who of course did not do much vigorous work in the fields.
My impression, based on studies I’ve seen previously, is that athleticism and IQ are positively correlated. That is, smarter people tend to be more athletic, and more athletic people tend to be smarter. There’s a very obvious reason for this: our brains are part of our bodies, people with healthier bodies therefore also have healthier brains, and healthier brains tend to work better.
At the very bottom of the IQ distribution, mentally retarded people tend to also be clumsy, flaccid, or lacking good muscle tone. The same genes (or environmental conditions) that give children terrible health/developmental problems often also affect their brain growth, and conditions that affect their brains also affect their bodies. As we progress from low to average to above-average IQ, we encounter increasingly healthy people.
In most smart people, high IQ doesn’t seem to be a random fluke, a genetic error, or fitness-reducing: in a genetic study of children with exceptionally high IQs, researchers failed to find many genes that specifically endowed the children with genius, but found instead a fortuitous absence of deleterious genes that knock a few points off the rest of us. The same genes that have a negative effect on the nerves and proteins in your brain probably also have a deleterious effect on the nerves and proteins throughout the rest of your body.
On the other hand, the evolutionary standard for “fitness” isn’t strength or longevity, but reproduction, and on this scale the high-IQ don’t seem to do as well:
Controlling for age, physical maturity, and mother’s education, a significant curvilinear relationship between intelligence and coital status was demonstrated; adolescents at the upper and lower ends of the intelligence distribution were less likely to have sex. Higher intelligence was also associated with postponement of the initiation of the full range of partnered sexual activities. … Higher intelligence operates as a protective factor against early sexual activity during adolescence, and lower intelligence, to a point, is a risk factor.
Here we see the issue plainly: males at 120 and 130 IQ are less likely to get laid than clinically retarded men in the 70s and 60s. The right side of the graph holds the “nerds”; the left side, the “jocks.” Of course, the high-IQ females are even less likely to get laid than the high-IQ males, but males tend to judge themselves against other men, not women, when it comes to dating success. Since the low-IQ females are much less likely to get laid than the low-IQ males, this implies that most of these “popular” guys are dating girls who are smarter than themselves–a fact not lost on the nerds, who would also like to date those girls.
In 2001, the MIT/Wellesley magazine Counterpoint (Wellesley is MIT’s “sister school” and the two campuses allow cross-enrollment in each other’s courses) published a sex survey that provides a more detailed picture of nerd virginity:
I’m guessing that computer scientists invented polyamory, and neuroscientists are the chads of STEM. The results are otherwise pretty predictable.
Unfortunately, Counterpoint appears to be defunct due to lack of funding/interest and I can no longer find the original survey, but here is Jason Malloy’s summary from Gene Expression:
By the age of 19, 80% of US males and 75% of women have lost their virginity, and 87% of college students have had sex. But this number appears to be much lower at elite (i.e. more intelligent) colleges. According to the article, only 56% of Princeton undergraduates have had intercourse. At Harvard 59% of the undergraduates are non-virgins, and at MIT, only a slight majority, 51%, have had intercourse. Further, only 65% of MIT graduate students have had sex.
The student surveys at MIT and Wellesley also compared virginity by academic major. The chart for Wellesley displayed below shows that 0% of studio art majors were virgins, but 72% of biology majors were virgins, and 83% of biochem and math majors were virgins! Similarly, at MIT 20% of ‘humanities’ majors were virgins, but 73% of biology majors. (Apparently those most likely to read Darwin are also the least Darwinian!)
How Rolling Stone-ish are the few lucky souls who are doing the horizontal mambo? Well, not very. Considering all the non-virgins on campus, 41% of Wellesley and 32% of MIT students have only had one partner (figure 5). It seems that many Wellesley and MIT students are comfortingly monogamous. Only 9% of those who have gotten it on at MIT have been with more than 10 people and the number is 7% at Wellesley.
Someone needs to find the original study and PUT IT BACK ON THE INTERNET.
But this lack of early sexual success seems to translate into long-term marital happiness, once nerds find “the one.” Lex Fridman’s Divorce Rates by Profession offers a thorough list. The average divorce rate was 16.35%, with a high of 43% (Dancers) and a low of 0% (“Media and communication equipment workers.”)
I’m not sure exactly what all of these jobs are nor exactly which ones should count as STEM (veterinarians? anthropologists?) nor do I know how many people are employed in each field, but I count 49 STEM professions that have lower than average divorce rates (including computer scientists, economists, mathematical scientists, statisticians, engineers, biologists, chemists, aerospace engineers, astronomers and physicists, physicians, and nuclear engineers,) and only 23 with higher than average divorce rates (including electricians, water treatment plant operators, radio and telecommunication installers, broadcast engineers, and similar professions.) The purer sciences obviously had lower rates than the more practical applied tech fields.
The big outliers were mathematicians (19.15%), psychologists (19.26%), and sociologists (23.53%), though I’m not sure they count (if so, there were only 22 professions with higher than average divorce rates.)
I’m not sure which professions count as “jock” or “chad,” but athletes had lower than average rates of divorce (14.05%), as did firefighters, soldiers, and farmers. Financial examiners, hunters, and dancers (presumably an athletic female occupation), however, had very high rates of divorce.
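For anyone who wants to redo the tally, here is the counting exercise in miniature, using only the handful of rates quoted above (the full Lex Fridman list has many more professions):

```python
# Divorce rates (%) quoted in this post, from the Lex Fridman list.
divorce_rates = {
    "dancers": 43.0,
    "sociologists": 23.53,
    "psychologists": 19.26,
    "mathematicians": 19.15,
    "athletes": 14.05,
    "media and communication equipment workers": 0.0,
}
AVERAGE = 16.35  # average across all professions on the list

for job, rate in sorted(divorce_rates.items(), key=lambda kv: kv[1]):
    side = "above" if rate > AVERAGE else "below"
    print(f"{job}: {rate}% ({side} the {AVERAGE}% average)")
```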
According to the survey recently taken by the “infidelity dating website,” Victoria Milan, individuals working in the finance field, such as brokers, bankers, and analysts, are more likely to cheat than those in any other profession. However, following those in finance comes those in the aviation field, healthcare, business, and sports.
With the exception of healthcare and maybe aviation, these are pretty typical Chad occupations, not STEM.
The Mirror has a similar list of jobs where people are most and least likely to be married. Most likely: Dentist, Chief Executive, Sales Engineer, Physician, Podiatrist, Optometrist, Farm product buyer, Precision grinder, Religious worker, Tool and die maker.
Least likely: Paper-hanger, Drilling machine operator, Knitter textile operator, Forge operator, Mail handler, Science technician, Practical nurse, Social welfare clerk, Winding machine operative, Postal clerk.
I struggled to find data on male fertility by profession/education/IQ, but there’s plenty on female fertility, e.g. the deceptively titled High-Fliers have more Babies:
…American women without any form of high-school diploma have a fertility rate of 2.24 children. Among women with a high-school diploma the fertility rate falls to 2.09 and for women with some form of college education it drops to 1.78.
However, among women with college degrees, the economists found the fertility rate rises to 1.88 and among women with advanced degrees to 1.96. In 1980 women who had studied for 16 years or more had a fertility rate of just 1.2.
As the economists prosaically explain: “The relationship between fertility and women’s education in the US has recently become U-shaped.”
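A quick sanity check on the quoted figures confirms the economists’ description (a sketch using only the five rates from the article):

```python
# Fertility rates by education level, as quoted above.
fertility = [
    ("no high-school diploma", 2.24),
    ("high-school diploma",    2.09),
    ("some college",           1.78),  # bottom of the U
    ("college degree",         1.88),
    ("advanced degree",        1.96),
]
rates = [r for _, r in fertility]
bottom = rates.index(min(rates))
falls = all(a > b for a, b in zip(rates[:bottom], rates[1:bottom + 1]))
rises = all(a < b for a, b in zip(rates[bottom:], rates[bottom + 1:]))
print("U-shaped:", falls and rises)  # True
```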
Here is another article about the difference in fertility rates between high and low-IQ women.
But female fertility and male fertility may not be the same–I recall data elsewhere indicating that high-IQ men have more children than low IQ men, which implies those men are having their children with low-IQ women. (For example, while Bill and Hillary seem about matched on IQ, and have only one child, Melania Trump does not seem as intelligent as Trump, who has five children.)
Of the 1,508,874 children born in 1920 in the birth registration area of the United States, occupations of fathers are stated for … 96.9%… The average number of children ever born to the present wives of these occupied fathers is 3.3 and the average number of children living 2.9.
The average number of children ever born ranges from 4.6 for foremen, overseers, and inspectors engaged in the extraction of minerals to 1.8 for soldiers, sailors, and marines. Both of these extreme averages are easily explained, for soldiers, sailors, and marines are usually young, while such foremen, overseers, and inspectors are usually in middle life. For many occupations, however, the ages of the fathers are presumably about the same and differences shown indicate real differences in the size of families. For example, the low figures for dentists (2), architects (2.1), and artists, sculptors, and teachers of art (2.2) are in striking contrast with the figures for mine operatives (4.3), quarry operatives (4.1), bootblacks, and brick and stone masons (each 3.9). …
As a rule the occupations credited with the highest number of children born are also credited with the highest number of children living, the highest number of children living appearing for foremen, overseers, and inspectors engaged in the extraction of minerals (3.9) and for steam and street railroad foremen and overseers (3.8), while if we exclude groups plainly affected by the age of fathers, the highest number of children living appears for mine and quarry operatives (each 3.6).
Obviously the job market was very different in 1920–no one was majoring in computer science. Perhaps some of those folks who became mine and quarry operatives back then would become engineers today–or perhaps not. Here are the average numbers of surviving children for the most obviously STEM professions (remember average for 1920 was 2.9):
I don’t know what paper hangers do, but the Mirror said they were among the least likely to be married, and in 1920, they had an average of 3.1 children–above average.
The Journal-Constitution studied 54 public universities, “including the members of the six major Bowl Championship Series conferences and other schools whose teams finished the 2007-08 season ranked among the football or men’s basketball top 25.”…
Football players average 220 points lower on the SAT than their classmates. Men’s basketball was 227 points lower.
University of Florida won the prize for biggest gap between football players and the student body, with players scoring 346 points lower than their peers.
Georgia Tech had the nation’s best average SAT score for football players, 1028 of a possible 1600, and best average high school GPA, 3.39 of a possible 4.0. But because its student body is apparently very smart, Tech’s football players still scored 315 SAT points lower than their classmates.
UCLA, which has won more NCAA championships in all sports than any other school, had the biggest gap between the average SAT scores of athletes in all sports and its overall student body, at 247 points.
From the original article, which no longer seems to be up on the Journal-Constitution website:
All 53 schools for which football SAT scores were available had at least an 88-point gap between team members’ average score and the average for the student body. …
Football players performed 115 points worse on the SAT than male athletes in other sports.
The differences between athletes’ and non-athletes’ SAT scores were less than half as big for women (73 points) as for men (170).
Many schools routinely used a special admissions process to admit athletes who did not meet the normal entrance requirements. … At Georgia, for instance, 73.5 percent of athletes were special admits compared with 6.6 percent of the student body as a whole.
On the other hand, as Discover Magazine discusses in “The Brain: Why Athletes are Geniuses,” athletic tasks–like catching a fly ball or slapping a hockey puck–require exceptionally fast and accurate brain signals to trigger the correct muscle movements.
Ryan Stegal studied the GPAs of high school student athletes vs. non-athletes and found that the athletes had higher average GPAs than the non-athletes, but he also notes that the athletes were required to meet certain minimum GPA requirements in order to play.
But within athletics, it looks like the smarter athletes perform better than dumber ones, which is why the NFL uses the Wonderlic Intelligence Test:
NFL draft picks have taken the Wonderlic test for years because team owners need to know if their million dollar player has the cognitive skills to be a star on the field.
What does the NFL know about hiring that most companies don’t? They know that regardless of the position, proof of intelligence plays a profound role in the success of every individual on the team. It’s not enough to have physical ability. The coaches understand that players have to be smart and think quickly to succeed on the field, and the closer they are to the ball the smarter they need to be. That’s why every potential draft pick takes the Wonderlic Personnel Test at the combine to prove he does–or doesn’t–have the brains to win the game. …
The first use of the WPT in the NFL was by Tom Landry of the Dallas Cowboys in the early 70s, who took a scientific approach to finding players. He believed players who could use their minds where it counted had a strategic advantage over the other teams. He was right, and the test has been used at the combine ever since.
For the NFL, years of testing shows that the higher a player scores on the Wonderlic, the more likely he is to be in the starting lineup—for any position. “There is no other reasonable explanation for the difference in test scores between starting players and those that sit on the bench,” Callans says. “Intelligence plays a role in how well they play the game.”
A large study conducted at the Sahlgrenska Academy and Sahlgrenska University Hospital in Gothenburg, Sweden, reveals that young adults who regularly exercise have higher IQ scores and are more likely to go on to university.
The study was published in the Proceedings of the National Academy of Sciences (PNAS), and involved more than 1.2 million Swedish men. The men were performing military service and were born between the years 1950 and 1976. Both their physical and IQ test scores were reviewed by the research team. …
The researchers also looked at data for twins and determined that primarily environmental factors are responsible for the association between IQ and fitness, and not genetic makeup. “We have also shown that those youngsters who improve their physical fitness between the ages of 15 and 18 increase their cognitive performance.”…
I have seen similar studies before, some involving mice and some, IIRC, the elderly. It appears that exercise is probably good for you.
I have a few more studies I’d like to mention quickly before moving on to discussion.
Overall, it looks like smarter people are more athletic, more athletic people are smarter, smarter athletes are better athletes, and exercise may make you smarter. For most people, the nerd/jock dichotomy is wrong.
However, there is very little overlap at the very highest end of the athletic and intelligence curves–most college (and thus professional) athletes are less intelligent than the average college student, and most college students are less athletic than the average college (and professional) athlete.
Additionally, while people with STEM degrees make excellent spouses (except for mathematicians, apparently,) their reproductive success is below average: they have sex later than their peers and, as far as the data I’ve been able to find shows, have fewer children.
Stephen Hawking
Even if there is a large overlap between smart people and athletes, they are still separate categories selecting for different things: a cripple can still be a genius, but can’t play football; a dumb person can play sports, but not do well at math. Stephen Hawking can barely move, but he’s still one of the smartest people in the world. So the set of all smart people will always include more “stereotypical nerds” than the set of all athletes, and the set of all athletes will always include more “stereotypical jocks” than the set of all smart people.
In my experience, nerds aren’t socially awkward (aside from their shyness around women.) The myth that they are stems from the fact that they have different interests and communicate in a different way than non-nerds. Let nerds talk to other nerds, and they are perfectly normal, communicative, socially functional people. Put them in a room full of non-nerds, and suddenly the nerds are “awkward.”
Unfortunately, the vast majority of people are not nerds, so many nerds have to spend the majority of their time in the company of lots of people who are very different from themselves. By contrast, very few people of normal IQ and interests ever have to spend time surrounded by the very small population of nerds. If you did put them in a room full of nerds, however, you’d find that suddenly they don’t fit in. The perception that nerds are socially awkward is therefore just normie bias.
Why did the nerd/jock dichotomy become so popular in the 70s? Probably in part because science and technology were really taking off as fields normal people could aspire to major in: man had just landed on the moon, and the Intel 4004 was released in 1971. Very few people went to college or were employed in the sciences back in 1920; by 1970, colleges were everywhere and science was booming.
And at the same time, colleges and high schools were ramping up their athletics programs. I’d wager that the average school in the 1800s had neither PE nor athletics of any sort. To find those, you’d probably have to attend private academies like Andover or Exeter. By the 70s, though, schools were taking their athletics programs–even athletic recruitment–seriously.
How strong you felt the dichotomy probably depends on the nature of your school. I have attended schools where all of the students were fairly smart and there was no anti-nerd sentiment, and I have attended schools where my classmates were fiercely anti-nerd and made sure I knew it.
But the dichotomy predates the terminology. Take Superman, who first appeared in 1938. His disguise is a pair of glasses, because no one can believe that the bookish, mild-mannered Clark Kent is actually the super-strong Superman. Batman is based on the character of El Zorro, created in 1919. Zorro is an effete, weak, foppish nobleman by day and a dashing, sword-fighting hero of the poor by night. Of course these characters are both smart and athletic, but their disguises only work because others do not expect them to be. As fantasies, the characters are powerful because they provide a vehicle for our own desires: for our everyday normal failings to be just a cover for how secretly amazing we are.
But for the most part, most smart people are perfectly fit, healthy, and coordinated–even the ones who like math.
Evergreen State is the sort of small-potatoes college that I don’t normally focus on in my regular Cathedral Round-Ups. It accepts 98% of applicants–1,707 out of 1,744 in a recent year–with an average SAT score of 1084. According to Pumpkin Person’s conversion table, this works out to an average student IQ of about 112, below the threshold often cited as necessary to really benefit from college-level instruction.
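Conversion tables like this generally work by equating z-scores against general-population norms rather than test-taker norms. Here is a minimal sketch of that idea–note that the reference mean and SD below are placeholders I picked so the output lands on the ~112 figure, not Pumpkin Person’s actual values:

```python
# Hypothetical SAT-to-IQ conversion by equating z-scores.
# SAT_POP_MEAN and SAT_POP_SD are illustrative placeholders, chosen so the
# result matches the ~112 figure above; they are NOT Pumpkin Person's numbers.

SAT_POP_MEAN = 920   # assumed general-population mean on this form of the SAT
SAT_POP_SD = 205     # assumed general-population standard deviation
IQ_MEAN, IQ_SD = 100, 15

def sat_to_iq(sat_score):
    """Map an SAT score to an IQ estimate by matching z-scores."""
    z = (sat_score - SAT_POP_MEAN) / SAT_POP_SD
    return IQ_MEAN + z * IQ_SD

print(round(sat_to_iq(1084)))  # -> 112
```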
You have probably already heard about the recent protests at Evergreen State, in which students have gone completely insane in response to a professor objecting to segregation. Here is a decent article, though by the time this posts there will probably be a variety of new developments.
The students themselves are morally repugnant, but it is unsurprising that sometimes people say and do stupid things. Like terrorist “incidents,” the spectacle of leftist students turning on their professors and trying to destroy their lives is now routine, surprising no one but the professors themselves, who until the attack descended saw themselves as good leftists.
The left’s power to destroy their own depends on their cultish claim to a monopoly on morality. To be liberal is to be a “good person,” an identity people cling to even as they are attacked and their lives destroyed by “their own side.” The entire construct is built on the desire to not be racist, America’s “Original Sin,” and thus attacks hinge on claims that the professor actually is racist.
All of these attacks would stop, of course, if universities simply declared that they don’t care if professors are racist or not. After all, students regularly protest over matters like cafeteria meal plans or housing, but universities ignore these protests and they die quickly. Universities don’t care if you like their food, but they are deeply invested in leftist ideology and its promotion.
These protests aren’t motivated by anything a normal person would call “racism”–leftist professors are pretty good at avoiding anything that looks like conventional racism–but by bad allyship.
An Ally, in SJW-speak, is a “privileged” person who has dedicated themselves to helping the “unprivileged.” For example, a straight person might be an LGBT ally or a white person might be a black ally.
In politics, allies work together for their mutual benefit, typically against a common enemy. An alliance between the US and Britain or Germany and Japan is supposed to help both countries. An alliance between whites and blacks would therefore be to the mutual benefit of both parties. Whites would defend blacks from harm, and blacks would defend whites from harm. Neither group would attack each other.
But “white allies” are not working for the benefit of white people. They’re working against their own self-interest. This is where the whole matter breaks down, because privilege theory teaches that whites, as a whole, have benefited from the oppression of black (and brown) people. The promotion of white interests is therefore in direct opposition to the promotion of black interests.
The standard advice to would-be white allies runs along these lines:
Don’t protest that you know more about racism or fighting racism than they do.
Leave black people alone and don’t take over their events and spaces.
This is all perfectly sensible if you are a black person trying to promote black interests, but not particularly in the interests of anyone who isn’t black.
Professor Weinstein objected to a “Day of Absence” in which SJWs wanted all white people to stay off campus for the day, leaving the space solely for POC enjoyment. (As though universities were some kind of social hall and not money-making businesses.) Weinstein saw this as forced segregation aimed at himself, in a place where he is, after all, not merely socializing but trying to earn a living. Of course the “Day of Absence” is being portrayed as “entirely voluntary,” yet somehow refusal to take part has been met with screaming protests, violence, and a general campus shutdown.
Weinstein’s version of fighting racism involves treating all people equally, not harming people like himself. The protesters’ version requires whites to give up their own self-interest in order to benefit others–indeed, anti-racists call for the abolition of “whiteness” entirely. But of course this is not an alliance, which is why “allies” are never treated as such, but with barely concealed hatred and disdain. Weinstein’s desire not to be segregated solely because of his race is so shocking to these people that they have responded by violently hunting him down and driving him off campus.
Edited to avoid confusion–did not mean to imply that 112 IQ is stupid, though many Evergreen students clearly are.
A commentator last month asked if universities do anything good, so I thought I would begin this month’s Cathedral Round-Up by searching for some good news.
More than 13 million pain-blocking epidural procedures are performed every year in the United States. Although epidurals are generally regarded as safe, there are complications in up to 10 percent of cases, in which the needles are inserted too far or placed in the wrong tissue.
A team of researchers from MIT and Massachusetts General Hospital hopes to improve those numbers with a new sensor that can be embedded into an epidural needle, helping anesthesia doctors guide the needle to the correct location.
Since inserting a giant needle into your spine is really freaky, but going through natural childbirth is hideously painful, I strongly support this kind of research.
More than half of Americans under the age of 25 who have a bachelor’s degree are either unemployed or underemployed. According to The Christian Science Monitor, nearly 1 percent of bartenders and 14 percent of parking lot attendants have a bachelor’s degree.
Adding additional degrees is no guarantee of employment either. According to a recent Urban Institute report, nearly 300,000 Americans with master’s degrees and over 30,000 with doctorates are on public relief. …
Unless you have a “hard” skill, such as a mastery of accounting, or a vocational certificate (e.g., in teaching), your liberal arts education generally will not equip you with the skill set that an employer will need.
Obviously colleges still do some good things. Much of the research I cite on this blog originated at a college of some sort. And of course, if you are careful and forward-thinking, you can use college to obtain useful skills/information.
But between the years, money, and effort students spend, not to mention the absurd political indoctrination, college is probably a net negative for most students.
A few doctors in the 1400s probably saved the lives of their patients, but far more killed them.
Candy Crush, Bejeweled, Farmville, and many other games are exceedingly dumb ways to pass your time–and yet, chances are you’ve played some version of them anyway. People have, collectively, spent millions of potentially-productive hours on such games. Even more amazingly, people have spent millions of dollars in actual money on these games.
These games work because they’re addictive. Click the screen a few times, and corn appears! Wow! So you click the screen again, hoping more corn will appear. But as you “progress” through the game, each level becomes harder, takes longer, or requires more clicks. Next thing you know, you’re pulling out your phone at family functions to check on your fake corn instead of socializing with your cousins, or getting mugged on the subway because you were too busy swiping candies to pay attention to your surroundings.
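The mechanic is trivially simple to sketch. Here’s a toy version of the escalating-effort loop (all the numbers are invented for illustration):

```python
# Toy sketch of the escalating-effort loop behind Farmville-style games.
# All numbers are invented for illustration.

clicks_needed = 3  # the first reward comes almost immediately
total_clicks = 0

for level in range(1, 6):
    total_clicks += clicks_needed
    print(f"Level {level}: {clicks_needed} clicks for your corn "
          f"({total_clicks} clicks invested so far)")
    clicks_needed = int(clicks_needed * 1.8)  # each level demands ~80% more effort
```

The first corn is nearly free; by level five you’ve invested dozens of clicks–which is exactly why putting the phone down starts to feel like throwing that investment away.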
Our career tracks have become far too similar to these games.
I had the luck to catch up with a friend recently during a rare moment of down time. Way back in highschool, she decided to dedicate her life to one of those careers that shows a true commitment to helping others. Her adulthood, so far: 4 years of college; 4 years of grad school; 4 years of training; 2 years in a specialization program. By the time she has any hope of even being geographically settled instead of moving every few years, assuming she can get a job that will let her settle, she’ll be in her mid to late 30s. By the time she’s paid off her education debt, she’ll be in her 50s. Whether she wants kids or not, the question is practically moot.
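The arithmetic is easy to check. A quick sketch, assuming she started college at 18 (the stage lengths are from her story above; the extra moving years are my guess):

```python
# Quick check of the career timeline above, assuming college starts at 18.
stages = {"college": 4, "grad school": 4, "training": 4, "specialization": 2}

age = 18 + sum(stages.values())
print(f"Age when the specialization ends: {age}")          # 32
print(f"Age after a few more years of moving: {age + 4}")  # mid-to-late 30s
```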
It’s like the Farmville of real life, only instead of crops, you harvest degrees and grants and papers and fellowships.
Why pursue such a track? Yes, obviously, because she’s passionately committed to helping others, which is what she does. But also because our system requires and rewards such behavior.
There is absolutely no damn reason a JD or MD requires 4 years of college in addition to the programs themselves. There is no damn reason not to expedite a new doctor or lawyer or scientist or pretty much anyone else’s path to geographic and income stability.
When we ask why smart people don’t have more children, a big reason is that smart people are up to their eyeballs in debt, working 12 (or 24!) hour days, and constantly moving in hopes of finally getting enough points on their resumes to score a permanent job.
Fuck, people struggle just to get volunteer jobs.
Meanwhile, compare our friend to an Amish farmer. The work is hard. Back-breaking, sweaty, sometimes disgusting. If you’re unlucky, you could get trampled by a cow or something.
But there are no degrees. You don’t have to go to school to learn how to milk a cow and plow a field; your parents taught you that. There’s very little in the way of career advancement. You’ve been doing farm labor since you were four or so, and you’re likely to continue doing it until you die. You know you’ll probably have a job next year, how much money your crops will bring in, and if you need a new barn, your family will probably pitch in and help you out.
And the Amish have a lot of children. According to Wikipedia, there were 5,000 Amish in 1920 and 290,000 in 2014–and that’s not counting all of the ex-Amish who’ve left the faith over the years.
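Those two numbers imply remarkable compound growth. A quick back-of-the-envelope check, using only the figures cited above:

```python
import math

# Implied Amish population growth rate from the two Wikipedia figures above.
pop_1920, pop_2014 = 5_000, 290_000
years = 2014 - 1920

annual_rate = (pop_2014 / pop_1920) ** (1 / years) - 1
doubling_time = math.log(2) / math.log(1 + annual_rate)

print(f"Implied growth: {annual_rate:.1%} per year")                   # ~4.4%
print(f"Population doubles roughly every {doubling_time:.0f} years")   # ~16
```

That works out to the population doubling roughly every sixteen years–without a single degree, grant, or fellowship involved.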
The same is true for people who aren’t Amish, but who face similarly limited career opportunities. If you can’t advance, you focus your energies elsewhere. If your phone dies because you forgot to charge it, you might be forced to actually interact with the people around you or read a goddamn book for a change.
I like having doctors. I like scientists. I can even stomach the thought of having some lawyers for certain purposes, like helping people fill out their wills. But we have to expedite the process.