Like the uncollapsed quantum state holding Schrödinger’s cat in a state of simultaneous life and death, whether a school is “teaching critical race theory” or not seems to depend entirely on whether the inquiring person wants them to. Are you anti-CRT? Then, you may rest assured, American schools most certainly aren’t teaching CRT. (If you press a bit and ask why the district has cancelled all of the advanced math classes in the name of “equity,” you’ll be politely informed that this, “Isn’t CRT,” and, further, that you are, “Full of hate. So, so full of hate.”) On the other hand, if you are in favor of CRT, then you will be heartened to know that the schools definitely are teaching CRT.
The National Education Association (NEA) is, according to Wikipedia,
“the largest labor union and the largest white-collar representative in the United States. It represents public schoolteachers and other support personnel, faculty and staffers at colleges and universities, retired educators, and college students preparing to become teachers. The NEA has just under 2.3 million members and is headquartered in Washington, D.C. The NEA had a budget of more than $341 million for the 2012–2013 fiscal year.”
The NEA has been hard at work at their annual meeting this summer, passing (among doubtless many other important union matters) the alluringly named New Business Item 39:
The NEA will, with guidance on implementation from the NEA president and chairs of the Ethnic Minority Affairs Caucuses:
A. Share and publicize, through existing channels, information already available on critical race theory (CRT) — what it is and what it is not; have a team of staffers for members who want to learn more and fight back against anti-CRT rhetoric; and share information with other NEA members as well as their community members.
B. Provide an already-created, in-depth, study that critiques empire, white supremacy, anti-Blackness, anti-Indigeneity, racism, patriarchy, cisheteropatriarchy, capitalism, ableism, anthropocentrism, and other forms of power and oppression at the intersections of our society, and that we oppose attempts to ban critical race theory and/or The 1619 Project.
It goes on, but the grammar here is so atrocious that I had to pause to double-check what, exactly, the nation’s largest union of educators had written. This is a complicated sentence, given the nested nature of the resolution’s clauses, but we can simplify it by only looking at subjects, verbs, and parts that make no sense at all:
“The NEA will… share and publicize… Have a team… and share information… Provide (a study)… and that we oppose attempts to ban critical race theory and/or The 1619 Project.”
Absolutely pathetic. I might just be a mom, but at least I have a grasp of basic grammar. These people are teachers.
I like the inclusion of anthropocentrism here. It’s good to remind children that when dogs are allowed to pee on random trees, but they aren’t, this is speciesism, and speciesism is evil. True equality will not have been achieved until children and dogs are treated equally.
But let’s go on:
C. Publicly (through existing media) convey its support for the accurate and honest teaching of social studies topics, including truthful and age-appropriate accountings of unpleasant aspects of American history, such as slavery, and the oppression and discrimination of Indigenous, Black, Brown, and other peoples of color, as well as the continued impact this history has on our current society.
You might have thought that the purpose of school was to equip children with the skills they’ll need in adulthood, but it’s actually to make children sad.
The Association will further convey that in teaching these topics, it is reasonable and appropriate for curriculum to be informed by academic frameworks for understanding and interpreting the impact of the past on current society, including critical race theory.
Ah, yes, academic frameworks. You see, whether you’re busy teaching kindergarteners their ABCs or trying to help the whopping 28% of 12th graders who still can’t even read at a basic level, it’s important to make sure you’re using college-level academic frameworks for the concepts you introduce to your students. Supposedly the people who wrote this, or at least who voted on it, are “real teachers” who have totally interacted with “real children” and understand the meaning of the phrase “age appropriate instruction,” and aren’t just trying to shoehorn their political beliefs into an utterly inappropriate context.
D. Join with Black Lives Matter at School and the Zinn Education Project to call for a rally this year on October 14—George Floyd’s birthday—as a national day of action to teach lessons about structural racism and oppression. Followed by one day of action that recognize and honor lives taken such as Breonna Taylor, Philando Castile, and others. [Sic]
Aside from being entirely inappropriate, this is grammatically pathetic.
USA’s economy/social order is built on interactions between different cultures/races.
“Don’t worry, Evie,” they said, “Cultural Marxism isn’t hiding under the bed, waiting to eat your fingers, because Cultural Marxism isn’t real.”
Oh, my sweet readers, Cultural Marxism is real, very real, and the only reason it isn’t hiding under your bed is that it’s busy reshaping the old Marxist argument, that a society’s structure and economy are determined by its economic system, into a new one: that they’re determined by its racial system.
To deny opportunities to teach truth about Black, Brown, and other marginalized races minimalizes the necessity for students to build efficacy.
I think this sentence is grammatical; it just sounds like schizophrenic word salad and actually says the opposite of what it is supposed to. Simplified, it reads: “Denying opportunities… minimizes the need for students to become more effective.” Not needing to be more effective is a good thing: it means that students are already just as effective as they need to be.
I think they wanted to say, “Denying opportunities… minimizes the opportunities for students to become more effective.” Look over your work before you send it out, people. If necessary, get a friend to edit your work for you; they’ll probably catch mistakes you overlooked.
The ancient African proverb says, “Know Thyself.”
This is just thrown in randomly, at the end of the paragraph, with no context. A Turkish proverb says, “Those who want yogurt in winter must carry a cow in their pocket,” and an Arabic proverb says, “Someone who can’t dance says the ground is sloping.”
“Know thyself” is, incidentally, also a Greek aphorism; it was inscribed in the forecourt of the Temple of Apollo at Delphi:
“But I have no leisure for them at all; and the reason, my friend, is this: I am not yet able, as the Delphic inscription has it, to know myself; so it seems to me ridiculous, when I do not yet know that, to investigate irrelevant things.”–Socrates, Phaedrus.
Of course, “know thyself” is short and straightforward enough that it is probably a bit of wisdom given in many cultures.
And finally, the money:
This item cannot be accomplished with current staff and resources under the proposed Modified 2021-2022 Strategic Plan and Budget. It would cost an additional $127,600.
Someone will be well-paid for this grift.
At least you may take some comfort, my reader, that the quantum state of CRT in the schools has collapsed: the teachers’ union has voted unambiguously in its favor.
Historian and philosopher Emma Houston interrogates the science of electricity and light bulbs
The light switch was at the center of Houston’s first big foray into the history of electricity and lights. The story told in introductory electrical engineering textbooks is relatively simple: flipping the switch up turns the lights “on”; flipping the switch down turns the lights “off.” Whether a switch is in the on or off position has for decades been seen as an expression of a light bulb’s “true” state or of “light itself.” It is the job of a science historian to discover where these stories come from, and why.
Houston’s doctoral dissertation, published in 2004 as Light Itself: The Search for On and Off in the Electric Circuit, does just this, tracing the history of the idea that electromagnetic radiation is turned “on” and “off” by switches found on the wall. Early in the twentieth century, she shows, it was controversial to refer to “light switches” because sometimes electricians accidentally wired around them when installing lights.
But the fact that switches are visible (unlike electricity) made them useful enough to two groups of engineers–those building electrical circuits, and those working to untangle the role of electricity in light generation–that the association between switches and light solidified for decades.
Associating “light” with the “light switch,” writes Houston, has serious consequences, as when engineers tried to develop a “super flashlight” that used two light switches and multiple batteries.
The “super flashlight” was finally abandoned in the development stage when engineers decided it was simpler to use bigger batteries, but in Light Itself, Houston argues that it made the light switch the star of electrical engineering in a way that still reverberates. She points to engineers like Professor Book, whose research focused for decades on using light switches to design home lighting plans. Such a focus was not inevitable, Houston argues: from the 1920s through the 50s, based on evidence in lasers, researchers saw buttons as drivers of light output.
It turns out that “light switches” do not actually cause bulbs to emit electromagnetic radiation. Engineers now understand that light, produced by incandescent bulbs as well as LEDs and compact fluorescents, is the result of numerous interconnected capacitors, resistors, power sources, and wire circuits that all work together. So-called “light switches” do not cause light at all–they merely open and close light circuits, allowing electricity to flow (or not) to the bulbs.
But in an interview, Professor Book disagrees with Houston’s account. In Houston’s history, the super flashlight looms large in later researchers’ decision to focus on the switch, but Professor Book responds that research on the super flashlight “did not interest me, it did not impress me, it did not look like the foundations of a path forward.” Building circuits around the light switch, he says, was not inspired by the popular image of a super bright flashlight with two switches, but “was simply the easiest way to design practical lighting for people’s houses” and that “flipping the switch does actually turn the lights on and off.”
Houston responds that of course we can’t expect actual engineers to know what inspired them or their fields, which is why we need science historians like herself to suss out what was really motivating them.
Author’s note: Professor Houston has degrees in philosophy and literature, but oddly, none in engineering or physics.
A virulent strain of antifeminism is thriving online that treats women’s empowerment as a mortal threat to men and to the integrity of Western civilization. Its proponents cite ancient Greek and Latin texts to support their claims―arguing that they articulate a model of masculinity that sustained generations but is now under siege.
Donna Zuckerberg dives deep into the virtual communities of the far right, where men lament their loss of power and privilege and strategize about how to reclaim them. She finds, mixed in with weightlifting tips and misogynistic vitriol, the words of the Stoics deployed to support an ideal vision of masculine life. On other sites, pickup artists quote Ovid’s Ars Amatoria to justify ignoring women’s boundaries. By appropriating the Classics, these men lend a veneer of intellectual authority and ancient wisdom to their project of patriarchal white supremacy. In defense or retaliation, feminists have also taken up the Classics online, to counter the sanctioning of violence against women.
“If someone were to put a proposition before men bidding them to choose, after examination, the best customs in the world, each nation would certainly select its own”–Herodotus
Translation: “I read a blog and I didn’t like it.”
So Donna Zuckerberg, a white woman with enough wealth and leisure to study the classics for a living and sister of one of the richest, most powerful men in the world (who also loves the classics so much that he has named his daughters “Maxima,” Latin for “greatest,”* and “August,” after Emperor Augustus,) is complaining that Losers on the Internet are sullying the Classics by quoting Ovid.
This is a problem because White Men on the Internet are Privileged (even when they are poor whites who struggle to get a job or even friends,) while rich white women like Donna are the Oppressed.
*(Maxima is also named after two relatives named “Max,” though if honoring relatives were the only motive, Zuck could have gone with “Maxine,” or named her after a female relative.)
Realistically, these men aren’t a threat to Mrs. Zuckerberg; they aren’t going to rise up and force her back into the kitchen, barefoot and pregnant. They are, however, icky, and Donna obviously doesn’t like them impinging on her turf: “By appropriating the Classics, these men lend a veneer of intellectual authority and ancient wisdom to their project of patriarchal white supremacy.”
Appropriating from whom? What culture owns Ovid and Homer? These books are considered the foundation of all of Western Civilization. Is Heartiste not a part of Western Civilization? I suppose you could argue that Roosh is Iranian/Armenian by blood, (despite being born in the US,) but arguing that Roosh can’t enjoy Ovid because he’s Iranian is, well, stupid.
I understand that Mrs. Zuckerberg doesn’t like pickup blogs, but you can’t appropriate the intellectual and literary foundations of your own culture. This is like accusing a Hindu of appropriating the Bhagavad Gita just because he’s a jerk.
‘Mask of Agamemnon’, discovered by Schliemann, 1876. “They sent forth men to battle, But no such men return; And home, to claim their welcome, Come ashes in an urn” Aeschylus, Agamemnon
The implication of “appropriating” is that Donna thinks the classics belong to some narrow class of people–most likely, academic dilettantes like herself. But as I’ve noted before, Donna Zuckerberg doesn’t own the Classics. Being rich doesn’t give her any more right to quote Plato than anyone else in the entire damn world.
But my complaints aside, I think this nicely illustrates a difficulty found in many academic disciplines:
It’s very difficult to make any new arguments about the Classics. Ovid has been around for a long time. So has Homer. Everything you can say about them has probably been said a thousand times already.
Schliemann managed to up the ante by actually finding Troy, but what’s left to discover? You will never be as great as Schliemann. You will always toil in the shadows of the greats of the past.
But there are rules in academia, most notably, “Publish or perish.” If you want to be a professor or otherwise taken seriously as an academic, you’ve got to publish papers.
What, exactly, are you going to publish on a subject that was thoroughly mined for all new ideas and concepts hundreds of years ago?
Are we to believe the Egyptians managed to manufacture pigment from calcium copper silicate and use it in these elaborate paintings without being able to see it?
So I see two options:
1. Lie. Just make something up, like “the ancients couldn’t see blue.” Totally untrue, but people have bought it, hook, line, and sinker.
2. Write things that aren’t new and don’t provide any new insights, but show that you are a member of the “classics community.”
We think of academic disciplines as “producing knowledge,” but it may be more accurate to think of them as “knowledge communities.” To be part of those communities, all you have to do is produce works that show what a good community member you are. People who fit in get friends, mentors, promotions, and opportunities. People who don’t fit in either get pushed out or leave of their own accord. There’s not much new to say about the Classics, but there are plenty of people who enjoy reading the classics and discussing them with others–and that makes a community, and where there’s a community, people will try to protect what is culturally “theirs.” Folks like Roosh and Heartiste, then, are moving in on academic territory.
What counts as being a “good member” of your community depends on the current social norms in that community. If your community is full of people who say things like “The Classics are the foundation for the greatness of Western Civilization,” then aspirant community members will publish things echoing that.
And if your community is full of people who say things like “If your feminism isn’t intersectional, it’s bullshit,” then you’re going to write things like that.
Herodotus’s World “After all, no one is stupid enough to prefer war to peace; in peace sons bury their fathers and in war fathers bury their sons.” –Herodotus
Modern academia is not really comfortable with “Dead white males”* (much less “Alive white males,”) nor with the idea of Western Civilization as anything particularly special or qualitatively different from other civilizations–which creates a bit of a conflict when your field is literally the semi-symbolic and literary basis of Western Civilization.
*Note: most people who study the classics know that the “Classical World” is really the circum-Mediterranean world, that Herodotus lived in now-Turkey, St. Augustine was born in now-Algeria, Alexander the Great’s empire stretched to India, etc. Whether these men were “white” (or men) is irrelevant to our understanding of the foundations of Western Civilization.
Now, I understand not liking everyone you meet on the internet. There are lots of wrong and terrible people in here. But this is why you get a blog where you can complain to the five people who can stand you about all of the other annoying people on the internet.
There are probably many academic disciplines which could, at this point, be transformed into blogs and tumblrs without much loss.
It has been an open secret for quite some time (at least since my childhood) that prestigious colleges like Harvard, Yale, Princeton, and Stanford discriminate against Asian applicants for the simple reason that they “score too high” and “if we took all of the qualified Asian applicants, we wouldn’t have room for other minorities.” (As far as I know, Caltech is the only famous school that doesn’t discriminate.)
As usual, the Asians just sucked it up and worked harder, but it only seemed like a matter of time before the Tiger Moms decided that “enough is enough”–hence the lawsuit.
Harvard’s official excuse is “Asians are boring,” which is utter bullshit; some of the most interesting people I know are Asian. From the NYT:
Harvard has testified that race, when considered in admissions, can only help, not hurt, a student’s chances of getting in.
This graph is a little tricky to understand. It shows the percent of each race’s applicants admitted to Harvard, sorted by academic ranking. So 58% of black applicants with the highest academic ranking–folks with perfect SATs and GPAs–were admitted, while only 12% of Asian applicants with identical SATs and GPAs were admitted. (For some reason, Harvard takes some percentage of students who aren’t really academically stellar, even though it receives plenty of top-tier applications.)
Vox managed to admit how much highly prestigious colleges hate Asians: they get 140 points deducted from their SATs, while Hispanics receive a 130-point bonus and blacks a 310-point bonus. (Note: old data, but the situation hasn’t changed much.)
Harvard consistently rated Asian-American applicants lower than other races on traits like likability, kindness and “positive personality”.
We need a word for this. I’m calling it “optimist privilege.” It’s time to stop optimists from oppressing the pessimists.
The pessimists are more likely than optimists to be correct, anyway.
Asian-Americans currently comprise 19% of admitted students at Harvard; if evaluated fairly, based on extra-curriculars + academics, they’d be 29%, and if admitted on pure academic merit, they’d be 43%. (Unsurprisingly, this is exactly the percentage of Asian students at Caltech, which does take students on merit.)
Timofey Pnin on Twitter calculates an even higher Asian acceptance rate if Harvard picked only from its top academic performers–51.7%
Now, many people–such as former defender of liberty, the ACLU–believe that ending Affirmative Action at Harvard would “primarily benefit white students” (the horror! We wouldn’t want to accidentally help white people in the process of being fair to Asians,) but by Timofey Pnin’s data, white admission rates would actually fall by 6%.
Unfortunately for Harvard, ending Affirmative Action would drop their black and Hispanic shares to a nearly invisible 0.9% and 2.7%, respectively. Admissions, as currently practiced, is a zero-sum game: making room for more Asians means admitting fewer of some other group.
Make no mistake, while the lawsuit is aimed explicitly at Harvard, all of the top schools do it. I wouldn’t be surprised if there were community colleges discriminating against Asians.
It’s easy to imagine a scenario where colleges are caught between a ruling that they have to take Asians in proportion to their academic rankings and a ruling that they have to take blacks and Hispanics in proportion to their population demographics.
(Of course, the biggest affirmative action boost is given to legacies, 33.6% of whom Harvard admits, and jocks [86% acceptance rate for “recruited athletes”].)
To those confused about why Harvard would bother taking anyone who isn’t in the top decile of academic performance–their bottom decile students are rather mediocre–the answer is that Harvard’s goal isn’t to educate the smartest kids in the nation. (That’s Caltech’s goal.) Harvard’s goal is to educate the future leaders of America, and those future leaders aren’t 50% Asian. (Harvard probably likes to flatter itself that it is enhancing those future leaders, but mostly it is attaching its brand name to successful people in order to get free advertising to boost its prestige, rather like companies offering endorsement deals to racecar drivers. It’s not Verizon that made Will Power win the Indianapolis 500, after all–awesome name, btw. Not only does Will have will power, he’s got wheel power. *badum tish*)
Even if Blacks, Native Americans, and Hispanics score abysmally on the SAT and ACT, some of them will go on to be major leaders, movers and shakers. (Though trends for Native Americans and Pacific Islanders are rather worrying.) Asians, meanwhile, continue to blow everyone else out of the water (there may be some merit to the argument that test scores should be adjusted to account for test prep, which Asians invest in heavily.)
I don’t know how the case will turn out. Perhaps the courts will realize the issue with colleges having to take applicants based on actual qualifications–or perhaps they will decide that blatant discrimination by an institution that receives tons of public funding is a violation of the 14th amendment and the Civil Rights Act.
Personally, I don’t care whether Harvard or Yale continues educating the “future leaders of America and the World,” but I do feel loyal to my Asian friends and desire that they be treated fairly and justly. In general, I think college admissions should be based entirely on academic merit, as any other standards simply skew the system toward those most inclined to cheat and game the system–and the system, as it stands, puts horrible and worthless pressure on high-achieving high school students while delivering them very little in return.
I almost feel sad for Senator Warren. One day, a little girl looked in the mirror, saw pale skin, brown hair, and blue eyes looking back at her, and thought, “No. This can’t be right. This isn’t me.”
So she found a new identity, based on a family legend–a legend shared by a suspicious number of white people–that one of her ancestors was an American Indian.
This new identity conveyed certain advantages: Harvard Law claimed her as a Native American to boost claims of racial diversity among the faculty:
A majority [83%] of Harvard Law School students are unhappy with the level of representation of women and minorities on the Law School faculty, according to a recent survey. …
Law students said they want to learn from a variety of perspectives and approaches to the law. “A black male from a lower socioeconomic background will approach the study of constitutional law in a different way from a white upper-class male,” Reyes said. …
Of 71 current Law School professors and assistant professors, 11 are women, five are black, one is Native American and one is Hispanic, said Mike Chmura, spokesperson for the Law School.
Although the conventional wisdom among students and faculty is that the Law School faculty includes no minority women, Chmura said Professor of Law Elizabeth Warren is Native American.
In response to criticism of the current administration, Chmura pointed to “good progress in recent years.”
The University of Pennsylvania chose not to tout in the press their newly minted Native American professor. But her minority status was duly noted: The university’s Minority Equity Report, published in April 2005, shows that Warren won a teaching award in 1994. Her name is in bold and italicized to indicate she was a minority. …
The law school was happy to have her count as a diversity statistic, however, and for at least three of the years she taught there — 1991, 1992, and 1994 — an internal publication drawing on statistics from the university’s federal affirmative action report listed one Native American female professor in the university’s law school.
Warren’s Native American identity may have played no role in her hiring (the committees involved appear not to have known or cared about her identity,) but it seems to have been important to Warren herself. As her relatives aged and died, and she moved away from her childhood home in Oklahoma and then Texas, she was faced with that persistent question: Who am I?
The truth, that she was a white woman from a working-class family in Oklahoma, apparently wasn’t enough for Elizabeth. (Oklahoma doesn’t carry many status points over in East Coast academic institutions.)
Each of us is the sum of many things, including the stories our families tell us and genetic contributions from all of our ancestors–not just the interesting ones (within limits: after enough generations, each ancestor’s individual contribution becomes so small that it may not be passed on in reproduction at all).
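To put a rough number on that parenthetical, here is a toy sketch of my own (not from any real genetics paper, and certainly not from anyone’s actual DNA test): an ancestor n generations back contributes about 1/2^n of your autosomal DNA on average, but because DNA is handed down in a finite number of chunks, a distant enough ancestor can end up contributing literally nothing.

```python
import random

def surviving_segments(generations, starting_segments=44):
    """Very crude toy model: treat one ancestor's contribution as a fixed set of
    discrete DNA segments (roughly two per autosome) and let each segment
    survive each generation of transmission with probability 1/2. This ignores
    recombination splitting segments further, but it shows how an expected
    contribution of 1/2**n can collapse to exactly zero in practice."""
    segments = starting_segments
    for _ in range(generations):
        # Each surviving segment is passed on (or not) at the next meiosis.
        segments = sum(random.random() < 0.5 for _ in range(segments))
    return segments

for n in (4, 8, 12):
    print(f"{n} generations back: expected share {0.5**n:.3%}, "
          f"segments surviving in one simulated lineage: {surviving_segments(n)}")
```

Run it a few times and the twelve-generation case often comes out to zero surviving segments, which is the sense in which someone can be your genealogical ancestor without being a genetic one.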
I have also done the 23 and Me thing, and found that I hail from something like 20 different ethnic groups–including, like Warren, a little smidge of Native American. But none of those groups make up the majority of my DNA. All of them are me; none of them are me. I just am.
Warren’s announcement of her DNA findings vindicated her claim to a Native American ancestor and simultaneously unveiled the absurdity of her claim to be a Native American. What should have been a set of family tales told to friends and passed on to children and grandchildren about a distant ancestor became a matter of national debate that the Cherokee Nation itself felt compelled to weigh in on:
Using a DNA test to lay claim to any connection to the Cherokee Nation or any tribal nation, even vaguely, is inappropriate and wrong. It makes a mockery out of DNA tests and its legitimate uses while also dishonoring legitimate tribal governments and their citizens, whose ancestors are well documented and whose heritage is proven. Senator Warren is undermining tribal interests with her continued claims of tribal heritage.
Like them or not, the Cherokee have rules about who is and isn’t a Cherokee, because being Cherokee conveys certain benefits–for example, the tribe builds houses for members and helps them look for jobs. This is why conflicts arise over matters like whether the Cherokee Freedmen are official members. When membership in a group conveys benefits, the borders of that group will be policed–and claims like Warren’s, no matter how innocently intended, will be perceived as an attempt at stealing something not meant for her.
Note: I am not saying this kind of group border policing is legitimate. Many “official” Cherokee have about as much actual Cherokee blood in them as Elizabeth Warren, but they have a documented ancestor on the Dawes Rolls, so they qualify and she doesn’t. Border policing is just what happens when there are benefits associated with being part of a group.
I don’t have an issue with Warren’s own self-identity. After all, if race is a social construct,* then she’s doing it exactly right. She’s allowed to have an emotional connection to her own ancestors, whether that connection is documented via the Dawes Rolls or not. All of us here in America should have equal access to Harvard’s benefits, not just the ones who play up a story about their ancestors.
The sad thing, though, is that despite being one of the most powerful and respected women in America, she still felt the need to be more than she is, to latch onto an identity she doesn’t truly possess.
You know, Elizabeth… it’s fine to just be a white person from Oklahoma. It’s fine to be you.
*Note: This blog regards “species” and nouns generally as social constructs, because language is inherently social. That does not erase biology.
Student leaders at Manchester University declared that Kipling “stands for the opposite of liberation, empowerment, and human rights”.
The poem, which had been painted on the wall of the students’ union building by an artist, was removed by students on Tuesday, in a bid to “reclaim” history on behalf of those who have been “oppressed” by “the likes of Kipling”.
In lieu of Kipling’s If, students used a black marker pen to write out the poem Still I Rise by Maya Angelou on the same stretch of wall.
There’s a word for this: vandalism.
I am not a good judge of poetry, and in general, I think most people are no longer interested in poetry one way or another, so I am not going to judge the poems on their relative merits. I think a reasonable person could like either one. (Note: I have in the past compared Shakespeare and Audre Lorde.)
You may write me down in history
With your bitter, twisted lies,
You may trod me in the very dirt
But still, like dust, I’ll rise.
Does my sassiness upset you?
Why are you beset with gloom?
‘Cause I walk like I’ve got oil wells
Pumping in my living room.
Just like moons and like suns,
With the certainty of tides,
Just like hopes springing high,
Still I’ll rise. …
Neither of these poems is a clear winner on merit, but they weren’t chosen on merit. Kipling’s poem was chosen to decorate the student center at a British university because Kipling is one of Britain’s most beloved and respected writers and this particular poem was voted one of Britain’s very favorites. Further, it contains practical life advice of the sort you normally aim at students.
Maya Angelou, by contrast, isn’t British. She’s an American.
According to Sara Khan, “Liberation & Access Officer” of the Manchester Student Union, who is majoring in English:
We, as an exec team, believe that Kipling stands for the opposite of liberation, empowerment, and human rights…
Well-known as author of the racist poem ‘The White Man’s Burden’, and a plethora of other work that sought to legitimate the British Empire’s presence in India and de-humanise people of colour, it is deeply inappropriate to promote the work of Kipling in our SU …
As a statement on the reclamation of history by those who have been oppressed by the likes of Kipling for so many centuries, and continue to be to this day, we replaced his words with those of the legendary Maya Angelou, a black female poet and civil rights activist.”
It takes some special variety of gall to major in English at a British university and then complain about reading one of Britain’s most famous poets–and a great deal of stupidity to put up with it.
Angelou’s words were written in a specifically American context, responding to the way she and other African Americans were treated here in the US. Her poem has nothing to do with Kipling or with anything Kipling or other Brits have done. It was selected out of the perverted sense that all whites are equivalent and interchangeable, as are all non-whites. Any non-white poet will do for replacing white poets.
Maya Angelou’s poem was not selected to replace Kipling’s because the students think it is better on technical, poetic grounds, nor because it reflects an important part of British literature, but for its subject and the author’s identity: a black woman. The message is not, “Here’s a lovely poem; we think students will enjoy it.” The message is, “Fuck you to Kipling and everyone who loves him; we are wiping you off the walls, removing you from our spaces, and replacing you with our own poem about how we are rising up against you.”
Incidentally, for an “English major,” Sara is oddly ignorant of the fact that Kipling’s poem, “The White Man’s Burden,” was not written to justify British colonialism in India. (I guess she is not a very good English major.) It was actually written to encourage the US to colonize the Philippines.
Kipling also seems to have been ambivalent about the whole endeavor:
Take up the White Man’s burden —
And reap his old reward:
The blame of those ye better,
The hate of those ye guard —
The cry of hosts ye humour
(Ah, slowly!) toward the light: —
“Why brought he us from bondage,
Our loved Egyptian night?”
We’re going to kick off today’s Cathedral Round-Up with a trip down memory lane.
This may come as some surprise, given my scintillating wit and gregarious nature, but I was not popular in school. If there was a social totem pole, I was a mud puddle about twenty yards to the left of the pole.
The first time I felt like I truly fit in–I belonged–was at nerd camp. This was a sort of summer camp your parents send you to when you’ve failed at Scouting and they hope maybe you’ll pick up chemistry or philosophy instead.
One evening, when I was gathered in the dorm with my new friends, a girl burst triumphantly into our midst, brandishing a book. “I have it,” she triumphed. “I have it! The book!”
The Book, which we all proceeded to read, and after camp ended, to discuss in what were my very first emails, was The Hitchhiker’s Guide to the Galaxy.
The researchers found that during their informational presentations, the recruiters—no doubt in an attempt to bond with their audiences—frequently referenced “geek culture favorites” such as Star Trek and The Hitchhikers Guide to the Galaxy, focused the conversation exclusively on highly technical aspects of the roles or referred to high school coding experience. …
In case you haven’t noticed or this is your first time visiting my humble blog, I am female. All of my friends at camp were female.
“Through gender-imbalanced presenter roles, geek culture references, overt use of gender stereotypes, and other gendered speech and actions, representatives may puncture the pipeline, lessening the interest of women at the point of recruitment into technology careers,” the researchers write.
Dear Diversity Experts: In the words of the first real friend I ever had, please disembowel yourselves with a rusty spoon.
The study itself is not easily available online, so I will respectfully judge them based on summaries in HRE and Wired.
Short version: A couple of sociologist “gender researchers,” who of course know STEM culture very well, sat in on tech company recruiting sessions at Stanford and discovered that nerds talk about nerd things, OMG EWWW, and concluded that icky nerds doing their nerd thing in public is why women decide to go apply for more prestigious jobs elsewhere.
Now, I understand what it’s like not to get someone else’s references. I haven’t seen Breaking Bad, NCIS, Sex and the City, Seinfeld, The Simpsons, or the past X Star Wars installments. I don’t watch sports, play golf, or drink alcohol.
But I don’t go around complaining that other people need to stop talking about things that interest them and just talk about stuff that interests me. It doesn’t bother me that other people have their interests, because I have plenty of room over here on my end of the internet to talk about mine.
But apparently these “Diversity Experts” think that the cultural icons of my childhood need to be expunged from conversation just to make people like them feel more comfortable.
Dear Correll and Wynn: when people like you stop assuming that everyone in your vicinity is interested in hearing about wine and yoga and golf, I’ll stop assuming that people who show some interest in my culture are interested in The Hitchhiker’s Guide to the Galaxy.
Notice that the problem here is not that the women are being turned away, or discriminated against, or receiving fewer callbacks than male applicants. No, the problem is that the women think geek culture is icky and so don’t even bother to apply. They have decided that they have better options, but since someone decided that it is imperative that all professions be 50% women (except plumbing, sewer workers, truckers, etc.) they must somehow be tricked into going into their second-choice field.
No one seems to have thought to, ahem, consult the actual women who work in Tech or who have STEM degrees or are otherwise associated with the field about whether or not they thought these sorts of geek cultural references were off-putting. No, we do not exist in Correll and Wynn’s world, or perhaps because our numbers are low, there just aren’t enough of us to matter.
STEM/tech exists in this weird limbo where women abstractly want more women in it, but don’t actually want to be the women in it. Take Wynn. She has a degree in English. She could have majored in Chemistry, but chose not to. Now she whines that there aren’t enough female engineers.
People routinely denigrate law and lawyers. Lawyers are the butt of many jokes, and people claim to hate lawyers, but lawyers themselves are treated with a great deal of courtesy and respect, and have no difficulties on the dating market.
STEM works inversely: people claim to hold scientists and mathematicians in great respect, but in practice they are much lower on the social totem pole. Lots of people would like good grades in math, but don’t want to hang out with the kid who does get good grades in math.
So feminists want women to be acknowledged as equally capable with men at things like “math” and “winning Nobel Prizes” and “becoming billionaire CEOs” (hey, I want those things, too,) but don’t want to do the grunt work that is most of what people in STEM fields actually do. They don’t want to spend their days around sweaty guys who talk about Linux kernels or running around as lab assistant #3. For a lot of people, tech jobs are not only kind of boring and frustrating, but don’t even pay that well, considering all of the education involved in getting them.
The result is a lot of concern trolling from people who claim to want more women in STEM, but don’t want to address the underlying problems for why most women aren’t all that interested in STEM in the first place.
Are there real problems for women in STEM? Maybe. I have female commentators who can tell you about the difficulties they’ve had in STEM communities. It is different being a female in a male-dominated field than being female in a balanced or female-dominated field, and this has its downsides. But “men said nerd things” or “men referenced porn” is not even remotely problematic. (I will note that men have problems in STEM fields, too.)
While we’re here, I’d like to talk about these “Diversity Experts” whom HRE cites as proof for their claims that women find geek culture off-putting. Their link heads not to a study on the subject, nor even an actual expert on anything, but an opinion piece by Kerry Flynn on Mashable:
The lack of diversity in tech isn’t a new issue, and yet top leaders in Silicon Valley still struggle to talk about it.
They struggle so much that this is an entire article about a female CEO talking about it. Talking openly about a thing is the same as struggling to talk about it, right?
The latest stumble comes from YouTube CEO Susan Wojcicki speaking with MSNBC’s Ari Melber and Recode’s Kara Swisher at the media companies’ first town hall titled “Revolution: Google and YouTube Changing the World,” which aired Sunday.
The latest stumble, ladies and gents! Wojcicki might be a female CEO of a tech giant, but what the hell does she know? Kerry Flynn knows much better than she does. Wojcicki had better shape up to Flynn’s standards, because Flynn is keeping track, ladies and gents.
According to Wojcicki, one reason for the lack of women in tech is its reputation for being a “very geeky male industry.”
Ouch.
That kind of statement makes it seem like Wojcicki has forgotten about the diverse and minority perspectives that are fighting for representation in the industry. For instance, with the #ILookLikeAnEngineer campaign, engineer Isis Wenger wrote about the sexism she faced working in tech and inspired a movement of women shutting down stereotypes.
See, women and minorities are trying to counter the perception of tech being a “very geeky male industry,” which Wojcicki obviously forgot about when she claimed that tech has a reputation for being a “very geeky male industry.”
Kerry Flynn is very stupid.
The entire article goes on in this vein and it’s all awful. Nowhere does Flynn prove anything about women not liking The Hitchhiker’s Guide to the Galaxy.
***
What other interesting articles does Stanford Magazine hold for us?
So what happens when you send your kids to Stanford? Stanford Magazine has helpful interviews with recent grads. Yeji Jung got enmeshed in Social Justice, changed her major from pre-med to “comparative studies in race and ethnicity,” graduated, and went home to her parents to make collages.
I searched for Yeji Jung’s art, which is supposed to be making the world a better and more just place, and found almost nothing. This red cabbage and the lips in the Stanford Mag article are it. This does not look promising.
I bet her parents are very glad they worked their butts off for years making sure their kid got all As in her classes and aced the SAT so she could come home from Stanford and paste paper together.
A quote from the article:
A thesis project to investigate the links between her Korean-American identity and the experiences of her Korean grandmothers took her to Seoul, South Korea, and Manassas, Va., to interview them in Korean.
Wait, you can get a degree from Stanford by interviewing your grandparents? Dude, I call my grandma every weekend! That should be worth at least a master’s.
“[My grandmothers’] lives are so deeply gendered in a way that I just have not experienced as someone who grew up in the U.S. One of my interview questions was framed as, ‘What did you study in college?’ [My grandmother in Virginia said,] ‘Oh, I didn’t go to college — girls in that day didn’t go to college. We went to work.’ That was a moment for me of, ‘Wow, I just have these assumptions about my life that are not a given.’
Girls in my grandmothers’ day went to college. Both of mine went to college. One of them earned a PhD in a STEM field; the other became a teacher. Teacher was a pretty common profession for women in my grandmother’s day. So was nurse.
I can take that a step further: my great-grandmother went to college.
Perhaps she meant that girls in Korea didn’t go to college in those days, though I’m sure Korea needed plenty of nurses about 70 years ago, and frankly I’m not sure many men were going to college in those days, either.
I often idly wonder if elites push SJW nonsense to remove competitors. Yeji Jung is probably a very bright young woman who would have made an excellent doctor or medical researcher. Instead she has shuffled off to irrelevance.
Make no mistake: Nichols is annoyingly arrogant. He draws a rather stark line between “experts” (who know things) and everyone else (who should humbly limit themselves to voting between options defined for them by the experts.) He implores people to better educate themselves in order to be better voters, but has little patience for autodidacts and bloggers like myself who are actually trying.
But arrogance alone doesn’t make someone wrong.
Nichols’s first thesis is simple: most people are too stupid or ignorant to second-guess experts or even contribute meaningfully to modern policy discussions. How can people who can’t find Ukraine on a map or think we should bomb the fictional city of Agrabah contribute in any meaningful way to a discussion of international policy?
It was one thing, in 1776, to think the average American could vote meaningfully on the issues of the day–a right they took by force, by shooting anyone who told them they couldn’t. Life was less complicated in 1776, and the average person could master most of the skills they needed to survive (indeed, pioneers on the edge of the frontier had to be mostly self-sufficient in order to survive.) Life was hard–most people engaged in long hours of heavy labor plowing fields, chopping wood, harvesting crops, and hauling necessities–but could be mastered by people who hadn’t graduated from elementary school.
But the modern industrial (or post-industrial) world is much more complicated than the one our ancestors grew up in. Today we have cars (maybe even self-driving cars), electrical grids and sewer systems, atomic bombs and fast food. The speed of communication and transportation has made it possible to chat with people on the other side of the earth and show up on their doorstep a day later. The amount of specialized, technical knowledge necessary to keep modern society running would astonish the average caveman–even with 15+ years of schooling, the average person can no longer build a house, nor even produce basic necessities like clothes or food. Most of us can’t even make a pencil.
Even experts who are actually knowledgeable about their particular area may be completely ignorant of fields outside of their expertise. Nichols speaks Russian, which makes him an expert in certain Russian-related matters, but he probably knows nothing about optimal high-speed rail networks. And herein lies the problem:
The American attachment to intellectual self-reliance described by Tocqueville survived for nearly a century before falling under a series of assaults from both within and without. Technology, universal secondary education, the proliferation of specialized expertise, and the emergence of the United States as a global power in the mid-twentieth century all undermined the idea… that the average American was adequately equipped either for the challenges of daily life or for running the affairs of a large country.
… the historian Richard Hofstadter wrote that “the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and competently perform for himself.”
… Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.”
In other words, society is now too complex and people too stupid for democracy.
Nichols’s second thesis is that people used to trust experts, which let democracy function, but today they are less trusting. He offers no evidence other than his general conviction that this change has happened.
He does, however, detail the ways he thinks that 1. people have been given inflated egos about their own intelligence, and 2. our information-delivery system has degenerated into misinformational goo, resulting in the trust problems he believes we are having. These are interesting arguments and worth examining.
A bit of summary:
Indeed, maybe the death of expertise is a sign of progress. Educated professionals, after all, no longer have a stranglehold on knowledge. The secrets of life are no longer hidden in giant marble mausoleums… in the past, there was less tension between experts and laypeople, but only because citizens were simply unable to challenge experts in any substantive way. …
Participation in political, intellectual, and scientific life until the early twentieth century was far more circumscribed, with debates about science, philosophy, and public policy all conducted by a small circle of educated males with pen and ink. Those were not exactly the Good Old Days, and they weren’t that long ago. The time when most people didn’t finish high school, when very few went to college, and only a tiny fraction of the population entered professions is still within living memory of many Americans.
Aside from Nichols’s insistence that he believes in modern American notions about gender and racial equality, I get the impression that he wouldn’t mind the Good Old Days of genteel pen-and-ink discussions between intellectuals. However, I question his claim that participation in political life was far more circumscribed–after all, people voted, and politicians liked getting people to vote for them. People anywhere, even illiterate peasants on the frontier or up in the mountains, like to gather and debate about God, politics, and the meaning of life. The question is less “Did they discuss it?” and more “Did their discussions have any effect on politics?” Certainly we can point to abolition, women’s suffrage, prohibition, and the Revolution itself as heavily grass-roots movements.
But continuing with Nichols’s argument:
Social changes only in the past half century finally broke down old barriers of race, class, and sex not only between Americans in general but also between uneducated citizens and elite experts in particular. A wider circle of debate meant more knowledge but more social friction. Universal education, the greater empowerment of women and minorities, the growth of a middle class, and increased social mobility all threw a minority of experts and the majority of citizens into direct contact, after nearly two centuries in which they rarely had to interact with each other.
And yet the result has not been a greater respect for knowledge, but the growth of an irrational conviction among Americans that everyone is as smart as everyone else.
Nichols is distracting himself with the reflexive racial argument; the important change he is highlighting isn’t social but technical.
I’d like to quote a short exchange from Our Southern Highlanders, an ethnography-style text written about Appalachia about a century ago:
The mountain clergy, as a general rule, are hostile to “book larnin’,” for “there ain’t no Holy Ghost in it.” One of them who had spent three months at a theological school told President Frost, “Yes, the seminary is a good place ter go and git rested up, but ’tain’t worth while fer me ter go thar no more ’s long as I’ve got good wind.”
It used to amuse me to explain how I knew that the earth was a sphere; but one day, when I was busy, a tiresome old preacher put the everlasting question to me: “Do you believe the earth is round?” An impish perversity seized me and I answered, “No—all blamed humbug!” “Amen!” cried my delighted catechist, “I knowed in reason you had more sense.”
But back to Nichols, who really likes the concept of expertise:
One reason claims of expertise grate on people in a democracy is that specialization is necessarily exclusive. When we study a certain area of knowledge or spend our lives in a particular occupation, we not only forego expertise in other jobs or subjects, but also trust that other people in the community know what they’re doing in their area as surely as we do in our own. As much as we might want to go up to the cockpit after the engine flames out to give the pilots some helpful tips, we assume–in part, because we have to–that they’re better able to cope with the problem than we are. Otherwise, our highly evolved society breaks down into islands of incoherence, where we spend our time in poorly informed second-guessing instead of trusting each other.
This would be a good point to look at data on overall trust levels, friendship, civic engagement, etc (It’s down. It’s all down.) and maybe some explanations for these changes.
Nichols talks briefly about the accreditation and verification process for producing “experts,” which he rather likes. There is an interesting discussion in the economics literature on things like the economics of trust and information (how do websites signal that they are trustworthy enough that you will give them your credit card number and expect to receive items you ordered a few days later?) which could apply here, too.
Nichols then explores a variety of cognitive biases, such as superstitions, phobias, and conspiracy theories:
Conspiracy theories are also a way for people to give meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty either of an uncaring universe or an incomprehensible deity. …
The only way out of this dilemma is to imagine a world in which our troubles are the fault of powerful people who had it within their power to avert such misery. …
Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them… become especially seductive “in any society that has suffered an epic, collectively felt trauma. In the aftermath, millions of people find themselves casting about for an answer to the ancient question of why bad things happen to good people.” …
Today, conspiracy theories are a reaction mostly to the economic and social dislocations of globalization… This is not a trivial obstacle when it comes to the problems of expert engagement with the public: nearly 30 percent of Americans, for example, think “a secretive elite with a globalist agenda is conspiring to eventually rule the world” …
Obviously stupid. A not-secret elite with a globalist agenda already rules the world.
and 15 percent think media or government add secret mind controlling technology to TV broadcasts. (Another 15 percent aren’t sure about the TV issue.)
It’s called “advertising” and it wants you to buy a Ford.
Anyway, the problem with conspiracy theories is they are unfalsifiable; no amount of evidence will ever convince a conspiracy theorist that he is wrong, for all evidence is just further proof of how nefariously “they” are constructing the conspiracy.
Then Nichols gets into some interesting matter on the difference between stereotypes and generalizations, which segues nicely into a tangent I’d like to discuss, but it probably deserves its own post. To summarize:
Sometimes experts know things that contradict other people’s political (or religious) beliefs… If an “expert” finding or field accords with established liberal values, e.g., the implicit association test’s finding that “everyone is a little bit racist,” which liberals already believed, then there is an easy mesh between what the academics believe and the rest of their social class.
If their findings contradict conservative/low-class values, e.g., when professors assert that evolution is true and “those low-class Bible-thumpers in Oklahoma are wrong,” sure, they might have a lot of people who disagree with them, but those people aren’t part of their own social class/the upper class, and so not a problem. If anything, high-class folks love such findings, because they give them a chance to talk about how much better they are than those low-class people (though such class conflict is obviously poisonous in a democracy where those low-class people can still vote to Fuck You and Your Global Warming, Too.)
But if the findings contradict high-class/liberal politics, then the experts have a real problem. E.g., if that same evolution professor turns around and says, “By the way, race is definitely biologically real, and there are statistical differences in average IQ between the races,” now he’s contradicting the political values of his own class/the upper class, and that becomes a social issue and he is likely to get Watsoned.
For years folks at Fox News (and talk radio) have lambasted “the media” even though they are part of the media; SSC recently discussed “can something be both popular and silenced?”
Jordan Peterson isn’t unpopular or “silenced” so much as he is disliked by upper-class folks and liked by “losers” and low-class folks, despite the fact that he is basically an intellectual guy and isn’t peddling a low-class product. Likewise, Fox News is just as much a part of The Media as NPR (if anything, it’s much more of the Media), but NPR is higher class than Fox, and Fox doesn’t like feeling like its opinions are being judged along this class axis.
For better or for worse (mostly worse), class politics and political/religious beliefs strongly affect our opinions of “experts,” especially those who say things we disagree with.
But back to Nichols: the Dunning-Kruger effect, fake cultural literacy, and too many people at college. Nichols is a professor who has seen college students up close and personal, and he has a low opinion of most of them. The massive expansion of higher education has not resulted in a better-educated, smarter populace, he argues, but in a populace armed with expensive certificates that show they sat around a college for four years without learning much of anything. Unfortunately, beyond a certain level, there isn’t a lot that more school can do to increase people’s basic aptitudes.
Colleges get money by attracting students, which incentivizes them to hand out degrees like candy–in other words, students are being lied to about their abilities, and college degrees are fast becoming participation trophies for the not-very-bright.
Nichols has little sympathy for modern students:
Today, by contrast, students explode over imagined slights that are not even remotely in the same category as fighting for civil rights or being sent to war. Students now build majestic Everests from the smallest molehills, and they descend into hysteria over pranks and hoaxes. In the midst of it all, the students are learning that emotions and volume can always defeat reason and substance, thus building about themselves fortresses that no future teacher, expert, or intellectual will ever be able to breach.
At Yale in 2015, for example, a house master’s wife had the temerity to tell minority students to ignore Halloween costumes they thought offensive. This provoked a campus-wide temper tantrum that included professors being shouted down by screaming students. “In your position as master,” one student howled in a professor’s face, “it is your job to create a place of comfort and home for the students… Do you understand that?!”
Quietly, the professor said, “No, I don’t agree with that,” and the student unloaded on him:
“Then why the [expletive] did you accept the position?! Who the [expletive] hired you?! You should step down! If that is what you think about being a master you should step down! It is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home here. You are not doing that!” [emphasis added]
Yale, instead of disciplining students in violation of their own norms of academic discourse, apologized to the tantrum throwers. The house master eventually resigned from his residential post…
To faculty everywhere, the lesson was obvious: the campus of a top university is not a place for intellectual exploration. It is a luxury home, rented for four to six years, nine months at a time, by children of the elite who may shout at faculty as if they’re berating clumsy maids in a colonial mansion.
The incidents Nichols cites (and similar ones elsewhere) are not just matters of college students being dumb or entitled, but explicitly racial conflicts. The demand for “safe spaces” is easy to ridicule on the grounds that students are emotional babies, but this misses the point: students are carving out territory for themselves along explicitly racial lines, often by violence.
Nichols, though, either does not notice the racial aspect of modern campus conflicts or does not want to admit publicly that he does.
Nichols moves on to blame TV (especially CNN), talk radio, and the internet for dumbing down the quality of discourse by overwhelming us with a deluge of more information than we can possibly process.
Referring back to Auerswald and The Code Economy: if automation creates a bifurcation in industries–replacing a moderately-priced, moderately available product with a stream of cheap, low-quality products on the one hand and a trickle of expensive, high-quality products on the other–then good-quality journalism has been replaced with a flood of low-quality crap. The high-quality end is still working itself out.
Nichols opines:
Accessing the Internet can actually make people dumber than if they had never engaged a subject at all. The very act of searching for information makes people think they’ve learned something, when in fact they’re more likely to be immersed in yet more data they do not understand. …
When a group of experimental psychologists at Yale investigated how people use the internet, they found that “people who search for information on the Web emerge from the process with an inflated sense of how much they know–even regarding topics that are unrelated to the ones they Googled.” …
How can exposure to so much information fail to produce at least some kind of increased baseline of knowledge, if only by electronic osmosis? How can people read so much yet retain so little? The answer is simple: few people are actually reading what they find.
As a University College London (UCL) study found, people don’t actually read the articles they encounter during a search on the Internet. Instead, they glance at the top line or the first few sentences and then move on. Internet users, the researchers noted, “are not reading online in the traditional sense; indeed, there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”
The internet’s demands for instant updates, for whatever headlines generate the most clicks (and thus advertising revenue), have upset the balance of speed vs. expertise in the newsroom. Reporters no longer have any incentive to spend long hours carefully writing a well-researched story when such stories pay less than clickbait headlines about racist pet costumes and celebrity tweets.
I realize it seems churlish to complain about the feast of news and information brought to us by the Information Age, but I’m going to complain anyway. Changes in journalism, like the increased access to the Internet and to college education, have unexpectedly corrosive effects on the relationship between laypeople and experts. Instead of making people better informed, much of what passes for news in the twenty-first century often leaves laypeople–and sometimes experts–even more confused and ornery.
Experts face a vexing challenge: there’s more news available, and yet people seem less informed, a trend that goes back at least a quarter century. Paradoxically, it is a problem that is worsening rather than dissipating. …
As long ago as 1990, for example, a study conducted by the Pew Trust warned that disengagement from important public questions was actually worse among people under thirty, the group that should have been most receptive to then-emerging sources of information like cable television and electronic media. This was a distinct change in American civic culture, as the Pew study noted:
“Over most of the past five decades younger members of the public have been at least as well informed as older people. In 1990, that is no longer the case. … “
Those respondents are now themselves middle-aged, and their children are faring no better.
If you were 30 in 1990, you were born in 1960, to parents who were between the ages of 20 and 40 years old, that is, born between 1920 and 1940.
Source: Audacious Epigone
Fertility for the 1920-1940 cohort was strongly dysgenic. So was the 1940-50 cohort. The 1900-1919 cohort at least had the Flynn Effect on their side, but later cohorts just look like an advertisement for idiocracy.
Nichols ends with a plea that voters respect experts (and that experts, in turn, be humble and polite to voters). After all, modern society is too complicated for any of us to be experts on everything. If we don’t pay attention to expert advice, he warns, modern society is bound to end in ignorant goo.
The logical inconsistency is that Nichols still believes in democracy at all–he thinks democracy can be saved if ignorant people vote within a range of options defined by experts like himself, e.g., “What vaccine options are best?” rather than “Should we have vaccines at all?”
The problem, then, is that whoever controls the experts (or controls which expert opinions people hear) controls the limits of policy debates. This leads to people arguing over experts, which leads right back to where we are today. As long as there are politics, “expertise” will be politicized, e.g.:
Look at any court case in which both sides bring in their own “expert” witnesses. Both experts testify to the effect that their side is correct. Then the jury is left to vote on which side had more believable experts. This is pretty much the best-case scenario for voting, and the fact that the voters are dumb, don’t understand what the experts are saying, and in many cases are obviously being misled is still a huge problem.
If politics is the problem, then perhaps getting rid of politics is the solution. Just have a bunch of Singapores run by Lee Kuan Yews, let folks like Nichols advise them, and let the common people “vote with their feet” by moving to the best states.
The problem with this solution is that “exit” doesn’t exist in the modern world in any meaningful way, and there are significant reasons why ordinary people oppose open borders.
Conclusion: 3/5 stars. It’s not a terrible book, and Nichols has plenty of good points, but “Americans are dumb” isn’t exactly fresh territory and much has already been written on the subject.
Looks like President Faust is stepping down and Lawrence Bacow is stepping up. Bacow has an S.B. in economics from MIT, a J.D. from Harvard Law, and an M.P.P. and Ph.D. from Harvard’s Kennedy School of Government.
I don’t know much about Bacow, but I’m sure I’ll learn once he takes over writing Faust’s column in Harvard Magazine. Overall he looks like a “safe” (i.e., dull) choice. His work at Tufts involved expanding financial aid (Harvard already has extremely good financial aid, so there’s not much to do there) and diversity initiatives.
Bridget Terry Long, Economist, Dean of HGSE
Harvard has a couple of other newcomers. Economist Bridget Terry Long will be the new dean of the Harvard Graduate School of Education. Long’s CV is long (no pun intended) and filled with the sorts of awards and committee memberships appropriate to an Ivy League striver, like the National Bureau of Economic Research.
Long’s research focuses on getting more poor and dumb (excuse me, unprepared) students into college. I don’t have time to review her entire corpus, but I read her most recent paper, “Does Remediation Work for All Students? How the Effects of Postsecondary Remedial and Developmental Courses Vary by Level of Academic Preparation” (co-author: Angela Boatman). The paper is fine, if rather oddly written (by my standards).
[Results: placing borderline low-performing students into first-level remedial classes in the University of Tennessee system may be worse than just letting them try their best in regular courses, but really dumb kids actually do benefit from remedial courses. Obvious conclusion that I didn’t see directly stated: the cut-off score for inclusion in remedial classes in the U of Tenn system is too high.]
Long’s research looks fine; I don’t think it’s bad to look at whether a remedial program is actually helping students or whether a financial aid program is working (aside from my conviction that students who can’t do college-level work don’t belong in college). It’s not exactly groundbreaking work, though. Harvard has plenty of folks like Reich and Pinker who are paving new intellectual (and technical) ground; Long’s research seems underwhelming by comparison.
Tomiko Brown-Nagin, Radcliffe Institute, Harvard
Tomiko Brown-Nagin has been tapped to lead the Radcliffe Institute. From Harvard Mag’s article about her:
Brown-Nagin, who holds a J.D. from Yale Law School and a Ph.D. in history from Duke, is best known for her contributions to the history of the civil-rights movement. Her 2011 book Courage to Dissent: Atlanta and the Long History of the Civil Rights Movement won the Bancroft Prize for U.S. history, and is widely regarded as a definitive text on the legal and social history of civil rights in the United States. Her current book project explores the life of Constance Baker Motley, an African-American lawyer, judge, and politician who was an attorney in Brown v. Board of Education. …
Brown-Nagin is a sophisticated, nuanced thinker on the significance of diversity and representation in democratic institutions. In a recent Columbia Law Review article titled “Identity Matters: The Case of Judge Constance Baker Motley,” she wrote:
“Motley did endorse greater representation of women and racial minorities in the judiciary. Her argument for diversity on the bench did not turn on the view that women and people of color have a different voice or would reach different or better decisions than white men. Motley advocated judicial diversity because, she believed, inclusion reinforced democracy. By affirming openness and fairness, the mere presence of women and racial-minority judges built confidence in government. …”
Radcliffe was a women’s college that Harvard officially absorbed in 1999; the Radcliffe Institute came with it. According to Wikipedia:
The Radcliffe Institute for Advanced Study at Harvard shares transformative ideas across the arts, humanities, sciences, and social sciences. The Institute comprises three programs:
The Radcliffe Institute Fellowship Program annually supports the work of 50 artists and scholars, with an acceptance rate of around 5 percent each year.
The Academic Ventures program is for collaborative research projects and hosts lectures and conferences.
The Radcliffe Institute hosts public events, many of which can be watched online. It is one of the nine member institutions of the Some Institutes for Advanced Study consortium.
Yale Law is the most prestigious law school in the entire US (Harvard Law is probably #2). Yale Law’s professors, therefore, are some of the US’s top legal scholars; its students are likely to go on to be important lawyers, judges, and opinion-makers.
If you’re wondering about the coat of arms, it was designed in 1956 as a pun on the original three founders’ names: Seth Staples (B.A., Yale, 1797), Judge David Daggett aka Doget (B.A., 1783), and Samuel Hitchcock (B.A., 1809), whose name isn’t really a pun, but he’s Welsh, and when Welsh people cross the Atlantic, their dragon transforms into a crocodile. (The Welsh dragon has also been transformed into a crocodile on the Jamaican coat of arms.)
(For the sake of Yale’s staple-bearing coat of arms, let us hope that none of the founders were immoral in any way, as Harvard’s were.)
Gideon Yaffe presents a theory of criminal responsibility according to which child criminals deserve leniency not because of their psychological, behavioural, or neural immaturity but because they are denied the vote. He argues that full shares of criminal punishment are deserved only by those who have a full share of say over the law.
He proposes that children are owed lesser punishments because they are denied the right to vote. This conclusion is reached through accounts of the nature of criminal culpability, desert for wrongdoing, strength of legal reasons, and what it is to have a say over the law. The heart of this discussion is the theory of criminal culpability.
To be criminally culpable, Yaffe argues, is for one’s criminal act to manifest a failure to grant sufficient weight to the legal reasons to refrain. The stronger the legal reasons, then, the greater the criminal culpability. Those who lack a say over the law, it is argued, have weaker legal reasons to refrain from crime than those who have a say. They are therefore reduced in criminal culpability and deserve lesser punishment for their crimes. Children are owed leniency, then, because of the political meaning of age rather than because of its psychological meaning. This position has implications for criminal justice policy, with respect to, among other things, the interrogation of children suspected of crimes and the enfranchisement of adult felons. …
He holds an A.B. in philosophy from Harvard and a Ph.D. in philosophy from Stanford.
I don’t think you need a degree in philosophy or law to realize that this is absolutely insane.
Even in countries where no one can vote, we still expect the government to try to do a good job of rounding up criminals so their citizens can live in peace, free from the fear of random violence. The notion that “murder is bad” wasn’t established by popular vote in the first place. Call it instinct, human nature, Natural Law, or the 6th Commandment–whatever it is, we all want murderers to be punished.
The point of punishing crime is 1. To deter criminals from committing crime; 2. To get criminals off the street; 3. To provide a sense of justice to those who have been harmed. These needs do not change depending on whether or not the person who committed the crime can vote. Why, if I wanted to commit a crime, should I hop the border into Canada and commit it there, then claim the Canadian courts should be lenient since I am not allowed to vote in Canada? Does the victim of a disenfranchised felon deserve less justice than the victim of someone who still had the right to vote?
Since this makes no sense at all from any sort of public safety or discouraging crime perspective, permit me a cynical theory: the author would like to lower the voting age, let immigrants (legal or not) vote more easily, and end disenfranchisement for felons.
The age of human rights has been kindest to the rich. Even as state violations of political rights garnered unprecedented attention due to human rights campaigns, a commitment to material equality disappeared. In its place, market fundamentalism has emerged as the dominant force in national and global economies. In this provocative book, Samuel Moyn analyzes how and why we chose to make human rights our highest ideals while simultaneously neglecting the demands of a broader social and economic justice. …
In the wake of two world wars and the collapse of empires, new states tried to take welfare beyond its original European and American homelands and went so far as to challenge inequality on a global scale. But their plans were foiled as a neoliberal faith in markets triumphed instead.
In a tightly-focused tour of the history of distributive ideals, Moyn invites a new and more layered understanding of the nature of human rights in our global present. From their origins in the Jacobin welfare state
Which chopped people’s heads off.
to our current neoliberal moment, Moyn tracks the subtle shifts in how human rights movements understood what, exactly, their high principles entailed.
Like not chopping people’s heads off?
Earlier visionaries imagined those rights as a call for distributive justice—a society which guaranteed a sufficient minimum of the good things in life. And they generally strove, even more boldly, to create a rough equality of circumstances, so that the rich would not tower over the rest.
By chopping their heads off.
Over time, however, these egalitarian ideas gave way. When transnational human rights became famous a few decades ago, they generally focused on civil liberties — or, at most, sufficient provision.
Maybe because executing the kulaks resulted in mass starvation, which seems kind of counter-productive in the sense of minimum sufficient provision for human life.
In our current age of human rights, Moyn comments, the pertinence of fairness beyond some bare minimum has largely been abandoned.
By the way:
From Human Progress
Huh. Why would anyone think that economic freedom and human well-being go hand-in-hand?
At the risk of getting Pinkerian, the age of “market fundamentalism” has involved massive improvements in human well-being, while every attempt to make society economically equal has caused mass starvation and horrible abuses against humans.
Moyn’s argument that we have abandoned “social justice” is absurd on its face; in the 1950s, the American south was still racially segregated; in the 1980s South Africa was still racially segregated. Today both are integrated and have had black presidents. In 1950, homosexuality was widely illegal; today gay marriage is legal in most Western nations. Even Saudi Arabia has decided to let women drive.
If we want to know why, absurdly, students believe that things have never been worse for racial minorities in America, maybe the answer is that the rot starts at the top.
Consider, for example, this glowing account of Yale Law’s recent clinic work:
The first ruling dramatically stopped the unconstitutional Muslim ban in January 2017, when students from the Worker and Immigrant Rights Advocacy Clinic (WIRAC) mobilized overnight to ground planes and free travelers who were being unjustly detained. The students’ work, along with co-counsel, secured the first nationwide injunction against the ban, and became the template for an army of lawyers around the country who gathered at airports to provide relief as the chaotic aftermath of the executive order unfolded.
Next came a major ruling in California in November 2017 in which a federal judge granted a permanent injunction that prohibited the Trump Administration from denying funding to sanctuary cities—a major victory for students in the San Francisco Affirmative Litigation Project (SFALP) …
And on February 13, 2018, WIRAC secured yet another nationwide injunction—this time halting the abrupt termination of the Deferred Action for Childhood Arrivals program (DACA). … The preliminary injunction affirms protections for hundreds of thousands of Dreamers just weeks before the program was set to expire.
The Rule of Law Clinic launched at Yale Law School in the Spring of 2017 and in less than one year has been involved in some of the biggest cases in the country, including working on the travel ban, the transgender military ban, and filing amicus briefs on behalf of the top national security officials in the country, among many other cases. The core goal of the clinic is to maintain U.S. rule of law and human rights commitments in four areas: national security, antidiscrimination, climate change, and democracy promotion.
Meanwhile, Amy Chua appears to be the only sane, honest person at Yale Law:
In her new book, Political Tribes: Group Instinct and the Fate of Nations (Penguin, 2018), Amy Chua diagnoses the rising tribalism in America and abroad and prescribes solutions for creating unity amidst group differences.
Chua, who is the John M. Duff, Jr. Professor of Law, begins Political Tribes with a simple observation: “Humans are tribal.” But tribalism, Chua explains, encompasses not only an innate desire for belonging but also a vehement and sometimes violent “instinct to exclude.” Some groups organize for noble purposes, others because of a common enemy. In Chua’s assessment, the United States, in both foreign and domestic policies, has failed to fully understand the importance of these powerful bonds of group identity.
Unlike the students using their one-in-a-million chance at a Yale Law degree to help members of a different tribe for short-term gain, Amy Chua at least understands politics. I might not enjoy Chua’s company if I met her, but I respect her honesty and clear-sightedness.
Why Children Follow Rules focuses upon legal socialization, outlining what is known about the process across three related, but distinct, contexts: the family, the school, and the juvenile justice system. Throughout, Tom Tyler and Rick Trinkner emphasize the degree to which individuals develop their orientations toward law and legal authority based upon values connected to responsibility and obligation, as opposed to fear of punishment. They argue that authorities can act in ways that internalize legal values and promote supportive attitudes. In particular, consensual legal authority is linked to three issues: how authorities make decisions, how they treat people, and whether they recognize the boundaries of their authority. When individuals experience authority that is fair, respectful, and aware of the limits of power, they are more likely to consent and follow directives.
Despite clear evidence showing the benefits of consensual authority, strong pressures and popular support for the exercise of authority based on dominance and force persist in America’s families, schools, and juvenile justice system. As the currently low levels of public trust and confidence in the police, the courts, and the law undermine the effectiveness of our legal system, Tom Tyler and Rick Trinkner point to alternative ways to foster the popular legitimacy of the law in an era of mistrust.
Speaking as a parent… I understand where Tyler is coming from. If I act in a way that doesn’t inspire my children to see me as a fair, god-like arbitrator of justice, then they are more likely to see me as an unjust tyrant who should be disobeyed and overthrown.
On the other hand, sometimes things are against the rules for reasons kids don’t understand. One of my kids, when he was little, thought turning the dishwasher off was the funniest thing and would laugh all the way through time-out. Easy solution: I didn’t turn it on when he was in the room, and he forgot about it. Tougher problem: one of the kids thought climbing on the stove to get to the microwave was a good idea. Time-outs didn’t work. Explaining “the stove is hot sometimes” didn’t work. Only force solved this problem.
Some people will accept your authority. Some people can reason their way to “We should cooperate and respect the social contract so we can live in peace.” And some people DON’T CARE no matter what.
So I agree that police, courts, etc., should act justly and not abuse their powers, and I can pull up plenty of examples of cases where they did. But I am afraid this is not a complete framework for dealing with criminals and legal socialization.