An Historian of Science Discovers the “Light Switch”

Historian and philosopher Emma Houston interrogates the science of electricity and light bulbs

The light switch was at the center of Houston’s first big foray into the history of electricity and lights. The story told in introductory electrical engineering textbooks is relatively simple: flipping the switch up turns the lights “on”; flipping the switch down turns the lights “off.” Whether a switch is in the on or off position has for decades been seen as an expression of a light bulb’s “true” state or of “light itself.” It is the job of a science historian to discover where these stories come from, and why.

Houston’s doctoral dissertation, published in 2004 as Light Itself: The Search for On and Off in the Electric Circuit, does just this, tracing the history of the idea that electromagnetic radiation is turned “on” and “off” by switches found on the wall. Early in the twentieth century, she shows, it was controversial to refer to “light switches” because sometimes electricians accidentally wired around them when installing lights.

But the fact that switches are visible (unlike electricity) made them useful enough to two groups of engineers–those building electrical circuits, and those working to untangle the role of electricity in light generation–that the association between switches and light solidified for decades.

Associating “light” with the “light switch,” writes Houston, has serious consequences, as when engineers tried to develop a “super flashlight” that used two light switches and multiple batteries.

The “super flashlight” was finally abandoned in the development stage when engineers decided it was simpler to use bigger batteries, but in Light Itself, Houston argues that it made the light switch the star of electrical engineering in a way that still reverberates. She points to engineers like Professor Book, whose research focused for decades on using light switches to design home lighting plans. Such a focus was not inevitable, Houston argues: from the 1920s through the 50s, based on evidence in lasers, researchers saw buttons as drivers of light output.

It turns out that “light switches” do not actually cause bulbs to emit electromagnetic radiation. Engineers now understand that light, produced by incandescent bulbs as well as LEDs and compact fluorescents, is the result of numerous interconnected capacitors, resistors, power sources, and wire circuits that all work together. So-called “light switches” do not cause light at all–they merely open and close light circuits, allowing electricity to flow (or not) to the bulbs.
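For the non-engineers, the distinction is easy to model. Here is a toy Python sketch (my own illustration, with made-up names–nothing from Houston’s book) in which the switch merely gates whether current reaches the bulb:

# A toy model (illustrative only): the switch opens and closes
# the circuit; the bulb is what actually emits.
class Bulb:
    def emit(self) -> str:
        # Only the bulb produces electromagnetic radiation.
        return "light"

class Circuit:
    def __init__(self, bulb: Bulb):
        self.bulb = bulb
        self.closed = False  # switch "off": circuit open, no current

    def flip_switch(self) -> None:
        # Flipping the switch only toggles whether current can flow.
        self.closed = not self.closed

    def observe(self) -> str:
        # Light appears only when the circuit delivers current to the bulb.
        return self.bulb.emit() if self.closed else "darkness"

circuit = Circuit(Bulb())
print(circuit.observe())  # darkness
circuit.flip_switch()     # the switch didn't make light; it closed the circuit
print(circuit.observe())  # light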

But in an interview, Professor Book disagrees with Houston’s account. In Houston’s history, the super flashlight looms large in later researchers’ decision to focus on the switch, but Professor Book responds that research on the super flashlight “did not interest me, it did not impress me, it did not look like the foundations of a path forward.” Building circuits around the light switch, he says, was not inspired by the popular image of a super bright flashlight with two switches, but “was simply the easiest way to design practical lighting for people’s houses,” adding that “flipping the switch does actually turn the lights on and off.”

Houston responds that of course we can’t expect actual engineers to know what inspired them or their fields, which is why we need science historians like herself to suss out what was really motivating them.

Author’s note: Professor Houston has degrees in philosophy and literature, but oddly, none in engineering or physics.

This parody is thanks to Harvard Magazine’s The Science of Sex: Historian and Philosopher Sarah Richardson Interrogates the Science of Sex and Gender.


Donna Zuckerberg and Knowledge Production vs. Knowledge Community

Donna Zuckerberg–that is, Mark Zuckerberg’s less famous sibling–recently published a book titled Not All Dead White Men: Classics and Misogyny in the Digital Age. This is a curious book–to quote from the summary on Amazon:

A virulent strain of antifeminism is thriving online that treats women’s empowerment as a mortal threat to men and to the integrity of Western civilization. Its proponents cite ancient Greek and Latin texts to support their claims―arguing that they articulate a model of masculinity that sustained generations but is now under siege.

Donna Zuckerberg dives deep into the virtual communities of the far right, where men lament their loss of power and privilege and strategize about how to reclaim them. She finds, mixed in with weightlifting tips and misogynistic vitriol, the words of the Stoics deployed to support an ideal vision of masculine life. On other sites, pickup artists quote Ovid’s Ars Amatoria to justify ignoring women’s boundaries. By appropriating the Classics, these men lend a veneer of intellectual authority and ancient wisdom to their project of patriarchal white supremacy. In defense or retaliation, feminists have also taken up the Classics online, to counter the sanctioning of violence against women.

Relief of Herodotus, Cour Carrée, Louvre
“If someone were to put a proposition before men bidding them to choose, after examination, the best customs in the world, each nation would certainly select its own”–Herodotus

Translation: “I read a blog and I didn’t like it.”

So Donna Zuckerberg, a white woman with enough wealth and leisure to study the classics for a living and sister of one of the richest, most powerful men in the world (who also loves the classics so much that he has named his daughters “Maxima,” Latin for “greatest,”* and “August,” after Emperor Augustus,) is complaining that Losers on the Internet are sullying the Classics by quoting Ovid.

This is a problem because White Men on the Internet are Privileged (even when they are poor whites who struggle to get a job or even friends,) while rich white women like Donna are the Oppressed.

*(Maxima is also named after two relatives named “Max,” though if honoring relatives were the only motive, Zuck could have gone with “Maxine,” or named her after a female relative.)

Realistically, these men aren’t a threat to Mrs. Zuckerberg; they aren’t going to rise up and force her back into the kitchen, barefoot and pregnant. They are, however, icky, and Donna obviously doesn’t like them impinging on her turf: “By appropriating the Classics, these men lend a veneer of intellectual authority and ancient wisdom to their project of patriarchal white supremacy.”

Appropriating from whom? What culture owns Ovid and Homer? These books are considered the foundation of all of Western Civilization. Is Heartiste not a part of Western Civilization? I suppose you could argue that Roosh is Iranian/Armenian by blood, (despite being born in the US,) but arguing that Roosh can’t enjoy Ovid because he’s Iranian is, well, stupid.

I understand that Mrs. Zuckerberg doesn’t like pickup blogs, but you can’t appropriate the intellectual and literary foundations of your own culture. This is like accusing a Hindu of appropriating the Bhagavad Gita just because he’s a jerk.

‘Mask of Agamemnon’, discovered by Schliemann, 1876.
“They sent forth men to battle,
But no such men return;
And home, to claim their welcome,
Come ashes in an urn.” –Aeschylus, Agamemnon

The implication of “appropriating” is that Donna thinks the classics belong to some narrow class of people–most likely, academic dilettantes like herself. But as I’ve noted before, Donna Zuckerberg doesn’t own the Classics. Being rich doesn’t give her any more right to quote Plato than anyone else in the entire damn world.

But my complaints aside, I think this nicely illustrates a difficulty found in many academic disciplines:

It’s very difficult to make any new arguments about the Classics. Ovid has been around for a long time. So has Homer. Everything you can say about them has probably been said a thousand times already.

Schliemann managed to up the ante by actually finding Troy, but what’s left to discover? You will never be as great as Schliemann. You will always toil in the shadows of the greats of the past.

But there are rules in academia, most notably, “Publish or perish.” If you want to be a professor or otherwise taken seriously as an academic, you’ve got to publish papers.

What, exactly, are you going to publish on a subject that was thoroughly mined for all new ideas and concepts hundreds of years ago?

Are we to believe the Egyptians managed to manufacture pigment from calcium copper silicate and use it in these elaborate paintings without being able to see it?

So I see two options:

1. Lie. Just make something up, like “the ancients couldn’t see blue.” Totally untrue, but people have bought it, hook, line, and sinker.

2. Write things that aren’t new and don’t provide any new insights, but show that you are a member of the “classics community.”

We think of academic disciplines as “producing knowledge,” but it may be more accurate to think of them as “knowledge communities.” To be part of those communities, all you have to do is produce works that show what a good community member you are. People who fit in get friends, mentors, promotions, and opportunities. People who don’t fit in either get pushed out or leave of their own accord. There’s not much new to say about the Classics, but there are plenty of people who enjoy reading the classics and discussing them with others–and that makes a community, and where there’s a community, people will try to protect what is culturally “theirs.” Folks like Roosh and Heartiste, then, are moving in on academic territory.

What counts as being a “good member” of your community depends on the current social norms in that community. If your community is full of people who say things like “The Classics are the foundation for the greatness of Western Civilization,” then aspirant community members will publish things echoing that.

And if your community is full of people who say things like “If your feminism isn’t intersectional, it’s bullshit,” then you’re going to write things like that.

Herodotus’s World
“After all, no one is stupid enough to prefer war to peace; in peace sons bury their fathers and in war fathers bury their sons.” –Herodotus

Modern academia is not really comfortable with “Dead white males”* (much less “Alive white males,”) nor the idea of Western Civilization as anything particularly special or qualitatively different from other civilizations–which creates a bit of a conflict when your field is literally the semi-symbolic and literary basis of Western Civilization.

*Note: most people who study the classics know that the “Classical World” is really the circum-Mediterranean world, that Herodotus lived in now-Turkey, St. Augustine was born in now-Algeria, Alexander the Great’s empire stretched to India, etc. Whether these men were “white” (or men) is irrelevant to our understanding of the foundations of Western Civilization.

Now, I understand not liking everyone you meet on the internet. There are lots of wrong and terrible people in here. But this is why you get a blog where you can complain to the five people who can stand you about all of the other annoying people on the internet.

There are probably many academic disciplines which could, at this point, be transformed into blogs and tumblrs without much loss.

Cathedral Round-Up: You can have my towel when you pry it from my cold, dead hands

We’re going to kick off today’s Cathedral Round-Up with a trip down memory lane.

This may come as some surprise, given my scintillating wit and gregarious nature, but I was not popular in school. If there was a social totem pole, I was a mud puddle about twenty yards to the left of the pole.

The first time I felt like I truly fit in–I belonged–was at nerd camp. This was a sort of summer camp your parents send you to when you’ve failed at Scouting and they hope maybe you’ll pick up chemistry or philosophy instead.

One evening, when I was gathered in the dorm with my new friends, a girl burst triumphantly into our midst, brandishing a book. “I have it,” she triumphed. “I have it! The book!”

The Book, which we all proceeded to read, and after camp ended, to discuss in what were my very first emails, was The Hitchhiker’s Guide to the Galaxy.

Over at Human Resource Executives, McIlvane reports on a new study by Stanford’s Correll and Wynn:

An interesting new study from Stanford University finds that company recruiters from tech firms may be putting off female college grads through their behavior—some of it a bit questionable. …

The researchers found that during their informational presentations, the recruiters—no doubt in an attempt to bond with their audiences—frequently referenced “geek culture favorites” such as Star Trek and The Hitchhikers Guide to the Galaxy, focused the conversation exclusively on highly technical aspects of the roles or referred to high school coding experience. …

As diversity experts have pointed out before, geek culture references tend to resonate most strongly with white men while women tend to feel excluded by that culture.

In case you haven’t noticed or this is your first time visiting my humble blog, I am female. All of my friends at camp were female.

“Through gender-imbalanced presenter roles, geek culture references, overt use of gender stereotypes, and other gendered speech and actions, representatives may puncture the pipeline, lessening the interest of women at the point of recruitment into technology careers,” the researchers write.

Dear Diversity Experts: In the words of the first real friend I ever had, please disembowel yourselves with a rusty spoon.

The study itself is not easily available online, so I will respectfully judge them based on summaries in HRE and Wired.

Short version: A couple of sociologist “gender researchers,” who of course know STEM culture very well, sat in on tech company recruiting sessions at Stanford and discovered that nerds talk about nerd things, OMG EWWW, and concluded that icky nerds doing their nerd thing in public is why women decide to go apply for more prestigious jobs elsewhere.

Now, I understand what it’s like not to get someone else’s references. I haven’t seen Breaking Bad, NCIS, Sex and the City, Seinfeld, The Simpsons, or the past X Star Wars installments. I don’t watch sports, play golf, or drink alcohol.

But I don’t go around complaining that other people need to stop talking about things that interest them and just talk about stuff that interests me. It doesn’t bother me that other people have their interests, because I have plenty of room over here on my end of the internet to talk about mine.

But apparently these “Diversity Experts” think that the cultural icons of my childhood need to be expunged from conversation just to make people like them feel more comfortable.

Dear Correll and Wynn: when people like you stop assuming that everyone in your vicinity is interested in hearing about wine and yoga and golf, I’ll stop assuming that people who show some interest in my culture are interested in The Hitchhiker’s Guide to the Galaxy.

Notice that the problem here is not that the women are being turned away, or discriminated against, or receiving fewer callbacks than male applicants. No, the problem is that the women think geek culture is icky and so don’t even bother to apply. They have decided that they have better options, but since someone decided that it is imperative that all professions be 50% women (except plumbing, sewer workers, truckers, etc.) they must somehow be tricked into going into their second-choice field.

No one seems to have thought to, ahem, consult the actual women who work in Tech or who have STEM degrees or are otherwise associated with the field about whether or not they thought these sorts of geek cultural references were off-putting. No, we do not exist in Correll and Wynn’s world; or perhaps, because our numbers are low, there just aren’t enough of us to matter.

STEM/tech exists in this weird limbo where women abstractly want more women in it, but don’t actually want to be the women in it. Take Wynn. She has a degree in English. She could have majored in Chemistry, but chose not to. Now she whines that there aren’t enough female engineers.

People routinely denigrate law and lawyers. Lawyers are the butt of many jokes, and people claim to hate lawyers, but lawyers themselves are treated with a great deal of courtesy and respect, and have no difficulties on the dating market.

STEM works inversely: people claim to hold scientists and mathematicians in great respect, but in practice they are much lower on the social totem pole. Lots of people would like good grades in math, but don’t want to hang out with the kid who does get good grades in math.

So feminists want women to be acknowledged as equally capable with men at things like “math” and “winning Nobel Prizes” and “becoming billionaire CEOs” (hey, I want those things, too,) but don’t want to do the grunt work that is most of what people in STEM fields actually do. They don’t want to spend their days around sweaty guys who talk about Linux kernels or running around as lab assistant #3. For a lot of people, tech jobs are not only kind of boring and frustrating, but don’t even pay that well, considering all of the education involved in getting them.

The result is a lot of concern trolling from people who claim to want more women in STEM, but don’t want to address the underlying problems for why most women aren’t all that interested in STEM in the first place.

Are there real problems for women in STEM? Maybe. I have female commenters who can tell you about the difficulties they’ve had in STEM communities. It is different being a female in a male-dominated field than being female in a balanced or female-dominated field, and this has its downsides. But “men said nerd things” or “men referenced porn” is not even remotely problematic. (I will note that men have problems in STEM fields, too.)

While we’re here, I’d like to talk about these “Diversity Experts” whom HRE cites as proof for their claims that women find geek culture off-putting. Their link heads not to a study on the subject, nor even to an actual expert on anything, but to an opinion piece by Kerry Flynn on Mashable:

The lack of diversity in tech isn’t a new issue, and yet top leaders in Silicon Valley still struggle to talk about it.

They struggle so much that this is an entire article about a female CEO talking about it. Talking openly about a thing is the same as struggling to talk about it, right?

The latest stumble comes from YouTube CEO Susan Wojcicki speaking with MSNBC’s Ari Melber and Recode’s Kara Swisher at the media companies’ first town hall titled “Revolution: Google and YouTube Changing the World,” which aired Sunday.

The latest stumble, ladies and gents! Wojcicki might be a female CEO of a tech giant, but what the hell does she know? Kerry Flynn knows much better than she does. Wojcicki had better shape up to Flynn’s standards, because Flynn is keeping track, ladies and gents.

According to Wojcicki, one reason for the lack of women in tech is its reputation for being a “very geeky male industry.”

Ouch.

That kind of statement makes it seem like Wojcicki has forgotten about the diverse and minority perspectives that are fighting for representation in the industry. For instance, with the #ILookLikeAnEngineer campaign, engineer Isis Wenger wrote about the sexism she faced working in tech and inspired a movement of women shutting down stereotypes.

See, women and minorities are trying to counter the perception of tech being a “very geeky male industry,” which Wojcicki obviously forgot about when she claimed that tech has a reputation for being a “very geeky male industry.”

Kerry Flynn is very stupid.

The entire article goes on in this vein and it’s all awful. Nowhere does Flynn prove anything about women not liking The Hitchhiker’s Guide to the Galaxy.

***

What other interesting articles does Stanford Magazine hold for us?

So what happens when you send your kids to Stanford? Stanford Magazine has helpful interviews with recent grads. Yeji Jung got enmeshed in Social Justice, changed her major from pre-med to “comparative studies in race and ethnicity,” graduated, and went home to her parents to make collages.

I searched for Yeji Jung’s art, which is supposed to be making the world a better and more just place, and found almost nothing. This red cabbage and the lips in the Stanford Mag article are it. This does not look promising.

I bet her parents are very glad they worked their butts off for years making sure their kid got all As in her classes and aced the SAT so she could come home from Stanford and paste paper together.

A quote from the article:

A thesis project to investigate the links between her Korean-American identity and the experiences of her Korean grandmothers took her to Seoul, South Korea, and Manassas, Va., to interview them in Korean.

Wait, you can get a degree from Stanford by interviewing your grandparents? Dude, I call my grandma every weekend! That should be worth at least a master’s.

“[My grandmothers’] lives are so deeply gendered in a way that I just have not experienced as someone who grew up in the U.S. One of my interview questions was framed as, ‘What did you study in college?’ [My grandmother in Virginia said,] ‘Oh, I didn’t go to college — girls in that day didn’t go to college. We went to work.’ That was a moment for me of, ‘Wow, I just have these assumptions about my life that are not a given.’

Girls in my grandmothers’ day went to college. Both of mine went to college. One of them earned a PhD in a STEM field; the other became a teacher. Teacher was a pretty common profession for women in my grandmother’s day. So was nurse.

I can take that a step further: my great-grandmother went to college.

Perhaps she meant that girls in Korea didn’t go to college in those days, though I’m sure Korea needed plenty of nurses about 70 years ago, and frankly I’m not sure many men were going to college in those days.

I often idly wonder if elites push SJW nonsense to remove competitors. Yeji Jung is probably a very bright young woman who would have made an excellent doctor or medical researcher. Instead she has shuffled off to irrelevance.

Maybe America is too Dumb for Democracy: A Review of Nichols’s The Death of Expertise

For today’s Cathedral Round-Up, I finally kept my commitment to review Tom Nichols’s The Death of Expertise: The Campaign against Established Knowledge and Why it Matters. It was better than I expected, (though that isn’t saying much.)

Make no mistake: Nichols is annoyingly arrogant. He draws a rather stark line between “experts” (who know things) and everyone else (who should humbly limit themselves to voting between options defined for them by the experts.) He implores people to better educate themselves in order to be better voters, but has little patience for autodidacts and bloggers like myself who are actually trying.

But arrogance alone doesn’t make someone wrong.

Nichols’s first thesis is simple: most people are too stupid or ignorant to second-guess experts or even contribute meaningfully to modern policy discussions. How can people who can’t find Ukraine on a map or think we should bomb the fictional city of Agrabah contribute in any meaningful way to a discussion of international policy?

It was one thing, in 1776, to think the average American could vote meaningfully on the issues of the day–a right they took by force, by shooting anyone who told them they couldn’t. Life was less complicated in 1776, and the average person could master most of the skills they needed to survive (indeed, pioneers on the edge of the frontier had to be mostly self-sufficient in order to survive.) Life was hard–most people engaged in long hours of heavy labor plowing fields, chopping wood, harvesting crops, and hauling necessities–but could be mastered by people who hadn’t graduated from elementary school.

But the modern industrial (or post-industrial) world is much more complicated than the one our ancestors grew up in. Today we have cars (maybe even self-driving cars), electrical grids and sewer systems, atomic bombs and fast food. The speed of communication and transportation has made it possible to chat with people on the other side of the earth and show up on their doorstep a day later. The amount of specialized, technical knowledge necessary to keep modern society running would astonish the average caveman–even with 15+ years of schooling, the average person can no longer build a house, nor even produce basic necessities like clothes or food. Most of us can’t even make a pencil.

Even experts who are actually knowledgeable about their particular area may be completely ignorant of fields outside of their expertise. Nichols speaks Russian, which makes him an expert in certain Russian-related matters, but he probably knows nothing about optimal high-speed rail networks. And herein lies the problem:

The American attachment to intellectual self-reliance described by Tocqueville survived for nearly a century before falling under a series of assaults from both within and without. Technology, universal secondary education, the proliferation of specialized expertise, and the emergence of the United States as a global power in the mid-twentieth century all undermined the idea… that the average American was adequately equipped either for the challenges of daily life or for running the affairs of a large country.

… the political scientist Richard Hofstadter wrote that “the complexity of modern life has steadily whittled away the functions the ordinary citizen can intelligently and competently perform for himself.”

… Somin wrote in 2015 that the “size and complexity of government” have made it “more difficult for voters with limited knowledge to monitor and evaluate the government’s many activities. The result is a polity in which the people often cannot exercise their sovereignty responsibly and effectively.”

In other words, society is now too complex and people too stupid for democracy.

Nichols’s second thesis is that people used to trust experts, which let democracy function, but today they are less trusting. He offers no evidence other than his general conviction that this change has happened.

He does, however, detail 1. how people have been given inflated egos about their own intelligence, and 2. how our information-delivery system has degenerated into misinformational goo, resulting in the trust problems he believes we are having. These are interesting arguments and worth examining.

A bit of summary:

Indeed, maybe the death of expertise is a sign of progress. Educated professionals, after all, no longer have a stranglehold on knowledge. The secrets of life are no longer hidden in giant marble mausoleums… in the past, there was less tension between experts and laypeople, but only because citizens were simply unable to challenge experts in any substantive way. …

Participation in political, intellectual, and scientific life until the early twentieth century was far more circumscribed, with debates about science, philosophy, and public policy all conducted by a small circle of educated males with pen and ink. Those were not exactly the Good Old Days, and they weren’t that long ago. The time when most people didn’t finish high school, when very few went to college, and only a tiny fraction of the population entered professions is still within living memory of many Americans.

Aside from Nichols’s insistence that he believes modern American notions about gender and racial equality, I get the impression that he wouldn’t mind the Good Old Days of genteel pen-and-ink discussions between intellectuals. However, I question his claim that participation in political life was far more circumscribed–after all, people voted, and politicians liked getting people to vote for them. People anywhere, even illiterate peasants on the frontier or up in the mountains, like to gather and debate about God, politics, and the meaning of life. The question is less “Did they discuss it?” and more “Did their discussions have any effect on politics?” Certainly we can point to abolition, women’s suffrage, prohibition, and the Revolution itself as heavily grass-roots movements.

But continuing with Nichols’s argument:

Social changes only in the past half century finally broke down old barriers of race, class, and sex not only between Americans in general but also between uneducated citizens and elite experts in particular. A wider circle of debate meant more knowledge but more social friction. Universal education, the greater empowerment of women and minorities, the growth of a middle class, and increased social mobility all threw a minority of experts and the majority of citizens into direct contact, after nearly two centuries in which they rarely had to interact with each other.

And yet the result has not been a greater respect for knowledge, but the growth of an irrational conviction among Americans that everyone is as smart as everyone else.

Nichols is distracting himself with the reflexive racial argument; the important change he is highlighting isn’t social but technical.

I’d like to quote a short exchange from Our Southern Highlanders, an anthropological-style text written about Appalachia about a century ago:

The mountain clergy, as a general rule, are hostile to “book larnin’,” for “there ain’t no Holy Ghost in it.” One of them who had spent three months at a theological school told President Frost, “Yes, the seminary is a good place ter go and git rested up, but ’tain’t worth while fer me ter go thar no more ’s long as I’ve got good wind.”

It used to amuse me to explain how I knew that the earth was a sphere; but one day, when I was busy, a tiresome old preacher put the everlasting question to me: “Do you believe the earth is round?” An impish perversity seized me and I answered, “No—all blamed humbug!” “Amen!” cried my delighted catechist, “I knowed in reason you had more sense.”

But back to Nichols, who really likes the concept of expertise:

One reason claims of expertise grate on people in a democracy is that specialization is necessarily exclusive. When we study a certain area of knowledge or spend our lives in a particular occupation, we not only forgo expertise in other jobs or subjects, but also trust that other people in the community know what they’re doing in their area as surely as we do in our own. As much as we might want to go up to the cockpit after the engine flames out to give the pilots some helpful tips, we assume–in part, because we have to–that they’re better able to cope with the problem than we are. Otherwise, our highly evolved society breaks down into islands of incoherence, where we spend our time in poorly informed second-guessing instead of trusting each other.

This would be a good point to look at data on overall trust levels, friendship, civic engagement, etc. (It’s down. It’s all down.) and maybe some explanations for these changes.

Nichols talks briefly about the accreditation and verification process for producing “experts,” which he rather likes. There is an interesting discussion in the economics literature on things like the economics of trust and information (how do websites signal that they are trustworthy enough that you will give them your credit card number and expect to receive items you ordered a few days later?) which could apply here, too.

Nichols then explores a variety of cognitive biases, such as superstitions, phobias, and conspiracy theories:

Conspiracy theories are also a way for people to give meaning to events that frighten them. Without a coherent explanation for why terrible things happen to innocent people, they would have to accept such occurrences as nothing more than the random cruelty either of an uncaring universe or an incomprehensible deity. …

The only way out of this dilemma is to imagine a world in which our troubles are the fault of powerful people who had it within their power to avert such misery. …

Just as individuals facing grief and confusion look for reasons where none may exist, so, too, will entire societies gravitate toward outlandish theories when collectively subjected to a terrible national experience. Conspiracy theories and the flawed reasoning behind them… become especially seductive “in any society that has suffered an epic, collectively felt trauma. In the aftermath, millions of people find themselves casting about for an answer to the ancient question of why bad things happen to good people.” …

Today, conspiracy theories are a reaction mostly to the economic and social dislocations of globalization… This is not a trivial obstacle when it comes to the problems of expert engagement with the public: nearly 30 percent of Americans, for example, think “a secretive elite with a globalist agenda is conspiring to eventually rule the world” …

Obviously stupid. A not-secret elite with a globalist agenda already rules the world.

and 15 percent think media or government add secret mind-controlling technology to TV broadcasts. (Another 15 percent aren’t sure about the TV issue.)

It’s called “advertising” and it wants you to buy a Ford.

Anyway, the problem with conspiracy theories is they are unfalsifiable; no amount of evidence will ever convince a conspiracy theorist that he is wrong, for all evidence is just further proof of how nefariously “they” are constructing the conspiracy.

Then Nichols gets into some interesting matter on the difference between stereotypes and generalizations, which segues nicely into a tangent I’d like to discuss, but it probably deserves its own post. To summarize:

Sometimes experts know things that contradict other people’s political (or religious) beliefs… If an “expert” finding or field accords with established liberal values, e.g., the implicit association test found that “everyone is a little bit racist,” which liberals already believed, then there is an easy mesh between what the academics believe and what the rest of their social class believes.

If their findings contradict conservative/low-class values, e.g., when professors assert that evolution is true and “those low-class Bible-thumpers in Oklahoma are wrong,” sure, they might have a lot of people who disagree with them, but those people aren’t part of their own social class/the upper class, and so not a problem. If anything, high-class folks love such findings, because it gives them a chance to talk about how much better they are than those low-class people (though such class conflict is obviously poisonous in a democracy where those low-class people can still vote to Fuck You and Your Global Warming, Too.)

But if the findings contradict high-class/liberal politics, then the experts have a real problem. E.g., if that same evolution professor turns around and says, “By the way, race is definitely biologically real, and there are statistical differences in average IQ between the races,” now he’s contradicting the political values of his own class/the upper class, and that becomes a social issue and he is likely to get Watsoned.

For years folks at Fox News (and talk radio) have lambasted “the media” even though they are part of the media; SSC recently discussed “can something be both popular and silenced?”

Jordan Peterson isn’t unpopular or “silenced” so much as he is disliked by upper class folks and liked by “losers” and low class folks, despite the fact that he is basically an intellectual guy and isn’t peddling a low-class product. Likewise, Fox News is just as much part of The Media as NPR, (if anything, it’s much more of the Media) but NPR is higher class than Fox, and Fox doesn’t like feeling like its opinions are being judged along this class axis.

For better or for worse (mostly worse) class politics and political/religious beliefs strongly affect our opinions of “experts,” especially those who say things we disagree with.

But back to Nichols: Dunning-Kruger effect, fake cultural literacy, and too many people at college. Nichols is a professor and has seen college students up close and personal, and has a low opinion of most of them. The massive expansion of higher education has not resulted in a better-educated, smarter populace, he argues, but a populace armed with expensive certificates that show they sat around a college for four years without learning much of anything. Unfortunately, beyond a certain level, there isn’t a lot that more school can do to increase people’s basic aptitudes.

Colleges get money by attracting students, which incentivizes them to hand out degrees like candy–in other words, students are being lied to about their abilities, and college degrees are fast becoming the participation trophies for the not very bright.

Nichols has little sympathy for modern students:

Today, by contrast, students explode over imagined slights that are not even remotely in the same category as fighting for civil rights or being sent to war. Students now build majestic Everests from the smallest molehills, and they descend into hysteria over pranks and hoaxes. In the midst of it all, the students are learning that emotions and volume can always defeat reason and substance, thus building about themselves fortresses that no future teacher, expert, or intellectual will ever be able to breach.

At Yale in 2015, for example, a house master’s wife had the temerity to tell minority students to ignore Halloween costumes they thought offensive. This provoked a campus-wide temper tantrum that included professors being shouted down by screaming students. “In your position as master,” one student howled in a professor’s face, “it is your job to create a place of comfort and home for the students… Do you understand that?!”

Quietly, the professor said, “No, I don’t agree with that,” and the student unloaded on him:

“Then why the [expletive] did you accept the position?! Who the [expletive] hired you?! You should step down! If that is what you think about being a master you should step down! It is not about creating an intellectual space! It is not! Do you understand that? It’s about creating a home here. You are not doing that!” [emphasis added]

Yale, instead of disciplining students in violation of their own norms of academic discourse, apologized to the tantrum throwers. The house master eventually resigned from his residential post…

To faculty everywhere, the lesson was obvious: the campus of a top university is not a place for intellectual exploration. It is a luxury home, rented for four to six years, nine months at a time, by children of the elite who may shout at faculty as if they’re berating clumsy maids in a colonial mansion.

The incidents Nichols cites (and similar ones elsewhere) are not just matters of college students being dumb or entitled, but explicitly racial conflicts. The demand for “safe spaces” is easy to ridicule on the grounds that students are emotional babies, but this misses the point: students are carving out territory for themselves on explicitly racial lines, often by violence.

Nichols, though, either does not notice the racial aspect of modern campus conflicts or does not want to admit publicly to doing so.

Nichols moves on to blame TV, especially CNN, talk radio, and the internet for dumbing down the quality of discourse by overwhelming us with a deluge of more information than we can possibly process.

Referring back to Auerswald and The Code Economy: if automation creates a bifurcation in industries, replacing a moderately-priced, moderately available product with a stream of cheap, low-quality products on the one hand and a trickle of expensive, high-quality products on the other, then good-quality journalism has been replaced with a flood of low-quality crap. The high-quality end is still working itself out.

Nichols opines:

Accessing the Internet can actually make people dumber than if they had never engaged a subject at all. The very act of searching for information makes people think they’ve learned something, when in fact they’re more likely to be immersed in yet more data they do not understand. …

When a group of experimental psychologists at Yale investigated how people use the internet, they found that “people who search for information on the Web emerge from the process with an inflated sense of how much they know–even regarding topics that are unrelated to the ones they Googled.” …

How can exposure to so much information fail to produce at least some kind of increased baseline of knowledge, if only by electronic osmosis? How can people read so much yet retain so little? The answer is simple: few people are actually reading what they find.

As a University College London (UCL) study found, people don’t actually read the articles they encounter during a search on the Internet. Instead, they glance at the top line or the first few sentences and then move on. Internet users, the researchers noted, “are not reading online in the traditional sense; indeed, there are signs that new forms of ‘reading’ are emerging as users ‘power browse’ horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.”

The internet’s demands for instant updates, for whatever headlines generate the most clicks (and thus advertising revenue), have upset the balance of speed vs. expertise in the newsroom. Reporters no longer have any incentive to spend long hours carefully writing a well-researched story when such stories pay less than clickbait headlines about racist pet costumes and celebrity tweets.

I realize it seems churlish to complain about the feast of news and information brought to us by the Information Age, but I’m going to complain anyway. Changes in journalism, like the increased access to the Internet and to college education, have unexpectedly corrosive effects on the relationship between laypeople and experts. Instead of making people better informed, much of what passes for news in the twenty-first century often leaves laypeople–and sometimes experts–even more confused and ornery.

Experts face a vexing challenge: there’s more news available, and yet people seem less informed, a trend that goes back at least a quarter century. Paradoxically, it is a problem that is worsening rather than dissipating. …

As long ago as 1990, for example, a study conducted by the Pew Trust warned that disengagement from important public questions was actually worse among people under thirty, the group that should have been most receptive to then-emerging sources of information like cable television and electronic media. This was a distinct change in American civic culture, as the Pew study noted:

“Over most of the past five decades younger members of the public have been at least as well informed as older people. In 1990, that is no longer the case. … “

Those respondents are now themselves middle-aged, and their children are faring no better.

If you were 30 in 1990, you were born in 1960, to parents who were between the ages of 20 and 40 years old, that is, born between 1920 and 1940.

Source: Audacious Epigone

Fertility for the 1920-1940 cohort was strongly dysgenic. So was the 1940-50 cohort. The 1900-1919 cohort at least had the Flynn Effect on their side, but later cohorts just look like an advertisement for idiocracy.

Nichols ends with a plea that voters respect experts (and that experts, in turn, be humble and polite to voters.) After all, modern society is too complicated for any of us to be experts on everything. If we don’t pay attention to expert advice, he warns, modern society is bound to end in ignorant goo.

The logical inconsistency is that Nichols believes in democracy at all–he thinks democracy can be saved if ignorant people vote within a range of options as defined by experts like himself, e.g., “What vaccine options are best?” rather than “Should we have vaccines at all?”

The problem, then, is that whoever controls the experts (or controls which expert opinions people hear) controls the limits of policy debates. This leads to people arguing over experts, which leads right back to where we are today. As long as there are politics, “expertise” will be politicized, e.g.:

Look at any court case in which both sides bring in their own “expert” witnesses. Both experts testify to the effect that their side is correct. Then the jury is left to vote on which side had more believable experts. This is like best-case-scenario voting, and the fact that the voters are dumb and don’t understand what the experts are saying and are obviously being misled in many cases is still a huge problem.

If politics is the problem, then perhaps getting rid of politics is the solution. Just have a bunch of Singapores run by Lee Kuan Yews, let folks like Nichols advise them, and let the common people “vote with their feet” by moving to the best states.

The problem with this solution is that “exit” doesn’t exist in the modern world in any meaningful way, and there are significant reasons why ordinary people oppose open borders.

Conclusion: 3/5 stars. It’s not a terrible book, and Nichols has plenty of good points, but “Americans are dumb” isn’t exactly fresh territory and much has already been written on the subject.

Cathedral Round-Up: Checking in with the Bright Minds at Yale Law

Yale Law’s Coat of Arms

Yale Law is the most prestigious law school in the entire US (Harvard Law is probably #2). YL’s professors, therefore, are some of the US’s top legal scholars; its students are likely to go on to be important lawyers, judges, and opinion-makers.

If you’re wondering about the coat of arms, it was designed in 1956 as a pun on the original three founders’ names: Seth Staples, (BA, Yale, 1797), Judge David Daggett aka Doget, (BA 1783), and Samuel Hitchcock, (BA, 1809), whose name isn’t really a pun but he’s Welsh and when Welsh people cross the Atlantic, their dragon transforms into a crocodile. (The Welsh dragon has also been transformed into a crocodile on the Jamaican coat of arms.)

(For the sake of Yale’s staple-bearing coat of arms, let us hope that none of the founders were immoral in any way, as Harvard’s were.)

So what have Yale’s luminaries been up to?

Professor Yaffe has a new book on Criminal Responsibility, titled The Age of Culpability: Children and the Nature of Criminal Responsibility. The blurb from Amazon:

Gideon Yaffe presents a theory of criminal responsibility according to which child criminals deserve leniency not because of their psychological, behavioural, or neural immaturity but because they are denied the vote. He argues that full shares of criminal punishment are deserved only by those who have a full share of say over the law.

The YLS Today article goes into more depth:

He proposes that children are owed lesser punishments because they are denied the right to vote. This conclusion is reached through accounts of the nature of criminal culpability, desert for wrongdoing, strength of legal reasons, and what it is to have a say over the law. The heart of this discussion is the theory of criminal culpability.

To be criminally culpable, Yaffe argues, is for one’s criminal act to manifest a failure to grant sufficient weight to the legal reasons to refrain. The stronger the legal reasons, then, the greater the criminal culpability. Those who lack a say over the law, it is argued, have weaker legal reasons to refrain from crime than those who have a say, according to the book. They are therefore reduced in criminal culpability and deserve lesser punishment for their crimes. Children are owed leniency, then, because of the political meaning of age rather than because of its psychological meaning. This position has implications for criminal justice policy, with respect to, among other things, the interrogation of children suspected of crimes and the enfranchisement of adult felons. …

He holds an A.B. in philosophy from Harvard and a Ph.D. in philosophy from Stanford.

I don’t think you need a degree in philosophy or law to realize that this is absolutely insane.

Even in countries where no one can vote, we still expect the government to try to do a good job of rounding up criminals so their citizens can live in peace, free from the fear of random violence. The notion that “murder is bad” wasn’t established by popular vote in the first place. Call it instinct, human nature, Natural Law, or the 6th Commandment–whatever it is, we all want murderers to be punished.

The point of punishing crime is 1. To deter criminals from committing crime; 2. To get criminals off the street; 3. To provide a sense of justice to those who have been harmed. These needs do not change depending on whether or not the person who committed the crime can vote. Why, if I wanted to commit a crime, should I hop the border into Canada and commit it there, then claim the Canadian courts should be lenient since I am not allowed to vote in Canada? Does the victim of a disenfranchised felon deserve less justice than the victim of someone who still had the right to vote?

Since this makes no sense at all from any sort of public safety or discouraging crime perspective, permit me a cynical theory: the author would like to lower the voting age, let immigrants (legal or not) vote more easily, and end disenfranchisement for felons.

Professor Moyn has a new book on human rights, Not Enough: Human Rights in an Unequal World. According to the Amazon blurb:

The age of human rights has been kindest to the rich. Even as state violations of political rights garnered unprecedented attention due to human rights campaigns, a commitment to material equality disappeared. In its place, market fundamentalism has emerged as the dominant force in national and global economies. In this provocative book, Samuel Moyn analyzes how and why we chose to make human rights our highest ideals while simultaneously neglecting the demands of a broader social and economic justice. …

In the wake of two world wars and the collapse of empires, new states tried to take welfare beyond its original European and American homelands and went so far as to challenge inequality on a global scale. But their plans were foiled as a neoliberal faith in markets triumphed instead.

As Yale puts it:

In a tightly-focused tour of the history of distributive ideals, Moyn invites a new and more layered understanding of the nature of human rights in our global present. From their origins in the Jacobin welfare state

Which chopped people’s heads off.

to our current neoliberal moment, Moyn tracks the subtle shifts in how human rights movements understood what, exactly, their high principles entailed.

Like not chopping people’s heads off?

Earlier visionaries imagined those rights as a call for distributive justice—a society which guaranteed a sufficient minimum of the good things in life. And they generally strove, even more boldly, to create a rough equality of circumstances, so that the rich would not tower over the rest.

By chopping their heads off.

Over time, however, these egalitarian ideas gave way. When transnational human rights became famous a few decades ago, they generally focused on civil liberties — or, at most, sufficient provision.

Maybe because executing the kulaks resulted in mass starvation, which seems kind of counter-productive in the sense of minimum sufficient provision for human life.

In our current age of human rights, Moyn comments, the pertinence of fairness beyond some bare minimum has largely been abandoned.

By the way:

From Human Progress

Huh. Why would anyone think that economic freedom and human well-being go hand-in-hand?

The Dramatic Decline in World Poverty, from CATO https://www.cato.org/blog/dramatic-decline-world-poverty

At the risk of getting Pinkerian, the age of “market fundamentalism” has involved massive improvements in human well-being, while every attempt to make society economically equal has caused mass starvation and horrible abuses against humans.

Moyn’s argument that we have abandoned “social justice” is absurd on its face; in the 1950s, the American south was still racially segregated; in the 1980s South Africa was still racially segregated. Today both are integrated and have had black presidents. In 1950, homosexuality was widely illegal; today gay marriage is legal in most Western nations. Even Saudi Arabia has decided to let women drive.

If we want to know why, absurdly, students believe that things have never been worse for racial minorities in America, maybe the answer is that the rot starts at the top.

In related news, Yale Law School Clinics Secure Third Nationwide Injunction:

The first ruling dramatically stopped the unconstitutional Muslim ban in January 2017, when students from the Worker and Immigrant Rights Advocacy Clinic (WIRAC) mobilized overnight to ground planes and free travelers who were being unjustly detained. The students’ work, along with co-counsel, secured the first nationwide injunction against the ban, and became the template for an army of lawyers around the country who gathered at airports to provide relief as the chaotic aftermath of the executive order unfolded.

Next came a major ruling in California in November 2017 in which a federal Judge granted a permanent injunction that prohibited the Trump Administration from denying funding to sanctuary cities—a major victory for students in the San Francisco Affirmative Litigation Project (SFALP) …

And on February 13, 2018, WIRAC secured yet another nationwide injunction—this time halting the abrupt termination of the Deferred Action for Childhood Arrivals program (DACA). … The preliminary injunction affirms protections for hundreds of thousands of Dreamers just weeks before the program was set to expire.

And Rule of Law Clinic files Suit over Census Preparations:

The Rule of Law Clinic launched at Yale Law School in the Spring of 2017 and in less than one year has been involved in some of the biggest cases in the country, including working on the travel ban, the transgender military ban, and filing amicus briefs on behalf of the top national security officials in the country, among many other cases. The core goal of the clinic is to maintain U.S. rule of law and human rights commitments in four areas: national security, antidiscrimination, climate change, and democracy promotion.


Meanwhile, Amy Chua appears to be the only sane, honest person at Yale Law:

In her new book, Political Tribes: Group Instinct and the Fate of Nations (Penguin, 2018), Amy Chua diagnoses the rising tribalism in America and abroad and prescribes solutions for creating unity amidst group differences.

Chua, who is the John M. Duff, Jr. Professor of Law, begins Political Tribes with a simple observation: “Humans are tribal.” But tribalism, Chua explains, encompasses not only an innate desire for belonging but also a vehement and sometimes violent “instinct to exclude.” Some groups organize for noble purposes, others because of a common enemy. In Chua’s assessment, the United States, in both foreign and domestic policies, has failed to fully understand the importance of these powerful bonds of group identity.

Unlike the students using their one-in-a-million chance at a Yale Law degree to help members of a different tribe for short-term gain, Amy Chua at least understands politics. I might not enjoy Chua’s company if I met her, but I respect her honesty and clear-sightedness.


On a final note, Professor Tyler has a new book, also about children and law, Why Children Follow Rules: Legal Socialization and the Development of Legitimacy. (Apparently the publishers decided to stiff the cover artist.) From the Amazon blurb:

Why Children Follow Rules focuses upon legal socialization, outlining what is known about the process across three related, but distinct, contexts: the family, the school, and the juvenile justice system. Throughout, Tom Tyler and Rick Trinkner emphasize the degree to which individuals develop their orientations toward law and legal authority based upon values connected to responsibility and obligation as opposed to fear of punishment. They argue that authorities can act in ways that internalize legal values and promote supportive attitudes. In particular, consensual legal authority is linked to three issues: how authorities make decisions, how they treat people, and whether they recognize the boundaries of their authority. When individuals experience authority that is fair, respectful, and aware of the limits of power, they are more likely to consent and follow directives.

Despite clear evidence showing the benefits of consensual authority, strong pressures and popular support for the exercise of authority based on dominance and force persist in America’s families, schools, and within the juvenile justice system. As the currently low levels of public trust and confidence in the police, the courts, and the law undermine the effectiveness of our legal system, Tom Tyler and Rick Trinkner point to alternative ways to foster the popular legitimacy of the law in an era of mistrust.

Speaking as a parent… I understand where Tyler is coming from. If I act in a way that doesn’t inspire my children to see me as a fair, god-like arbiter of justice, then they are more likely to see me as an unjust tyrant who should be disobeyed and overthrown.

On the other hand, sometimes things are against the rules for reasons kids don’t understand. One of my kids, when he was little, thought turning the dishwasher off was the funniest thing and would laugh all the way through time-out. Easy solution: I didn’t turn it on when he was in the room, and he forgot about it. Tougher problem: one of the kids thought climbing on the stove to get to the microwave was a good idea. Time-outs didn’t work. Explaining “the stove is hot sometimes” didn’t work. Only force solved this problem.

Some people will accept your authority. Some people can reason their way to “We should cooperate and respect the social contract so we can live in peace.” And some people DON’T CARE no matter what.

So I agree that police, courts, etc., should act justly and not abuse their powers, and I can pull up plenty of examples of cases where they did abuse them. But I am afraid this is not a complete framework for dealing with criminals and legal socialization.

Re Nichols: Times the Experts were Wrong, pt 3/3

Welcome to our final post of “Times the Experts were Wrong,” written in preparation for our review of The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters. Professor Nichols, if you ever happen to read this, I hope it gives you some insight into where we, the common people, are coming from. If you don’t happen to read it, it still gives me a baseline before reading your book. (Please see part 1 for a discussion of relevant definitions.)

Part 3: Wars

WWI, Iraq, Vietnam, etc.

How many “experts” have lied to convince us to go to war? We were told we had to attack Iraq because they had weapons of mass destruction, but the promised weapons never materialized. Mother Jones (that source of all things pro-Trump) has a timeline:

November 1999: Chalabi-connected Iraqi defector “Curveball”—a convicted sex offender and low-level engineer who became the sole source for much of the case that Saddam had WMD, particularly mobile weapons labs—enters Munich seeking a German visa. German intel officers describe his information as highly suspect. US agents never debrief Curveball or perform background check. Nonetheless, Defense Intelligence Agency (DIA) and CIA will pass raw intel on to senior policymakers. …

11/6/00: Congress doubles funding for Iraqi opposition groups to more than $25 million; $18 million is earmarked for Chalabi’s Iraqi National Congress, which then pays defectors for anti-Iraq tales. …

Jan 2002: The FBI, which favors standard law enforcement interrogation practices, loses debate with CIA Director George Tenet, and Libi is transferred to CIA custody. Libi is then rendered to Egypt. “They duct-taped his mouth, cinched him up and sent him to Cairo,” an FBI agent told reporters. Under torture, Libi invents tale of Al Qaeda operatives receiving chemical weapons training from Iraq. “This is the problem with using the waterboard. They get so desperate that they begin telling you what they think you want to hear,” a CIA source later tells ABC. …

Feb 2002: DIA intelligence summary notes that Libi’s “confession” lacks details and suggests that he is most likely telling interrogators what he thinks will “retain their interest.” …

9/7/02: Bush claims a new UN International Atomic Energy Agency (IAEA) report states Iraq is six months from developing a nuclear weapon. There is no such report. …

9/8/02: Page 1 Times story by Judith Miller and Michael Gordon cites anonymous administration officials saying Saddam has repeatedly tried to acquire aluminum tubes “specially designed” to enrich uranium. …

Tubes “are only really suited for nuclear weapons programs…we don’t want the smoking gun to be a mushroom cloud.”—Rice on CNN …

“We do know, with absolute certainty, that he is using his procurement system to acquire the equipment he needs in order to enrich uranium to build a nuclear weapon.”—Cheney on Meet the Press

Oct 2002: National Intelligence Estimate produced. It warns that Iraq “is reconstituting its nuclear program” and “has now established large-scale, redundant and concealed BW agent production capabilities”—an assessment based largely on Curveball’s statements. But NIE also notes that the State Department has assigned “low confidence” to the notion of “whether in desperation Saddam would share chemical or biological weapons with Al Qaeda.” Cites State Department experts who concluded that “the tubes are not intended for use in Iraq’s nuclear weapons program.” Also says “claims of Iraqi pursuit of natural uranium in Africa” are “highly dubious.” Only six senators bother to read all 92 pages. …

10/4/02: Asked by Sen. Graham to make gist of NIE public, Tenet produces 25-page document titled “Iraq’s Weapons of Mass Destruction Programs.” It says Saddam has them and omits dissenting views contained in the classified NIE. …

2/5/03: In UN speech, Powell says, “Every statement I make today is backed up by sources, solid sources. These are not assertions. What we’re giving you are facts and conclusions based on solid intelligence.” Cites Libi’s claims and Curveball’s “eyewitness” accounts of mobile weapons labs. (German officer who supervised Curveball’s handler will later recall thinking, “Mein Gott!”) Powell also claims that Saddam’s son Qusay has ordered WMD removed from palace complexes; that key WMD files are being driven around Iraq by intelligence agents; that bioweapons warheads have been hidden in palm groves; that a water truck at an Iraqi military installation is a “decontamination vehicle” for chemical weapons; that Iraq has drones it can use for bioweapons attacks; and that WMD experts have been corralled into one of Saddam’s guest houses. All but the last of those claims had been flagged by the State Department’s own intelligence unit as “WEAK.”

I’m not going to quote the whole article, so if you’re fuzzy on the details, go read the whole darn thing.

If you had access to the actual documents from the CIA, DIA, British intelligence, interrogators, etc., you could have figured out that the “experts” were not unanimously behind the idea that Iraq was developing WMDs, but we mere plebes were dependent on what the government, Fox, and CNN told us the “experts” believed.

For the record, I was against the Iraq War from the beginning. I’m not sure what Nichols’s original position was, but in Just War, Not Prevention (2003) Nichols argued:

More to the point, Iraq itself long ago provided ample justifications for the United States and its allies to go to war that have nothing to do with prevention and everything to do with justice. To say that Saddam’s grasping for weapons of mass destruction is the final straw, and that it is utterly intolerable to allow Saddam or anyone like him to gain a nuclear weapon, is true but does not then invalidate every other reason for war by subsuming them under some sort of putative ban on prevention.

The record provides ample evidence of the justice of a war against Saddam Hussein’s regime. Iraq has shown itself to be a serial aggressor… a supreme enemy of human rights that has already used weapons of mass destruction against civilians, a consistent violator of both UN resolutions and the terms of the 1991 cease-fire treaty … a terrorist entity that has attempted to reach beyond its own borders to support and engage in illegal activities that have included the attempted assassination of a former U.S. president; and most important, a state that has relentlessly sought nuclear arms against all international demands that it cease such efforts.

Any one of these would be sufficient cause to remove Saddam and his regime … but taken together they are a brief for what can only be considered a just war. …

Those concerned that the United States is about to revise the international status quo might consider that Western inaction will allow the status quo to be revised in any case, only under the gun of a dictator commanding an arsenal of the most deadly materials on earth. These are the two alternatives, and sadly, there is no third choice.

Professor Nichols, I would like to pause here.

First: you think Trump is bad, yet you supported the president under whom POWs were literally tortured, and you call yourself a military ethicist?

Second: you, an expert, bought into this “WMD” story (invented primarily by “Curveball,” an unreliable source), while I, a mere plebe, knew it was a load of garbage.

Third: I agree Saddam Hussein killed a hell of a lot of people–according to Wikipedia, Human Rights Watch estimates that a quarter of a million Iraqis were killed or “disappeared” in the last 25 years of Ba’th party rule. But the nine years of the Iraq War killed 150,000 to 460,000 people (depending on which survey you trust), and based on estimates from the Iraq Body Count, a further 100,000 have died since then. Meanwhile, instability in Iraq allowed the horrifically violent ISIS to sprout into existence. I Am Syria (I don’t know if they are reliable) estimates that over half a million Syrians have died so far because of the ISIS-fueled civil war rampaging there.

In other words, we unleashed a force that is twice as bad as Saddam in less than half the time–and paid a lovely 2.4 TRILLION dollars to accomplish this humanitarian feat! For that much money you could have just evacuated all of the Kurds and built them their own private islands to live on. You could have handed out $90,000 to every man, woman, and child in Iraq in exchange for “being friends with the US” and still had $150 BILLION left over to invest in things like “cancer treatments for children” and “high-speed rail infrastructure.”

Seriously, you could have spent the entire 2.4 trillion on hookers and blow and we would have still come out ahead.
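For what it’s worth, the back-of-the-envelope arithmetic above does roughly check out. Here is a minimal sketch, assuming Iraq’s population in 2003 was about 25 million (the population figure is my assumption, not the text’s):

```python
# Sanity check of the $2.4 trillion comparison above.
# Assumption (mine, not the text's): Iraq's 2003 population was ~25 million.
war_cost = 2_400_000_000_000   # commonly cited cost estimate of the Iraq War, USD
population = 25_000_000        # assumed population of Iraq in 2003
payout = 90_000                # per-person payment proposed in the text

spent = population * payout
print(f"Paid out:  ${spent:>17,}")             # $2,250,000,000,000
print(f"Left over: ${war_cost - spent:>17,}")  # $  150,000,000,000
```

So a $90,000-per-person payout would indeed leave roughly $150 billion of the $2.4 trillion unspent.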

Back in 2015, you tried to advise the Republican frontrunners on how to answer questions about the Iraq War:

First, let’s just stipulate that the question is unfair.

It’s asking a group of candidates to re-enact a presidential order given 12 years ago, while Hillary Clinton isn’t even being asked about decisions in which she took part, much less about her husband’s many military actions. …

Instead, Republican candidates should change the debate. Leadership is not about what people would do with perfect information; it’s about what people do when faced with danger and uncertainty. So here’s an answer that every Republican, from Paul to Bush, could give:

“Knowing exactly what we know now, I would not have invaded when we did or in the way we did. But I do not regret that we deposed a dangerous maniac like Saddam Hussein, and I know the world is better for it. What I or George Bush or anyone else would have done with better information is irrelevant now, because the next president has to face the world as it is, not as we would like to imagine it. And that’s all I intend to say about second-guessing a tough foreign-policy decision from 12 years ago, especially since we should have more pressing questions about foreign policy for Hillary Clinton that are a lot more recent than that.”

While I agree that Hillary should have been questioned about her own military decisions, Iraq was a war formally authorized by Congress, one that the entire Republican establishment, think tanks, newspapers, and experts like you supported. They did such a convincing job of selling the war that even most of the Democratic establishment got on board, though never quite as enthusiastically.

By contrast, there was never any real Democratic consensus on whether Obama should remove troops or increase troops, on whether Hillary should do this or that in Libya. Obama and Hillary might have hideously bungled things, but there was never enthusiastic, party-wide support for their policies.

This makes it very easy for any Dem to distance themselves from previous Dem policies: “Yeah, looks like that was a big whoopsie. Luckily half our party knew that at the time.”

But for better or worse, the Republicans–especially the Bushes–own the Iraq War.

The big problem here is not that the Republican candidates (aside from Trump and Rand Paul) were too dumb to come up with a good response to the question (though that certainly is a problem). The real problem is that none of them had actually stopped to take a long, serious look at the Iraq War, ask whether it was a good idea, and then apologize.

The Iraq War deeply discredited the Republican party.

Ask yourself: What did Bush conserve? What have I conserved? Surely being a “conservative” means you want to conserve something, so what was it? Iraqi freedom? Certainly not. Mid East stability? Nope. American lives? No. American tax dollars? Definitely not.

The complete failure of the Republicans to do anything good while squandering 2.4 trillion dollars and thousands of American lives is what triggered the creation of the “alt” right and set the stage for someone like Trump–someone willing to make a formal break with past Republican policies on Iraq–to rise to power.

Iraq I, the prequel:

But Iraq wasn’t the first war we were deceived into fighting–remember the previous war in Iraq, the one with the other President Bush? The one where we were motivated to intervene over stories of poor Kuwaiti babies ripped from their incubators by cruel Iraqis?

The Nayirah testimony was a false testimony given before the Congressional Human Rights Caucus on October 10, 1990 by a 15-year-old girl who provided only her first name, Nayirah. The testimony was widely publicized, and was cited numerous times by United States senators and President George H. W. Bush in their rationale to back Kuwait in the Gulf War. In 1992, it was revealed that Nayirah’s last name was al-Ṣabaḥ (Arabic: نيره الصباح‎) and that she was the daughter of Saud Al-Sabah, the Kuwaiti ambassador to the United States. Furthermore, it was revealed that her testimony was organized as part of the Citizens for a Free Kuwait public relations campaign which was run by an American public relations firm Hill & Knowlton for the Kuwaiti government. Following this, al-Sabah’s testimony has come to be regarded as a classic example of modern atrocity propaganda.[1][2]

In her emotional testimony, Nayirah stated that after the Iraqi invasion of Kuwait she had witnessed Iraqi soldiers take babies out of incubators in a Kuwaiti hospital, take the incubators, and leave the babies to die.

Her story was initially corroborated by Amnesty International[3] and testimony from evacuees. Following the liberation of Kuwait, reporters were given access to the country. An ABC report found that “patients, including premature babies, did die, when many of Kuwait’s nurses and doctors… fled” but Iraqi troops “almost certainly had not stolen hospital incubators and left hundreds of Kuwaiti babies to die.”[4][5]

Kuwaiti babies died because Kuwaiti doctors and nurses abandoned them. Maybe the “experts” at the UN and in the US government should vet their sources a little better (like actually find out their last names) before starting wars based on the testimony of children?

Vietnam:

And then there was Vietnam. Cold War “experts” were certain it was very important for us to spend billions of dollars in the 1950s to prop up the French colony in Indochina. When the French gave up, fighting the war somehow became America’s problem. The Cold War doctrine of the “Domino Theory” held that the loss of even one obscure, third-world country to Communism would unleash an unstoppable chain-reaction of global Soviet conquest, and thus the only way to preserve democracy anywhere in the world was to oppose communism wherever it emerged.

Of course, one could not be a Cold War “expert” in 1955, as we had never fought a Cold War before. This bipolar world, led by a nuclear-armed communist faction on one side and a nuclear-armed democratic faction on the other, was entirely new.

On top of the difficulty of functioning within an entirely novel balance of powers (and weapons), almost no one in America spoke Vietnamese in 1955 (and almost no one in Vietnam spoke English). We couldn’t even ask the Vietnamese what they thought. At best, we could play a game of telephone via Vietnamese who spoke French and translators who spoke French and English, but the Vietnamese who had learned the language of their colonizers were not a representative sample of average citizens.

In other words, we had no idea what we were getting into.

I lost family in Vietnam, so maybe I take this a little personally, but I don’t think American soldiers exist just to enrich Halliburton or protect French colonial interests. And you must excuse me, but I think you “experts” clamoring for war have an extremely bad track record–one that involves people in my family getting killed.

While we are at it, what is the expert consensus on Russiagate?

Well, Tablet Mag thinks it’s hogwash:

At the same time, there is a growing consensus among reporters and thinkers on the left and right—especially those who know anything about Russia, the surveillance apparatus, and intelligence bureaucracy—that the Russiagate-collusion theory that was supposed to end Trump’s presidency within six months has sprung more than a few holes. Worse, it has proved to be a cover for U.S. intelligence and law-enforcement bureaucracies to break the law, with what’s left of the press gleefully going along for the ride. Where Watergate was a story about a crime that came to define an entire generation’s oppositional attitude toward politicians and the country’s elite, Russiagate, they argue, has proved itself to be the reverse: It is a device that the American elite is using to define itself against its enemies—the rest of the country.

Yet for its advocates, the questionable veracity of the Russiagate story seems much less important than what has become its real purpose—elite virtue-signaling. Buy into a storyline that turns FBI and CIA bureaucrats and their hand-puppets in the press into heroes while legitimizing the use of a vast surveillance apparatus for partisan purposes, and you’re in. Dissent, and you’re out, or worse—you’re defending Trump.

“Russia done it, all the experts say so” sounds suspiciously like a great many other times “expert opinion” has been manipulated by the government, industry, or media to make it sound like expert consensus exists where it does not.

Let’s look at a couple of worst-case scenarios:

  1. Nichols and his ilk are right, but we ignore his warnings, overlook a few dastardly Russian deeds, and don’t go to war with Russia.
  2. Nichols is wrong, but we trust him, blame Russia for things it didn’t do, and go to war with a nuclear superpower.

But let’s look at our final fail:

Failure to predict the fall of the Soviet Union

This one is kind of ironic, given that Nichols is a Sovietologist, but one of the continuing questions in Political Science is “Why didn’t political scientists predict the fall of the Soviet Union?”

In retrospect, of course, we can point to the state of the Soviet economy, or glasnost, or growing unrest and dissent among Soviet citizens, but as Foreign Policy puts it:

In the years leading up to 1991, virtually no Western expert, scholar, official, or politician foresaw the impending collapse of the Soviet Union, and with it one-party dictatorship, the state-owned economy, and the Kremlin’s control over its domestic and Eastern European empires. …

Whence such strangely universal shortsightedness? The failure of Western experts to anticipate the Soviet Union’s collapse may in part be attributed to a sort of historical revisionism — call it anti-anti-communism — that tended to exaggerate the Soviet regime’s stability and legitimacy. Yet others who could hardly be considered soft on communism were just as puzzled by its demise. One of the architects of the U.S. strategy in the Cold War, George Kennan, wrote that, in reviewing the entire “history of international affairs in the modern era,” he found it “hard to think of any event more strange and startling, and at first glance inexplicable, than the sudden and total disintegration and disappearance … of the great power known successively as the Russian Empire and then the Soviet Union.”

I don’t think this is Political Science’s fault–even the Soviets don’t seem to have really seen it coming. Some things are just hard to predict.

Sometimes we overestimate our judgment. We leap before we look. We think there’s evidence where there isn’t or that the evidence is much stronger than it is.

And in the cases I’ve selected, maybe I’m the one who’s wrong. Maybe Vietnam was a worthwhile conflict, even if it was terrible for everyone involved. Maybe the Iraq War served a real purpose.

WWI was still a complete disaster. There is no logic by which that war makes any sense at all.

When you advocate for war, step back a moment and ask how sure you are. If you were going to be the cannon fodder down on the front lines, would you still be so sure? Or would you be the one suddenly questioning the experts about whether this was really such a good idea?

Professor Nichols, if you have read this, I hope it has given you some food for thought.

Cathedral Round-Up: Should I read Nichols or Pinker?

Harvard Mag had interesting interviews/reviews of both Tom Nichols’s “Death of Expertise” and Steven Pinker’s “Enlightenment Now”.

From the article about Nichols:

Several years ago, Tom Nichols started writing a book about ignorance and unreason in American public discourse—and then he watched it come to life all around him, in ways starker than he had imagined. A political scientist who has taught for more than a decade in the Harvard Extension School, he had begun noticing what he perceived as a new and accelerating—and dangerous—hostility toward established knowledge. People were no longer merely uninformed, Nichols says, but “aggressively wrong” and unwilling to learn. They actively resisted facts that might alter their preexisting beliefs. They insisted that all opinions, however uninformed, be treated as equally serious. And they rejected professional know-how, he says, with such anger. That shook him.

Skepticism toward intellectual authority is bone-deep in the American character, as much a part of the nation’s origin story as the founders’ Enlightenment principles. Overall, that skepticism is a healthy impulse, Nichols believes. But what he was observing was something else, something malignant and deliberate, a collapse of functional citizenship.

What are people aggressively wrong about, and what does he think is causing the collapse of functional citizenship?

The Death of Expertise resonated deeply with readers. … Readers regularly approach Nichols with stories of their own disregarded expertise: doctors, lawyers, plumbers, electricians who’ve gotten used to being second-guessed by customers and clients and patients who know little or nothing about their work. “So many people over the past year have walked up to me and said, ‘You wrote what I was thinking,’” he says.

Sounds like everyone’s getting mansplained these days.

The Death of Expertise began as a cri de coeur on his now-defunct blog in late 2013. This was during the Edward Snowden revelations, which to Nichols’s eye, and that of other intelligence experts, looked unmistakably like a Russian operation. “I was trying to tell people, ‘Look, trust me, I’m a Russia guy; there’s a Russian hand behind this.’ ” But he found more arguments than takers. “Young people wanted to believe Snowden was a hero.”

I don’t have a particular opinion on Snowden because I haven’t studied the issue, but let’s pretend you were in the USSR and one day a guy in the government spilled a bunch of secrets about how many people Stalin was having shot and how many millions were starving to death in the Holodomor (the Ukrainian genocide). (Suppose also that the media were sufficiently free to allow the stories to spread.)

Immediately you’d have two camps: the “This guy is a capitalist spy sent to discredit our dear leader with a hideous smear campaign” camp and the “This guy is totally legit; the people need to know!” camp.

Do you see why “Snowden is a Russian” sounds like the government desperately trying to cover its ass?

Now let’s suppose the guy who exposed Stalin actually was a capitalist spy. Maybe he really did hate communism and wanted to bring down the USSR. Would it matter? As long as the stuff he said was true, would you want to know anyway? I know that if I found out about the Holodomor, I wouldn’t care about the identity of the guy who released the information, beyond calling him a hero.

I think a lot of Trump supporters feel similarly about Trump. They don’t actually care whether Russia helped Trump or not; they think Trump is helping them, and that’s what they care about.

In other words, it’s not so much “I don’t believe you” as “I have other priorities.”

In December, at a JFK Library event on reality and truth in public discourse, a moderator asked him a version of “How does this end?” … “In the longer term, I’m worried about the end of the republic,” he answered. Immense cynicism among the voting public—incited in part by the White House—combined with “staggering” ignorance, he said, is incredibly dangerous. In that environment, anything is possible. “When people have almost no political literacy, you cannot sustain the practices that sustain a democratic republic.” The next day, sitting in front of his fireplace in Rhode Island, where he lives with his wife, Lynn, and daughter, Hope, he added, “We’re in a very perilous place right now.”

Staggering ignorance about what, I wonder. Given our increased access to information, I suspect that the average person today both knows and can easily find the answers to far more questions than the average person of the 80s, 50s, or 1800s.

I mean, in the 80s, we still had significant numbers of people who believed in: faith healing; televangelists; six-day creationism; “pyramid power”; crop circles; ESP; UFOs; astrology; multiple personality disorder; a global Satanic daycare conspiracy; recovered memories; Freudianism; and the economic viability of the USSR. (People today still believe in the last one.)

On the one hand, I think part of what Nichols is feeling is just the old distrust of experts projected onto the internet. People used to harass their local school boards about teaching ‘evilution’; today they harass each other on Twitter over Benghazi or birtherism or Russia collusion or whatever the latest thing is.

We could, of course, see a general decline in intellectual abilities if the US population is drawn increasingly from low-IQ backgrounds and low-IQ people (appear to) outbreed high-IQ ones. But I have yet to see evidence that this has had time to manifest as a change in the amount of general knowledge people can use and display, especially given how much easier it now is to actually access knowledge. I am tempted to think that the internet simply forced Nichols outside his Harvard bubble, and he encountered dumb people for the first time in his life.

On the other hand, however, I do feel a definite sense of malaise in America. It’s not about IQ, but how we feel about each other. We don’t seem to like each other very much. We don’t trust each other. Trust in government is low. Trust in each other is low. People have fewer close friends and confidants.

We have material prosperity, yes, despite our economic woes, but there is a spiritual rot.

Both sides are recognizing this, but the left doesn’t understand what is causing it.

They can point at Trump. They can point at angry hordes of Trump voters. “Something has changed,” they say. “The voters don’t trust us anymore.” But they don’t know why.

Here’s what I think happened:

The myth that is “America” got broken.

A country isn’t just a set of laws with a tract of land. It can be that, but if so, it won’t command a lot of sentimental feeling. You don’t die to defend a “set of laws.” A country needs a people.

“People” can be a lot of things. They don’t have to be racially homogenous. “Jews” are a people, and they are not racially homogenous. “Turks” are a people, and they are not genetically homogenous. But fundamentally, people have to see themselves as “a people” with a common culture and identity.

America has two main historical groups: whites and blacks. Before the mass immigration kicked off in 1965, whites were about 88% of the country and blacks were about 10%. Indians, Asians, Hispanics, and everyone else rounded out that last 2%. And say what you will, but whites thought of themselves as the American culture, because they were the majority.

America absorbed newcomers. People came, got married, had children: their children became Americans. The process takes time, but it works.

Today, though, “America” is fractured. It is ethnically fractured: California and Texas, for example, are now majority non-white. There is nothing particularly wrong with the folks who’ve moved in; they just aren’t from one of America’s two main historical ethnic groups. They are their own groups, with their own histories. (England is a place with a people and a history; Turkey is a place with a people and a history. They are two different places with different peoples and different histories.) It is religiously fractured: far fewer people belong to one of America’s historically prominent religions. And it is politically fractured: more people now report being uncomfortable with their child dating a member of the opposite political party than a member of a different race.

Now we see things like this: After final vote, city will remove racist Pioneer Monument Statue:

As anticipated, the San Francisco Arts Commission voted unanimously Monday to remove the “Early Days” statue from Civic Center’s Pioneer Monument, placing the century-plus old bronze figures in storage until a long-term decision about their fate can be made.

The decision caps off a six-month long debate, after some San Franciscans approached the commission in August 2017 to complain about the statue, which features a pious but patronizing scene of a Spanish missionary helping a beaten Indian to his feet and pointing him toward heaven.

In February the city’s Historic Preservation Commission voted unanimously to recommend removing “Early Days” despite some commissioners expressing reservations about whether the sculpture has additional value as an exposé of 19th-century racism.

Your statues are racist. Your history is racist. Your people is racist.

What do they think the reaction to this will look like?

 

But before we get too dark, let’s take a look at Pinker’s latest work, Enlightenment Now:

It is not intuitive that a case needs to be made for “Reason, Science, Humanism, and Progress,” stable values that have long defined our modernity. And most expect any attack on those values to come from the far right: from foes of progressivism, from anti-science religious movements, from closed minds. Yet Steven Pinker argues there is a second, more profound assault on the Enlightenment’s legacy of progress, coming from within intellectual and artistic spheres: a crisis of confidence, as progress’s supporters see so many disasters, setbacks, emergencies, new wars re-opening old wounds, new structures replicating old iniquities, new destructive side-effects of progress’s best intentions. …

Pinker’s volume moves systematically through various metrics that reflect progress, charting improvements across the last half-century-plus in areas from racism, sexism, homophobia, and bullying, to car accidents, oil spills, poverty, leisure, female empowerment, and so on. …

the case Pinker seeks to make is at once so basic and so difficult that a firehose of evidence may be needed—optimism is a hard sell in this historical moment. … Pinker credits the surge in such sentiments since the 1960s to several factors. He points to certain religious trends, because a focus on the afterlife can be in tension with the project of improving this world, or caring deeply about it. He points to nationalism and other movements that subordinate goods of the individual or even goods of all to the goods of a particular group. He points to what he calls neo-Romantic forms of environmentalism, not all environmentalisms but specifically those that subordinate the human species to the ecosystem and seek a green future, not through technological advances, but through renouncing current technology and ways of living. He also points to a broader fascination with narratives of decline …

I like the way Pinker thinks and appreciate his use of actual data to support his points.

To these decades-old causes, one may add the fact that humankind’s flaws have never been so visible as in the twenty-first century. … our failures are more visible than ever through the digital media’s ceaseless and accelerating torrent of grim news and fervent calls to action, which have pushed many to emotional exhaustion. Within the last two years, though not before, numerous students have commented in my classroom that sexism/racism/inequality “is worse today than it’s ever been.” The historian’s answer, “No, it used to be much worse, let me tell you about life before 1950…,” can be disheartening, especially when students’ rage and pain are justified and real. In such situations, Pinker’s vast supply of clear, methodical data may be a better tool to reignite hope than my painful anecdotes of pre-modern life.

Maybe Nichols is on to something about people today being astoundingly ignorant…

Pinker’s celebration of science is no holds barred: he calls it an achievement surpassing the masterworks of art, music, and literature, a source of sublime beauty, health, wealth, and freedom.

I agree with Pinker on science, but Nichols’s worldview may be the one that needs plumbing.

Which book do you want me to read/review?

Cathedral Round-Up #30: HLS’s Bicentennial Class

Harvard Law Bulletin recently released a special issue commemorating HLS’s 200th anniversary:

Invocation

A Memorial to the Enslaved People Who Enabled the Founding of Harvard Law School

On a clear, windy afternoon in early September at the opening of its bicentennial observance, Harvard Law School unveiled a memorial on campus. The plaque, affixed to a large stone, reads:

In honor of the enslaved whose labor created wealth that made possible the founding of Harvard Law School

May we pursue the highest ideals of law and justice in their memory

Harvard Law School was founded in 1817, with a bequest from Isaac Royall Jr. Royall’s wealth was derived from the labor of enslaved people on a sugar plantation he owned on the island of Antigua and on farms he owned in Massachusetts.

“We have placed this memorial here, in the campus cross-roads, at the center of the school, where everyone travels, where it cannot be missed,” said HLS Dean John Manning ’85. …

Harvard University President Drew Faust… also spoke at the unveiling, which followed a lecture focused on the complicated early history of the school.

“How fitting that you should begin your bicentennial,” said Faust, “with this ceremony reminding us that the path toward justice is neither smooth nor straight.” …

Halley, holder of the Royall Professorship of Law, who has spoken frequently about the Royall legacy, read aloud the names of enslaved men, women, and children of the Royall household from records that have survived, “so that we can all share together the shock of the sheer number,” she said, “and a brief shared experience of their loss.”

“These names are the tattered, ruined remains, the accidents of recording and the encrustation of a system that sought to convert human beings into property,” she said. “But they’re our tattered remains.”

This commemorative issue also contains an interview with ImeIme Umana, Harvard Law Review’s 131st president, “How Have Harvard Scholars Shaped the Law?”:

How has legal scholarship changed since the Law Review began publishing more than a century ago?

Scholarship certainly has changed over time, and these pieces, whether or not they acknowledge it to a great extent, are consistent with the changing nature of the legal field in that they bring more voices to the table and more diverse perspectives. If you look back at our older scholarship, you’ll tend to see more traditional, doctrinal, technical pieces. Now, they’re more aspirational, more critical, and have more social commentary in them. It’s a distinction between writing on what the law is and writing on what the law should be, and asking why things are the way they are.

BTW, you can purchase the Harvard Law Review on Amazon.

What kind of scholarship do you find especially meaningful?

I’m really passionate about the state of the criminal legal system and civil rights. The cherry on top within those topics is scholarship that proposes new ways of thinking or challenges the status quo.

One of my favorite articles is [Assistant] Professor Andrew Crespo’s “Systemic Facts” [published in the June 2016 Harvard Law Review], because it does just that. The thesis is that courts are institutionally positioned to bring about systemic change, and that they can use their position to collect facts that they are institutionally privy to. It calls on them to do that such that we might learn more about how the legal system is structured.

I’ve noticed the increased emphasis on criminal law lately, especially bail reform.

The Law Review was founded 130 years ago, and now you are its president. Do you ever get caught up in thinking about the historical implications of running such a well-known and influential publication?

… Looking at it through a historical lens, the diversity of the student body and Law Review editors and authors is especially meaningful, as it makes legal institutions more inclusive, and therefore the law more inclusive. It’s important to keep pushing in that direction and never become complacent. The history is very important.

You are the first black woman who was elected to serve as president of the Law Review. Why do you think it took so long for that to happen?

I’ve thought about it a lot and I just don’t know the answer. My thought is that it just tracks the lack of inclusion of black women in legal institutions, full stop. It’s a function of that. There’s always more we can be doing to be more inclusive. The slowness of milestones like this might have a broader cause than just something specific to the Law Review.

It probably tracks closer to the inclusion of Nigerian women at Harvard than of black women generally. Umana is Nigerian American, and Nigerian Americans score significantly better on the SAT and LSAT than African Americans. (Based on average incomes, Nigerian Americans do better than white Americans, too.) So I’m going to go out on a limb and wager that significant black firsts at HLR are due to the arrival of more Nigerian and Kenyan immigrants, rather than the integration of America’s African American community.

While reading about ImeIme Umana, I noticed that American publications–such as NBC News–describe her as a “native” of Harrisburg, Pennsylvania. By contrast, Financial Nigeria proudly claims her as a “Nigerian American”:

Born to Nigerian immigrant parents originally from Akwa Ibom State in Nigeria, Umana is a resident of Harrisburg, Pennsylvania, United States. Umana graduated with a BA in Joint Concentration in African American Studies and Government from Harvard University in 2014. She is currently working on a Doctor of Law degree (Class of 2018) at the Harvard Law School.

Who is this man? HLS Class of 1926

The issue is full of fascinating older photographs with minimalist captions, because the graphic design team prefers white space over information.

For example, on page 58 is a photo of a collection of students and older men (is that Judge Learned Hand in the first row?) captioned simply 1926 and “Stepping up: by 1925, lawyers could pursue graduate degrees (LL.M.s and S.J.D.s) at HLS.”

Seated in the front row is this man. Who is he? A quick perusal of a list of famous Indians reveals only that he isn’t any of them.

There is also an Asian man seated directly behind him whose photo I’ll post below. You might think, in our diversity-obsessed age, when we track the first black editor of this and the first black female head of that, someone would be curious enough about these men to tell us their stories. Who were they? How did they get to Harvard Law?

After some searching and help from @prius_1995, I think the Indian man is Dr. Kashi Narayan Malaviya, S.J.D. HLS 1926, and the Asian man is Domingo Tiongco Zavalla, LL.M. 1927, from the Philippines. (If you are curious, here are the relevant class lists.)

I haven’t been able to find out much about Dr. Malaviya. Clearly he associated with folks in high places, as indicated by this quote from Hindu Nationalism and the Language of Politics in Late Colonial India:

In Allahabad, during a meeting attended by Uma Nehru, Hriday Nath Kunzru and Dr. Kashi Narayan Malaviya, M. K. Acharya made the link between the politics of the nation and the plight of Hinduism very clear…

(Unfortunately, it appears that he has a more famous relative named Madan Mohan Malaviya, who keeps coming up in the search results. His great-grandson is single, however, if any of you ladies are looking for a Brahmin husband.)

Domingo Tiongco Zavalla, LL.M. HLS 1927

1926 was during the period when America ruled the Philippines, so it would be sensible for Filipinos to want to learn about the American legal system and become credentialed in it. Domingo Zavalla went on to be a delegate to the Philippines’ Commonwealth Constitutional Convention (probably the 1934 Convention: “The Convention drafted the 1935 Constitution, which was the basic law of the Philippines under the American-sponsored Commonwealth of the Philippines and the post-War, sovereign Third Republic.”)

That’s about all I’ve found about Zavalla.

How quickly we fall into obscurity and are forgotten.

Cathedral Round-Up #28: They’re not coming for George Washington, that’s just a silly right-wing conspiracy–

Titus Kaphar’s Shadows of Liberty, 2016, at Yale University Art Gallery

Is that… George Washington? With rusty nails pounded into his face?

Holding up “cascading fragments of his slave records.”

Oh. I see.

Carry on, then.

I was going to write about Harvard forbidding its female students from forming female-only safe spaces (College will Debut Plans to Enforce Sanctions Next Semester) in an attempt to shut down all single-gender frats and Finals Clubs, but then Princeton upped the ante with Can Art Amend Princeton’s History of Slavery?

No. Of course not.

Princeton University [has] a new public-art project that confronts the school’s participation in the nation’s early sins. On Monday, the university unveiled Impressions of Liberty, by the African American artist Titus Kaphar. The sculpture is the conceptual core of a campus-wide initiative that begins this fall and aims to reconcile the university’s ties to slavery. The Princeton and Slavery Project’s website has released hundreds of articles and primary documents about slavery and racism at Princeton…

Attaching strips of canvas or other material to the faces of people he disapproves of is apparently one of Kaphar’s shticks.

I’m old enough to remember when George Washington was admired for freeing all of his slaves in an era when most people took slavery for granted. Today he is castigated for not having sprung from the womb with a fully modern set of moral opinions.

Impressions of Liberty, by Titus Kaphar

Impressions of Liberty is Kaphar’s portrait of Samuel Finley–fifth president and one of the original trustees of Princeton (1761-1766)–interwoven with photographs of black actors in historical dress etched in glass.

For generations, slave-owning Christians—including Princeton’s founders—used religious ideas to justify a horrific national practice, [Kaphar] noted; Finley is holding a bible in Impressions of Liberty.

Note the framing: yes, Christians used religion to justify owning slaves. So did Muslims, Jews, Hindus, Buddhists, pagans, and atheists. There’s nothing unique about Christians and slavery aside from the fact that Finley was Christian. No mention is made of pagan Africans who captured and sold each other into slavery, nor of Muslims who raided Africa and Europe in search of slaves. There were Jewish slave merchants and Confederates, as well, for slavery was a near-universal practice justified by people all over the world prior to its abolition by whites in the 1800s. The article mentions none of that; only Christians are singled out for criticism.

The article doesn’t say how much Princeton paid for the sculpture it commissioned to castigate the memory of one of its founders. The work currently stands outside MacLean House, but will soon be moved indoors, to Princeton’s permanent art collection. MacLean House–completed in 1756–is a national landmark that was home to Princeton’s first presidents, including Samuel Finley. It also housed George Washington during the Battle of Princeton.

According to the article:

On the one hand, according to records, Princeton was a bastion of liberty, educating numerous Revolutionary War leaders and in 1783 hosting the Continental Congress… At the same time, Sandweiss found that the institution’s first nine presidents all owned slaves at some point, as did the school’s early trustees. She also discovered that the school enrolled a significant number of anti-abolitionist, Southern students during its early years; an alumnus delivered a pro-slavery address at the school’s 1850 commencement ceremony. …

Princeton’s racist history enabled it to provide social and political benefits for alumni—an advantage that students will continue to enjoy well into the future.

While I happen to think that universities have it much too good these days and deserve to be taken down a notch, I find this claim extremely dubious. Harvard and Yale are located in staunchly abolitionist New England and had very few ties to slavery (Mr. Yale apparently knew a guy who had slaves, and Harvard Law School received some money from a guy who had slaves), yet these schools are arguably even wealthier and more powerful than closer-to-the-South and more-tied-to-slavery Princeton. Stanford was founded after slavery was outlawed, and yet its students enjoy social and political benefits on par with Princeton’s.

We could argue that the entire area of the Confederacy reaped the economic benefits of slavery, yet today this region is much poorer than the Free States of the North. There isn’t just no correlation between slavery, wealth, and power–there’s actually a negative correlation. Slavery, if it has any effect at all, makes a region poorer and weaker.

Monumental Inversion: George Washington (Titus Kaphar), Princeton Art Museum

… Princeton University is spreading the mission across various pieces of art through a show this fall entitled “Making History Visible: Of American Myths And National Heroes.” At the exhibit’s entrance, viewers begin with Kaphar’s piece Monumental Inversion: George Washington—a sculpture of the leader astride his horse, made out of wood, blown glass, and steel. The sculpture depicts the former president’s dueling nature: He’s glorified within a great American equestrian monument but he’s also sitting astride a charred cavity, surrounded by glass on the ground. In juxtaposing Kaphar’s artwork and a George Washington plaster bust, “Making History Visible” forces visitors, hopefully, to see and feel the contradiction in colonial leaders who sought freedom from tyranny but did not extend that ideal to slaves.

I repeat: George Washington freed all of his slaves.

We might question the point of all this. Kaphar is free to make his art, of course. His paintings display quite excellent technical skill, I admit. But why do we, as a society, feel the need to commission and display attacks on our founders? Princeton’s students could just as happily go to class each day without looking at images of Finley’s slaves; unlike Washington, Finley isn’t famous and most students were probably blissfully unaware of his slaveholding until someone decided to stick a sculpture dedicated to it on the lawn.

How do Princeton’s black students feel after walking past a sculpture depicting slaves? Uplifted? Happy? Ready to go to class and concentrate on their lectures? I doubt it. Art may be “powerful” or “open dialogues,” but no one seems to feel better after viewing such pieces.

No, I don’t see how this selective dwelling on the past improves anything.

A world in which images of your founders and heroes are defaced, their corpses judged, and rusty nails driven into their portraits: it’s like a cruel dystopia out of Lewis’s That Hideous Strength or 1984. According to Wikipedia:

During and after the October Revolution, widespread destruction of religious and secular imagery took place, as well as the destruction of imagery related to the Imperial family. The Revolution was accompanied by destruction of monuments of past tsars, as well as the destruction of imperial eagles at various locations throughout Russia. According to Christopher Wharton, “In front of a Moscow cathedral, crowds cheered as the enormous statue of Tsar Alexander III was bound with ropes and gradually beaten to the ground. After a considerable amount of time, the statue was decapitated and its remaining parts were broken into rubble”.[40]

The Soviet Union actively destroyed religious sites, including Russian Orthodox churches and Jewish cemeteries, in order to discourage religious practice and curb the activities of religious groups.

You know, they tell us, “No one is attacking George Washington; that’s just a crazy right-wing conspiracy theory,” and then they go and do it.

Incidentally, Georgetown, according to the article, “announced last year that it would grant admissions preference to descendants of slaves whose sale it profited from in the early 1800s.” How do you qualify for that? Do you have to prove that you’re descended from the specific slaves involved, or can you be descended from any American slaves? Because I had ancestors who were enslaved, too, and I’d like to get in on this racket.

In the end, the article answers its titular question:

When Impressions of Liberty is removed from Maclean House in December and enters Princeton’s permanent museum collection, its greatest achievement may lie in the realization that no apology or recompense can ever suffice. …

“No civil-rights project can ever fully redeem anything.”

Cathedral Round-Up #26: Philosophy

In sob stories about just how hard it is to be one of the most privileged people in the world, 21 Harvard and Oxford Students Share Their Experiences of Racism They Face Everyday [sic]:

I, Too, Am Harvard is a powerful photo campaign highlighting the faces and voices of black students at Harvard College. Fed up with the institutional racism they face everyday [sic], the students are speaking out against it by sharing their heartfelt stories in a series of portraits.

“Our voices often go unheard on this campus, our experiences are devalued, our presence is questioned,” they say. “This project is our way of speaking back, of claiming this campus, of standing up to say: we are here.”

The students from Oxford University have also begun a similar campaign.

Examples of nefarious racism keeping down Harvard and Oxford students include people asking if they’re listening to rap music on their headphones and jokes about Somali pirates. Several complaints also center on the fact that whites believe that blacks and Hispanics get an admissions boost due to Affirmative Action, which blacks find terribly offensive. (Of course, as the NY Times notes, Harvard has been acting affirmatively since 1971:

The university has a long and pioneering history of support for affirmative action, going back at least to when Derek Bok, appointed president of Harvard in 1971, embraced policies that became a national model.

The university has extended that ethos to many low-income students, allowing them to attend free. Harvard has argued in a Supreme Court brief that while it sets no quotas for “blacks, or of musicians, football players, physicists or Californians,” if it wants to achieve true diversity, it must pay some attention to the numbers. The university has also said that abandoning race-conscious admissions would diminish the “excellence” of a Harvard education.

This is why Harvard is now getting sued by Asians, whose excellent SAT scores result in significant admissions discrimination at top schools.)

Why do students at such elite schools indulge in such petulant whining? For that matter, why do these schools allow inanities like students yelling at faculty members about Halloween costumes (Yale, I’m looking at you)?

They say the Devil’s best trick was convincing people he doesn’t exist; perhaps the Cathedral’s best trick is convincing people that it’s oppressed. If Cathedralites are oppressed, then you can’t complain that they’re oppressing you.

Or perhaps people who get into top schools develop some form of survivor’s remorse? How do you reconcile a belief that “elitism” is bad, that intelligence isn’t genetic, that no one is “inherently” better than anyone else nor deserves to be “privileged” with the reality that you have been hand-selected to be part of a privileged, intellectual elite that enjoys opportunities we commoners can only dream of? Perhaps much of what passes for liberal signaling in college is just overcompensation for the privileges they have but can’t explicitly claim to deserve.

Over at Yale, the Philosophy Department is very concerned that too many white males are signing up for their courses:

But not all departments draw evenly across Yale’s many communities — some will be more demographically homogeneous than others, such as Yale’s Philosophy Department, which has historically been majority white and male.

Philosophy has struggled as a discipline to attract students from diverse backgrounds, and faculty and students within Yale’s Philosophy Department told the News that while the department is not as diverse as it could be in terms of racial and gender makeup or curricular offerings, ongoing efforts to remedy the problem are a cause for optimism.

Yalies seem lacking in basic numeracy: if some departments–say, African American Studies and Women’s, Gender, and Sexuality Studies–attract disproportionately high numbers of blacks and women, then there won’t be enough blacks and women left over to spread out to all of the other departments to get racial parity everywhere. Some departments, by default, will have to have more whites and men.

“[Lack of diversity] has inspired a lot of soul-searching in the discipline in recent years,” said Joanna Demaree-Cotton GRD ’21, co-coordinator of Yale’s chapter of Minorities and Philosophy which works to combat issues faced by minorities in academia. “Lots of departments, including ours at Yale, have started asking tough questions about the cause of this drop-off in the representation of women and racial minorities, and how we might go about ameliorating the problem.”

In case you’re wondering, Demaree-Cotton is a white lady. When the push comes to get some professors to give up their spots in favor of women-of-color philosophers, will Demaree-Cotton get pushed out for being white, or will she be saved because she’s female?

Unfortunately for Yale, they recently lost their only black philosophy professor, Chris Lebron, author of The Making of Black Lives Matter: The History of an Idea and The Color of Our Shame: Race and Justice In Our Time, to Johns Hopkins U.

Yale has a second problem: very few people major in philosophy, period. For example, in 2016, only 20 undergrads received degrees in philosophy or philosophy of mathematics (the data are not broken down by department). Of these, 13 were male and 7 were female. These are the kind of numbers that let you write hand-wringing articles about how “philosophy is only 35% female!” when we are actually talking about a 6-person gap. The recent “drop-off” in women and racial minorities, therefore, is likely just random chance.
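To make the small-numbers point concrete, here is a quick sketch (the 13/7 split comes from the Yale figures above; the rest is plain arithmetic):

```python
# With only 20 majors, the headline percentage is extremely noisy.
total, women = 20, 7
print(f"Women: {women}/{total} = {women / total:.0%}")    # 35%
# Parity (10 of 20) is only three students away:
print(f"Students short of parity: {total // 2 - women}")  # 3
# Each individual student swings the percentage by five points:
print(f"Swing per student: {1 / total:.0%}")              # 5%
```

With samples this small, a single year’s “trend” is well within the range of ordinary year-to-year noise.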

“For example, although over the last four years women have represented less than 25 percent of applicants to our Ph.D. program, they represent about 40 percent of students currently in our program,” Darwall said.

Sounds like Yale is actually giving women preferential treatment, just not enough of it to make up for the lack of female applicants.
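Here’s the arithmetic behind that hunch, as a sketch; it assumes men and women accept Yale’s offers at similar rates, which the article doesn’t tell us:

```python
# Back-of-envelope: if women are <25% of applicants but ~40% of
# enrolled students, how do the admission rates compare?
# Assumes equal yield for men and women -- an assumption, since
# the article gives no yield data.
women_applicant_share = 0.25  # "less than 25 percent of applicants"
women_student_share = 0.40    # "about 40 percent of students currently in our program"

women_ratio = women_student_share / women_applicant_share            # 1.6
men_ratio = (1 - women_student_share) / (1 - women_applicant_share)  # 0.8

print(f"Women get in at roughly {women_ratio / men_ratio:.1f}x the rate of men.")
```

Under those assumptions, a female applicant is admitted at about twice the rate of a male applicant; different yield rates would shift the figure, but the preferential-treatment reading fits the numbers as given.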

Deputy Dean for Diversity and Faculty Development Kathryn Lofton said that Yale is working hard to “rethink diversity” across the University…

Each academic department and program, not just philosophy, must engage with this topic, Lofton added.

“The protests in the fall of 2015 showed that our students believe we have work yet to do to achieve this ambition,” she said. “The University has responded to their call with a strong strategic vision. But this work takes time to accomplish.”

Kathryn Lofton is also a white woman. Besides lecturing people about the importance of the Halloween costume protests, she is a professor of Religious Studies, American Studies, History, and Divinity, and chair of the Religious Studies department. Her faculty page describes her work:

Kathryn Lofton is a historian of religion who has written extensively about capitalism, celebrity, sexuality, and the concept of the secular. … Her first book, Oprah: The Gospel of an Icon (2011), used the example of Oprah Winfrey’s multimedia productions to evaluate the material strategies of contemporary spirituality. Her forthcoming book, Consuming Religion, offers a profile of religion and its relationship to consumption and includes analysis of many subjects, including office cubicles, binge viewing, the family Kardashian, and the Goldman Sachs Group. Her next book-length study will consider the religions of American singer-songwriter Bob Dylan.

Apparently writing about Oprah and the Kardashians now qualifies you to be a Yale professor.

Back to the Philosophy Department:

For instance, [Jocelyn] Wang said that many undergraduate introductory philosophy classes are history-based and focused on “dead white men,” which is not necessarily as accessible to students from varied backgrounds.

“I think the way that the undergraduate philosophy curriculum is structured contributes partially to the demographic composition of the major,” Wang said.

In other words, Miss Wang thinks that Yale’s black and Hispanic students are too dumb to read Socrates and Kant.

This is really a bullshit argument, if you will pardon my language. Any student who has been accepted to Yale is smart (and well-educated) enough to “access” Socrates. These are Yalies, not bright but underprivileged kids from the ’hood. If language is an issue for Yale’s foreign exchange students, all of the standard philosophy texts can be found in translation (most of them weren’t written in English in the first place). And if you struggle with English, you might not want to attend a university where English is the primary language to start with.

However, if we interpret “accessible” in Miss Wang’s statement as a euphemism for “interesting,” we may have a reasonable claim: perhaps interest in historical figures really is tribal, with whites more interested in white philosophers and blacks more interested in black philosophers. Your average “Philosophy 101” course is likely to cover the most important philosophers of the Western tradition, because these courses were originally designed by Westerners who wanted to discuss their own philosophical tradition. Now, in order to attract non-Westerners, departments are being told to discard that tradition in favor of others.

Now, I don’t see anything wrong with incorporating non-western philosophies if they have something interesting to say. My own Philosophy 101 course covered (IIRC) Plato, Aristotle, Kant, Mill, John Rawls, the Bhagavad Gita, Taoism, and Confucianism. I enjoyed this course, and never found myself thinking, “Gee, I just can’t access this Confucius. This course would be a lot more accessible without all of the brown guys in it.”

Look, some people like talking about what a “chair” is and what “is” is, and some people like talking about police brutality against black bodies. If one group wants to get together in the Philosophy department and talk about chairs while the other heads over to the African American Studies department, that’s fine. We don’t all have to hang out in the same place and talk about the same stuff. Sometimes you just have to accept that the thing you’re interested in isn’t 100% interesting to everyone else on the planet. Some subjects are more interesting to women, some to men; some to whites, blacks, or Asians, married or single people, young or old, city or country dwellers, etc. We are allowed to be different.

But if different people have different interests, then the only way to draw people with different interests into the philosophy department is to change the department itself to cater to those interests, interests that are already better served by a different department. If you turn Philosophy into Gender + Race Studies, you’ve just excluded all of the people who were attracted to it in the first place because they wanted to study philosophy.

Speaking of which:

[Wang] added that in her experience, students can create a “culture of intimidation” by making references to philosophers without explaining them, thus setting up a barrier for people who are not familiar with that background.

I know some people can be cliquish, reveling in overly abstruse language that they use to make themselves sound smart and to shut others out of their exclusive intellectual club. Academic publications are FILLED with such writing, and it’s awful.

But Miss Wang is criticizing what amount to private conversations between other students for being insufficiently transparent to outsiders, which rubs me the wrong way. Every field has some specialized knowledge and vocabulary that experienced members will know better than newcomers. Two bikers talking about their motorcycles wouldn’t make much sense to me. Neither would two engineers discussing an engineering project. And I had to look up a lot of Jewish vocabulary words like “Gemara” before I could write that post on the Talmud. Balancing the amount of explanation an expert needs against the amount a newcomer needs, without knowing how much the newcomer already knows or whether you’re coming across as condescending, simplistic, or “mansplaining,” can be very tricky.

This is something I worry about in real life when talking to people I don’t know very well, so I’m sensitive about it.

Still, [Rita Wang–a different student with the same last name] noted several classes that delved into questions of race and gender, such as a class on American philosophy that included the writings of Martin Luther King, Jr. and another course on G.W.F. Hegel that discussed his interpretation of the Haitian Revolution.

I was curious about Hegel’s interpretation of the Haitian Revolution. A quick search for “Hegel Haiti” brings me to Susan Buck-Morss’s Hegel, Haiti and Universal History. According to The Marx & Philosophy Review of Books:

The premise of Susan Buck-Morss’s Hegel, Haiti and Universal History is the arresting claim that Hegel’s renowned ‘master-slave dialectic’ was directly inspired by the contemporaneous Haitian Revolution. Commencing with a slave uprising on the French colony of Saint Domingue in 1791, the victorious former slaves declared Haiti’s independence from Napoleon’s France in 1804, three years before Hegel published his Phenomenology of Spirit, which contained the earliest published (and still the best known) rendition of the master-slave dialectic. …

However, if Buck-Morss is right to claim that Hegel was alluding to the Haitian Revolution when writing his master-slave dialectic, then Hegel’s seemingly callow optimism was not mere fancy but drew directly on lived historical experience: the achievement of Haitian slaves not only in overthrowing a savage and comprehensive tyranny but also in establishing their own modern state. Buck-Morss only hints at this possibility, however. Her aim, she says, is different: she wants to ensure that the great German philosopher is forever linked to the greatest of Caribbean revolutions (16)….

[Photo: a bridge made of trash, Haiti]

Ah, yes, Haiti, such a great revolution! And such a great country! Say, how have things been in Haiti since the revolution?

Dessalines was proclaimed “Emperor for Life” by his troops.[63] …Once in power, he ordered the massacre of most whites. … In the continuing competition for power, he was assassinated by rivals on 17 October 1806.[66]

The revolution led to a wave of emigration.[70] In 1809, nearly 10,000 refugees from Saint-Domingue settled en masse in New Orleans.[71] They doubled the city’s population. …

Haitian politics have been contentious: since independence, Haiti has suffered 32 coups.[134]

Cité Soleil in Port-au-Prince, one of the biggest slums in the Northern Hemisphere, has been called “the most dangerous place on Earth” by the United Nations.[138]

Haiti has consistently ranked among the most corrupt countries in the world on the Corruption Perceptions Index.[159] It is estimated that President “Baby Doc” Duvalier, his wife Michelle, and their agents stole US $504 million from the country’s treasury between 1971 and 1986.[160]

Haiti’s purchasing power parity GDP fell 8% in 2010 (from US$12.15 billion to US$11.18 billion) and the GDP per capita remained unchanged at PPP US$1,200.[2] … Haiti is one of the world’s poorest countries and the poorest in the Americas region, with poverty, corruption, poor infrastructure, lack of health care and lack of education cited as the main sources. …

Meanwhile, Haiti’s population has steadily increased from 4 million (in 1961) to 10 million (in 2003).

Sounds great. Who wouldn’t embrace a revolution that brought such peace, prosperity, and well-being to its people?

Interestingly, Wikipedia notes that Susan Buck-Morss is a member of the Frankfurt School. Wikipedia also notes that the idea that the Frankfurt School is a bunch of Marxists–or “Cultural Marxists”–is just a “conspiracy theory.” Clearly there is nothing Marxist about the Frankfurt School.

But back to Yale:

This year, new faculty members who have joined the department will teach courses that diversify the curriculum, Gendler said. Philosophy professor Robin Dembroff, who is genderqueer, is teaching a social ontology course next semester that focuses on questions surrounding social construction and the nature of social categories.

Here’s an excerpt from Dembroff’s PhD dissertation summary (Princeton):

Many important social debates concern who should count as belonging to various social categories. Who should count as black? …as a woman? …as married? …it is widely assumed that these questions turn on metaphysical analyses of what makes someone black, a woman, and so on. That is, it is assumed that we should count someone as (e.g.) a woman just in case they satisfy sufficient conditions for having the property ‘woman’. My dissertation argues that this assumption is wrong: whether someone should count as a woman turns not on whether they satisfy the correct metaphysical analysis of what it is to be a woman, but on ethical considerations about how we ought to treat each other. …

While researching this post, I also came across what I think is Dembroff’s old Myspace account. Looking back at our teenage selves can be ridiculous and often embarrassing, but the teenage Dembroff seemed a much realer, more relatable human than the current one, who is trying so hard to look Yale. Perhaps it isn’t the same Dembroff, of course. But we were all teenagers once, trying to find our place in this world. I think I would have liked teenage Dembroff.

Dembroff’s dissertation, boiled down to its point, is that debates over things like “trans identity” don’t really matter because we ought simply to be kind to one another. I think statements like “What matters for determining ethical gender ascriptions are normative questions about how we ought to perceive and treat others, and not facts about who is a man or a woman. This claim has an important implication: It may be unethical to make true gender ascriptions, and ethical to make false ascriptions” are quite wrong (reality matters, and basing an ethics on lying has all sorts of bad implications), but I find it at least a more honest and straightforward idea than all of the “gender is a social construct” nonsense.

Still, I question the wisdom of having someone who thinks that social ontology, social construction, and the nature of social categories don’t really matter teach a course on exactly that subject. But maybe Dembroff brings a refreshing new perspective. Who knows.

Let’s finish our article:

“One thing that I have found really encouraging at Yale is that I have been made to feel as if the graduate student community as a whole — including white men — truly cares about working together to create positive change,” Demaree-Cotton said. “This really makes a big difference. The importance of all students and faculty — not just minorities — taking an active interest in these issues should not be underestimated.”

So much social signaling. So much trying to impress the legions of other privileged people, to scrabble to the top, to hang on to some piece of the pie while deflecting blame onto someone else.

I wish people could leave all of this signaling behind.