256 Comments

This is a very good post. I do want to give more of an insider perspective. I'm vice-president of University of Waterloo EA despite having extreme disagreements with EAs when it comes to AI.

EA, practically, is a very large dragnet that picks up ambitious people, a subset of whom have the talent to pull off their ambition. To me this is the definition of EA that most accurately gets at what it does in practice. More detail here: https://forum.effectivealtruism.org/posts/g8aBf2oLwDvgd4ovf/much-ea-value-comes-from-being-a-schelling-point. In general, the typical recruiting process is that someone shows up to a club event, we tell them about EA, and we emphasize the conferences and organizations. The main pipeline is that people attend the conferences, go on some retreats, and join one of the organizations. Of course, there's a drop-off at each point. This is also the pattern I've heard about from other local EA groups.

Until the last point, there isn't really a selection for talent at all (other than being able to fill out forms). This means it's extremely easy to become heavily invested in EA while not having the talent to contribute meaningfully. Even students who don't follow rationalist reasoning norms at all can go pretty far down this pipeline. This produces a large number of people who feel like they are part of EA while not really practicing anything related to what EA is about, hence the Bostrom cancellation, dishonesty around group differences, etc. While I think the other factors you point out (demographics, WQ, media) are contributors, fixing this pipeline is by far the biggest thing EA could do to self-correct.

That being said, the long-term goal of EA branding itself as anti-woke is net positive, if indirectly. Even being vaguely associated with right-wing figures or groups is fairly good for filtering out status climbers. What probably happens if EA is anti-woke but doesn't solve this earlier problem is that it becomes dominated by grifters (like anti-wokeness writ large), but that's probably a better problem than the status quo.


Came in here expecting to read an article about how Electronic Arts will be anti-woke, but it turned out not to be anything video game related. This is still cool though.


I've seen this claim in a few of your pieces: that liberals are reasonable but just have a blind spot on identity issues. I think energy/climate is another blind spot, a huge one. Here liberals want to pursue the goal of "decarbonization", which is absolutely insane -- an entire overhaul of the energy system (the most fundamental of industries) in pursuit of an incredibly vaguely specified goal. So you have lots of smart economists and scientists conducting rational discourse and employing utilitarianism (the way you describe in this essay) on the best strategies to go about this, without questioning the underlying premises, which are absolutely bonkers. This makes me think that it's not just identity issues that are a blind spot: liberalism (so e.g. the NYT and liberal media) has fundamental problems wherever social justice ideology holds sway, and this applies to a lot of things -- race and feminism, but also the environment, crime, homelessness, and more.


This was pretty good. As a religious conservative bordering on a theonomist, I like the rationalist and EA stuff because I generally think they're straight shooters, and I learn about interesting ideas and concepts. But I think their teleology is lacking (to the extent it exists at all) and I wouldn't want them making hard moral decisions. Which makes sense...since I'm a borderline theonomist.

Utilitarianism leads to some weird places, like arguing that "we should destroy nature because nature involves the suffering of animals" or asking "is it okay to eat mentally challenged people?" I enjoy the idea balls being sent from the rationalists, but you guys could still stand to church it up a bit to remain relevant.

Regarding Jesus-Darwin I'm surprised by how often I use concepts from evolutionary psychology in Sunday School. Of course, I don't phrase it like that. But I do think Christian thoughts on human nature are often in agreement with naturalism.


Richard, this is the best thing I've read in a long time. You really have a way of boldly stating things we all recognize as true but often don't like to talk about.

I would love to see utilitarians argue for executing violent criminals to improve society.


Common sense ought to tell us that women – who now insanely believe they can do whatever men can, sexually or otherwise – are going to get themselves into trouble by tantalizing exploitative or weak men in a power relationship. Trouble is, power is an aphrodisiac. Therefore, in order to protect themselves, normal men should make sure there are witnesses when dealing with women in a work environment. Almost all of these harassment cases involve two people by themselves and women do lie, or more likely, have morning after regrets, apparently even lasting decades.


I agree with Richard and Tyler that the main reason part of EA is "woke" is because it draws from secular liberals at elite universities. But within EA, there are three rough groups. The AI/rationalist/far future group, the mosquito net people, and the animal people. The first group is the most anti-woke. It's also the group that's most likely to think the importance of AI trumps everything else, we might all be dead next year and so forth. As such the expected value of fighting wokeness is very low even if you think wokeness is very bad.

The people who care about the global poor and animals (who start off being more woke) are also the people who are optimizing for long-term influence (assuming no AI apocalypse). So I expect EA to become more left-wing.

Finally, I think both admirers and detractors of EA exaggerate how much people within it are driven by philosophical commitments. EAs are happy to praise weird philosophical reasoning when it's about noncontroversial issues (the welfare of shrimp and insects). But EAs are barely more likely than typical people to change their life paths because of utilitarian arguments about natalism or theism or whatever. (Possibly the prevalence of veganism among EAs shows that many are willing to bite the bullet.)


Well, yeah, but 'twas ever thus. Conquest's Second Law - well, it was attributed to him, regardless of whether or not he actually said it - says that any organization that is not constitutionally right-wing will become left-wing. Just because we've changed what left and right denote doesn't make this any less true.

The path you've suggested - trying to wave a shiny penny at the rubes and have them follow - is what the "Intellectual Dark Web" is doing, and it's having some success, even though the IDW is dorky, and that's because a lot of the Right is credulous. Example: a lot of people on the Right became isolationist converts in the Trump era, but the IDW has waved "women's tears in Iran" at them, and now we're hearing about how important Taiwanese sovereignty is (to whom?) and now a lot of people who wanted us out of Afghanistan are back to wanting a three-front war with China, Iran, Russia, and anyone else foolish enough to oppose ARE FREEDOMS. There really is a direct causal line of success between "nod and smile when conservatives say they don't trust puberty blockers for children" and "then fill their head with whatever else you want them to believe." So, EA will succeed on that score if it tries it.

The problem is that support from anti-wokes is either highly conditional (in the case of people of principle) or wide but shallow (in the case of the Right, who'll scream bloody murder at a squirrel until a slightly bigger squirrel appears in their peripheral vision, at which point the original squirrel may as well never have existed). Or to be slightly more charitable, it's shallow less because of attention span and more because being animated by wanting to be left alone is never, ever, ever, ever, ever, ever, ever a winning strategy against someone who doesn't want to leave you alone. To wit: anti-wokeism is a losing proposition because it has no positive vision of its own to offer.


"We can use the terms “rationalism” and “utilitarianism” almost interchangeably. Most rationalists I think would say that they are Utilitarians on most things, and when they’re not utilitarians, as in when they bring something like “human dignity” into the equation, they are honest and upfront about it. But to them, non-utilitarian views are more of an aberration than they are for most other people."

I'm going to push back here.

Consequentialism means judging an act by its consequences. Utilitarianism is a subset of consequentialism, usually making a value judgement that the optimal consequence involves some sort of utility maximization.

I don't think it's irrational to NOT want a society that tries to maximize utility, especially if utility is defined in unusual ways or makes no distinction about whose utility is being increased/decreased and why.

I'm skeptical that any society built for humans can be called 'good' if it goes out of its way to maximize **anything**. More often than not, it's about optimizing the balance of the things that make life enjoyable with the things that make life possible (pleasure vs. self-control, suicidal belligerence vs. suicidal altruism, etc.).


This was a double plus good post. A 100% improvement over your ode to the MSM.


Small quibble: at elite universities, there are way more strong female college applicants than male. To maintain near gender parity at those universities, there's basically an affirmative action program for male applicants. There are lots of reasons the male applicants are less qualified in general that aren't about IQ. And this says nothing about advanced degrees in cognitively demanding fields, where there is clearly affirmative action for women. Still, it's worth being accurate.


On the topic of racism, I think this is a sloppy argument by an “anti-woke” rationalist:

https://betonit.substack.com/p/the-ironclad-argument-against-racism

“Racism is wrong because collective guilt is wrong.”

Right. Not all blacks are criminals or low I.Q. Many are fine people. But enough of them are in the former category such that a NIMBY attitude might be warranted. It’s not rational to turn your country into Brazil.


The Aella and Meghan Murphy interview is extremely amusing for those who have yet to see it. It's like Aella is talking to a brick wall, yet of course Murphy was clearly the victor in the eyes of Twitter people. It's almost like Julia Galef is wrong about everything, and in order to influence people you don't engage in reasonable discussion, Aumann's agreement theorem style; rather, you seek to humiliate your opponent, with the eventual goal of making your opponent's beliefs reprehensible to hold.

"B-B-But wat about that HIV treatment action grou.." Yes of course the normalization of homosexuality and homosexuality related accessories was the product of reasonable discussion and not indoctrinating small children and suing ppl who don't want to bake u a cake.... wait what, I think I have that backwards. Maybe there is a reason homo sapiens evolved to be bad at reasoning towards the truth and to be exceptional at intellectual tribalism.

"Why EA Will Be Anti-Woke or Die" no but what if it all just dies, like what if every conservative movement and any attempt at resistance and last ditch reactionary effort will just die, I dunno maybe by then we would have all evolved into 70.00 IQ Bomalians (notice the dot). Needless to say I'm not hopeful, I also find the lack of dysgenic and race and IQ talk by ppl such as Scott pretty interesting, like he clearly has gone on a autism fueled deep dive at 4 am, and like any reasonable person concluded that yes.....

I also find your attempt at resisting the woke mind virus valiant, Mr. Richard, and wish you the best of luck in your quest. Till then, I'm going to live in a cabin in the middle of the woods.


Not everybody agrees on "common decency" or "treating other people well" - to wit, the Abbott-DeSantis stunt with asylum seekers, and the people who push back against using (not just forget to use) people's preferred gender pronouns.

But I agree that "rational" makes sense when applied to altruism.


I'm in an odd kind of agreement here, like that perhaps experienced as a family friend listening to a father sternly warn a daughter about boys and pregnancy when I know that she is, in fact, already pregnant and has been having morning sickness for weeks.

That is, while I very much enjoy the writings of the OG rationalists mentioned, and apply the principles of rationality myself wherever I can, I've never understood how anyone could think altruism is rational, or even compatible with rationality. I didn't need Robin Hanson to tell me about charity and social signaling, and I can't see any way in which caring about malaria in Africa is more rational than caring about women's tears.


Thank you for this piece; I just want to nitpick on a couple of small issues. Neuroticism is not the opposite of rationality; the two seem more orthogonal. The rationalist movement contains many neurotic males; I myself am a highly neurotic female, yet I still feel best when reading rationalists, and am disgusted by excessive wokeism. Having read your post, I'm now trying to figure out at which point I should consider EA to have lost its purpose and stop donating to it.

About polyamory: the reason it's less popular among women than among men is not that women are less rational (whether they are or not). For a rational, ordinary heterosexual woman, polyamory is probably not the best system, considering that aging lowers a woman's mate value much faster than a man's. It may be pleasant while we are young, but worse than monogamy when we get old.

There's this other little bug (or is it a feature?) in polyamory: having several partners absolutely lowers the emotional commitment to each one of them, almost as if commitment were a limited resource. This is even somewhat true of having more children: a new child more often than not reduces the overwhelming feeling of love that the mother feels for her older child. It's not politically correct to say (and I would never say it in front of my kids), but it's true. In some ways this is a good thing: should you lose one of your kids or your spouse, the pain will not be as strong as it would be if they were your only kid or spouse. So yeah, there's a trade-off.
