I disagree that it's immoral for people to value the lives of their fellow citizens so much more than those of foreigners. If I help a homeless person in my home town get a job, I have benefited myself by making my own environment safer and more productive in a way that is not true if I do the same for someone halfway around the world. So it seems to me clearly rational to value my neighbors and fellow citizens more than foreigners. You might then say that even if it is rational, it is immoral, because one should value all lives equally. But that strikes me as circular, since the very question is whether it is immoral not to value all lives equally.

I am not making an abstract point. To me, it seems clearly true that I should love my wife more than my neighbor, my friends more than strangers, my community more than my country, and my fellow citizens more than foreigners. Is there anything inconsistent or illogical about that belief?


I think Richard's error here is to conflate the idea that everyone is to some degree a utilitarian with the idea that everyone engages in and approves of atheistic moral philosophizing.

Most people rely on some combination of moral intuitions (i.e. going with their gut, some mix of inborn disposition and societal conditioning) and religious/spiritual thinking (not necessarily anything resembling an orthodox religion -- it can easily be something as vague as "karma" in the pop New Age sense). "Everyone is a utilitarian" really describes a moral intuition: if we can help two people instead of one with equivalent effort, then sure, let's help two. This is distinct from the atheistic moral philosophy of utilitarianism, which, ironically, is without utility for most people.

Only a small slice of the population places any weight on atheistic moral philosophy: the idea that, although there is no ultimate meaning to anything, it's still a worthwhile exercise to start with your moral intuitions but then construct elaborate mental frameworks around what is good and bad -- even when this takes you to morally unintuitive places. For what reason? None, really -- just a semi-autistic, systematizing personality that has rejected religion and spirituality while at the same time not being content to rely entirely on intuition.

Naturally, most people really dislike this exercise because most people dislike weird, autistic things. They also dislike when people promote things that strongly contradict their moral intuitions, like having sex with dogs.

If you have the atheistic moral philosophizing personality type, then what you're doing seems clearly better than what the normies are doing: you're developing internally consistent systems! More internally consistent = more true! But that doesn't follow when we're talking about systems that are ultimately meaningless. Since the only reason non-religious people care about morality is that they don't want to contradict their intuitions (i.e. their conscience) and feel bad, and since systematizing requires approving of morally unintuitive things, systematizing is clearly worse for them. Plus it requires a bunch of extra work.


> Based on what EAs have written, I have replaced much of the shrimp and chicken in my diet with beef and pork, which they say gets rid of most of the harm, but I'm not going any further than that.

I think this argument is totally mistaken, and the net effect of your actions has sadly been to increase, not decrease, suffering.

As I argued here https://akarlin.com/animals/, capacity to suffer is likely correlated with cognitive capacity. Consequently, even taking bulk into account, it seems better to eat beef or chicken than pork, since pigs are about as smart as the smartest dog breeds. It is far better still to eat fish (salmon in particular mostly die bad deaths anyway, so I don't think catching and eating them even increases suffering over the baseline fact of their existence). And one can destroy and consume arbitrary amounts of crustaceans (a lobster has about 100,000 neurons, fewer than an ant's 250,000) and be ethically almost indistinguishable from a vegetarian. After all, the vegetarian still ends up swallowing some bugs.

As you probably gathered from the above, I am mostly a pescetarian, although I do eat chicken and beef, for protein and taste respectively, and for cooking convenience. If fish could be made relatively more convenient, I would be happy to switch over entirely until the arrival of lab-grown meat.


Agree with almost everything except the speculation about whether EA ideas would propagate better without a movement.

It's true that many EAs are former bleeding hearts, but a non-trivial number are people like me, who don't score particularly high on empathy or compassion but were convinced by the arguments. And most people in this camp, even if often neuroatypical, aren't autistic enough to make decisions or sacrifices based on these arguments unless there is some mechanism for social rewards. The inertia is massive -- in my case, working in biosecurity is something no one else in my MBA class even considered. The reason I did was that, in addition to being convinced that some lunatic or jihadi group could very plausibly kill millions of people with an engineered pandemic because of our collective stupidity, I knew it would give me a professional and social community I could go back to if this weird bet didn't pay off. In other words, it's good that we have a community where status somewhat tracks how much good you do. EA doesn't do this even close to perfectly (something I want to write about another time), but it does it better than anywhere else.


THE VALUE OF A LIFE VARIES DRAMATICALLY

"people are really dumb for worrying so much about school shootings and terrorist attacks relative to other things"

- Ok, but these things obviously aren't like other crimes, and definitely not like poverty in Africa. What's happening here is that people, even when they won't admit it, value the quality of the victims, not just their number or the raw amount of suffering.

The average homicide victim has racked up a crime spree of his own and was a net negative to society. The average victim of a murderous husband wasn't exactly an innocent bystander either: she had any number of opportunities to leave, and maybe even cooperated with his abuse of their children. And if you look at the demographics most affected by murder in general, you quickly realize that these are not exactly society's most valuable people.

Most school shooting victims, by contrast, are in fact innocent, and of relatively high human capital. Few places could have held more human capital than the World Trade Center on 9/11. How many average lives, not to mention average murder victims' lives, were these people worth?


The factory farming thing is not exactly new or some kind of innovation by EA. PETA has been around forever. The reality is that people care more about people than animals, and there isn't any logical argument you can make against that. The fact that EAs treat this as some novel problem they were the first to spot, and then take it for granted that you are obligated to value animal life as highly as they claim to -- the same presumption everyone hates vegans for -- really does not endear them to people.

Going back to PETA, people don't hate PETA because PETA is some kind of moral mirror that actually makes them feel bad. They hate PETA because PETA are annoying, obnoxious moral busybodies wasting everyone's time with a problem that, to be frank, most people don't care about and don't consider a problem at all. EAs are less overtly meddlesome, but the same basic thing applies.


Effective Altruists are basically utilitarians who have finally realised that utilitarianism is actually really complicated and thus requires a lot of math. This is a step forward, but they are still massively underestimating how complicated utilitarianism really is. They are analogous to socialists who believe that AI can solve the economic calculation problem.

Some of their recommendations are good, like that you should try to eat only free-range meat. But all of their good recommendations have already been made by others, and so have some of their bad ones (note that SA cited EA as influencing the British government to abandon its correct pandemic policy and embrace its failed lockdown policy).


Strongly feel like EA people can't figure out why others dislike their philosophy because they seem unable to conceive of a moral philosophy outside their own. Even when they try to understand others, as you've done here, they can't get outside the window of their own philosophy, so they never see why anyone is disagreeing with them and conclude it must be deficiencies on the part of their counterparties -- feeling dumb, feeling defensive, feeling imposed upon. In fact, the most common moral stances are totally unrecognizable to utilitarianism, and no amount of analyzing them *in terms of utilitarianism* will ever comprehend them.

No point in disagreeing with the article line-by-line; you'll get the idea from the intro:

> I think EAs miss just how much their ideology offends people by its very nature.

I agree with this wholeheartedly.

> EAs are, for example, attacked for being both "woke" and "white supremacist," which is an indication that each side is basically just throwing out the worst epithet they have at the group.

This is just dumb; different people say different things for all sorts of reasons, and the only thing less useful than averaging over the opinions of a large, disparate group is averaging over the opinions of the complement of that group.

> The main problem I think people have with EA is that it is a mirror. It tells you exactly what you are doing wrong, and why.

Completely disagree. The reason they dislike it is that it strikes them as profoundly immoral. Especially immoral is the fact that EA can't comprehend how EA could be immoral.

> Everyone accepts utilitarianism to some degree.

The degree to which they *don't* is what you're entirely missing.


I stopped at "I am too masculine, smart, and brave..."


The correct answer to animal suffering isn't to reduce the efficiency of our farming methods, but to breed dumber animals which don't suffer as much under the same conditions.


I pretty much agree with Richard throughout, but I still think that EA is basically just [vaguely Christian / Buddhist religious sentiment] x [highly educated elite]. The farther it goes down the rabbit-hole of its particular theology (in this case, utilitarianism + sci-fi novels), the less admirable it tends to become, just like any other religion. If you devote significant resources to stifling AI development because your religion tells you that algorithms running on computer processors will develop magical powers and cast an evil spell on all of humanity to make us kill ourselves, you're not much different from the church lady screaming about how you should remove the roof of your house because the Rapture is happening tomorrow, which is to say, a fucking loon. But that doesn't mean there isn't an intellectually defensible, highly admirable form of EA/Christianity/Buddhism/Islam/Judaism/whatever that focuses mostly on being a decent, pro-social person in your day to day life.

If the EA theologians really believed their own utilitarian bullshit, they'd be screaming at anyone who supported Scott Alexander's decision to donate his kidney, because it resulted in a 1% higher chance of Scott being incapacitated later in life, which would prevent him from proselytizing for EA quite so much, which would result in a 0.001% lower chance that some brilliant child would someday convert to EA and come up with the one true foolproof way to stop robots from doing a Skynet, which is the only thing that matters. The more an EA person tries to argue from suspect premises that have no support outside of their sacred literature (e.g., there's a thing called "superintelligence" that nobody has ever seen but is at least 78.6% certain to end the world some time later this year), the less I am inclined to view their adherence to EA as healthy or admirable. That being said: most people who would classify themselves as EAs seem very healthy and admirable! Just like most people who would classify themselves as Christians. All IME, of course.

Personally, I use utilitarianism as a heuristic when making certain types of decisions, while recognizing that it has plenty of limitations (which have been pretty well documented by moral philosophers over the last 300 years). And the older I get, the less useful I find it as a way of guarding against moral error. In fact, it seems to lead otherwise smart people morally astray more than just about any other article of faith. The entire premise of utilitarianism depends on being able to predict all future consequences of one's choices (including how you talk to other people about those choices and how they respond) into eternity. So it only appears to work as a reliable moral guide for people suffering from overbearing hubris to the point of megalomaniacal delusion.


I don't hate EA. I mostly scoff at it while appreciating some of its value. Obviously charity ratings are very useful; who among us can do that kind of diligent, drill-down legwork? I review various ratings now and then over the years as I evaluate charities and decide how to donate.

But EA types, IME, tend to wildly overstate the "objective" and "obvious" value of their simplistic (and inevitably "biased") numeric formulas, presuppositions, and resulting evaluative conclusions. Scoff. So I look well beyond that, when seeking input on the charities I choose to support.

That so many prominent EAers think their judgments are superior to others' because they've won the economic survivorship/success lottery is quite off-putting, but I just shake my head, roll my eyes, and sigh resignedly at the foolishness of humankind. Then I ignore a decent chunk of what they have to say.


I really like EA and generally loathe the people who loathe it, but I wildly disagree with the overall assessment here. First, there's no "one" reason people dislike it, because the reasons are highly bifurcated by ideology. Rightists hate it because they oppose egalitarianism in general, and more specifically they hate the idea of forking over vast resources to save Africans, whom they regard essentially as some sort of mammalian locust swarm. Leftists, on the other hand, hate it because it prominently features people they culturally pattern-match to "libertarian tech bros" who don't share their political bêtes noires (preoccupation with fighting racism/capitalism/transphobia/"justice"/whatever) and who don't consider revolutionary change to be desirable, much less absolutely mandatory.

Second, at least with regard to the latter camp (leftists), I think they dislike it, in a sense, because it doesn't shame people or call them Hitler *enough*. That is, EAs are generally extremely chill, affectively positive, conflict-averse people who favor emotional carrots over sticks. But the most strident leftist critics of EA, like Timnit Gebru, are the polar opposite of that, and love nothing more than blaming the problems of subalterns on the actions or indifference of heartless Westerners (specifically white Westerners).

If I were tasked with popularizing EA in some Leftist (with a capital "L") space, I'd advocate for all the same causes with all the same fervor, but I'd go out of my way to emphasize notions of guilt and complicity of Western societies over literally everything else. Africa is only poor because of centuries of ruthless colonialist resource extraction, which *you* personally benefit from every day, you monster, so the very least you could do is donate 10% of your income. Call it "direct action" or "mutual aid" or "reparations" or whatever. Make sure to include a lot of historical anecdotes about affluent white people visiting Third World countries and behaving in some entitled manner in the midst of crushing subsistence poverty, and go out of your way to compare those people to complacent Americans who feast in blissful ignorance while the rest of the world burns. Drop the word "justice" at every possible opportunity, and loudly insinuate that anyone who doesn't participate hates "justice" and favors oppression. You get the picture.


1. Richard, did you ever read the Caplan vs Huemer debate on animal rights?

2. Is it worse for a chicken to be eaten by a human than get torn apart in the wild by a fox?

3. "People often say that EA has a cold or autistic feel to it, but the irony of this is that modern Westerners with their WEIRD morality seem that way to much of the rest of the world." Isn't WEIRD just white? Won't that disappear if the West is turned into the Third World demographically?


Put more simply: EAs vibe like "experts." We're currently in an expert-aversion economy of ideas.


EA, being basically extreme utilitarianism, has the same issue that all utilitarianism has: it requires you to be able to accurately evaluate the proper utility function, which is a hard problem in general. With deontology (e.g. "thou shalt not kill"), you are unlikely to get the most moral outcome imaginable, but you are also unlikely to cause disaster. With utilitarianism, maybe your master plan to save the world works out, and the ends end up justifying the means -- or maybe you are Stalin, you deluded yourself into believing that communism could work, and now millions are dead for nothing. IOW, deontology is the allowance that utilitarianism makes for human fallibility.

EAs, having higher IQs, are less likely to make certain kinds of dumb mistakes when evaluating utility functions. But, being human, they still have many weaknesses -- arrogance, for example -- which can lead them astray if left unchecked. Scott Alexander is pretty humble, so I am not worried about him; you, Richard, are not humble at all, so I trust your version of EA less.
