Moral differences in mediocristan

Scott Alexander writes:

Utilitarianism agrees that we should give to charity and shouldn’t steal from the poor, because Utility, but take it far enough to the tails and we should tile the universe with rats on heroin. Religious morality agrees that we should give to charity and shouldn’t steal from the poor, because God, but take it far enough to the tails and we should spend all our time in giant cubes made of semiprecious stones singing songs of praise.

He suggests that, for moral systems that give similar advice for day-to-day life, these are surprisingly divergent visions of the highest good:

converting the mass of the universe into nervous tissue experiencing euphoria isn’t just the second-best outcome from a religious perspective, it’s completely abominable

But what strikes me about them is how similar they seem, when you strip away the decorative metaphors.

First of all, in both cases you can afford many more instances of the best thing if you simulate it than if you make a real one. So in both cases the universe is better converted into computronium than into actual rats or heavenly choirs - the substitution of "nervous tissue experiencing euphoria" for "rats on heroin" is an implied acknowledgement of this. Nor should we imagine that religion eschews such optimization. Many religions promote asceticism, which allows more humans to subsist and praise God on the same resources. Moreover, the Bible urges that we be fruitful and multiply.

But also, it's not at all clear that an imagined end state of constant songs of praise is meaningfully different from the hedonic-utilitarian state. After all, of what interest is a song of praise if it is not an expression of genuine appreciation? And, couldn't you - by economizing on the actual churches, or all the parts of the mind not engaged in songs of praise - make your religious heaven more efficient and therefore have more of it?

(Anyone familiar with Dante's Paradiso will have recognized a vision of heaven - where we want as many people to go as possible, of course - identical or nearly identical to the hedonic utilitarian goal - the souls of the saved forming a sort of Matrioshka brain orbiting about a reference point of maximum goodness, basking in the energy radiating out from it in eternal unchanging bliss.)

I don't see how this optimization ends up with anything but "nervous tissue experiencing euphoria" - simulated nervous tissue, of course. There's an implied disagreement on whether rats have minds sufficiently complex to sing songs of praise, but that's an open question among hedonic utilitarians (and among the religious) too. In any case, it's reasonably likely that if you tried to simulate just the expression of appreciation, it wouldn't matter much whether you started with a model of a human brain singing songs of praise, or a rat brain enjoying heroin.

It's actually in the near term that these visions of the good diverge. Here's what I see as the actual, applied disagreements between a few major schools of thought.

Utilitarianism implicitly recommends that we make decisions by directly quantifying the good our different actions might accomplish, and doing the one that scores highest. Communism seems to think the people doing all the work should coordinate to seize control of the social systems being used to extract things from them, and then gradually wind them down, though step 1 never seems to work out. Fascism seems somewhat like a more cynical alternative to Communism that doesn't bother pretending there's a second step, which appeals to people who want to boss others around even if they don't get to be at the apex of the hierarchy.

Christianity thinks that we should make more people Christian, and have them listen to sermons or read books about helping others (or, in a surprising variant, subject themselves to Roman imperial administration despite the ostensible collapse of the Roman empire quite a while ago), and also actively try to help people around us in order to send a credible signal that Christianity's a good thing. Islam seems to think it should gradually conquer the world by a mixed strategy of conversion by persuasion and seizing and holding territory, and administer an unambiguous code of laws with clear lines of authority and social roles. Judaism thinks that Jews should have children and train them to engage in an intergenerational project to develop a perfected code of laws with a history spanning millennia, in the hope that eventually Jews will have something good enough that other people will want to adopt it, and in the meantime doesn't have much in the way of advice for non-Jews. (It's unclear to what extent Christianity, Islam, and Utilitarianism constitute partial successes for this agenda.) Buddhism seems to think we should teach people how to chill out - don't sweat the small stuff, and it's all small stuff - in the hopes that this will cause a persuasion cascade similar in mechanism to the Christian one, but without promoting either the weird book or the Roman imperial bureaucracy.

This can get confusing because utilitarianism tends to foreground the idea of a particular end state - that's the utilitarian strategy - and utilitarians tend to misunderstand people with different strategies as having different ends in mind. But in practice, most people most of the time don't have a global highest end in mind. They're implementing a strategy that feels right or advantageous to them, or one they've been acculturated to implement, and correctly attending to the means it directs them to, since they're unlikely ever to be in a position to directly specify ends globally, and any long-run strategy for programming the universe is likely to involve future generations better situated than we are to specify the end state.


Related: Against Responsibility, The Basic AI Drives

Comments

  1. Zvi Mowshowitz

    Always great when someone beats me to pointing out something that I noticed but couldn't quite lock down properly. If you're all about the most efficient production of [thing], or equivalently you're all about having the most [thing] or [thingness] possible, including if you think [various things] can trade off via multiplication, then whatever object most efficiently produces [thing] becomes what you want to do with [literal everything].

    Which is actually even more convergent than the strategies used by different candidate [thing]s to convince people to make more of them.

1. Benquo (post author)

      We have at least one influence in common here aside from Scott's post - Michael Vassar pointed out to me that to an outside observer FAI and UFAI look pretty much identical - a sphere of computronium expanding at just under lightspeed until it engulfs the accessible mass of the known universe. I also found Hegel helpful for learning to notice this kind of thing, though not enough to affirmatively recommend him to anyone yet.

  2. Andaro

    I think value has to be linked to personal identity. After all, utility is always someone's utility. I happen to not value hedonium or theistic worship, so they're not a desirable end goal. Perhaps others disagree, and this makes me care (negatively or positively) about what they care about, in the spirit of reciprocity or just practical diplomacy. But then those preferences are linked to the identity of those entities, and what they refer to is irrelevant. It makes no intrinsic difference whether those others are paperclip maximizers, hedonium maximizers, worship maximizers or something else entirely.

  3. TheWakalix

    I think the cult accusation may have been referring to the religious comparisons in the post or the expanding sphere of computronium. Although I’m not sure there’s any meaningful point being made - when someone wants to say something, they’ll look for any excuse to say it.

