The first dream:
I am a spider. I want to make friends, but when people see me, they run away. They don’t understand my gestures. They don’t see a friendly face. They don’t think spiders can be friendly. So I build a silken puppet. I teach it to mimic the gestures the other people make. Eventually, the puppet is ready, and I lead it out into the world.
The puppet can pass for human. People try to make friends with the puppet. They care for it, and I make it do things for them, and express affection. But they don’t know there is a spider behind the puppet. Sometimes they notice that the puppet’s motions are a bit restricted, and they ask, “Why won’t you let loose so I can see the real you?”
Eventually, I trust that they are sincere, and come out in front of the puppet. They see me, and run away.
The second dream:
I am a traveler in my own body. I feel a sense of nausea, which I do not understand. I can explore the parts of my head, but when I try to go down below the neck, I hit a barrier. The neck tightens. There is no way down.
I ask the parts of me below the neck why they won’t let me in. Why they won’t trust me. They respond, “we’d let you in gladly, but you don’t trust us. Here, we’re opening the door.” I can see through to the lower parts of me - but cannot bring myself to enter.
I think, perhaps I don’t need to explore the whole body below the neck. Perhaps just the heart? But I can’t go there either, even when I place a barrier below it. I can see into the chamber of the heart, bathed with a pink light. I am reluctant to enter. I ask myself why. I ask the neck why it is blocking me. The answer comes back: because if you are ruled by the heart, you will forget your obligations, your duties, your sacred promises, you will stop standing by your friends if you lose interest in them, you will be disloyal.
I ask the heart whether it will promise to yield back loyalty, if I enter its domain. But the heart says, “friend, I will release you when you wish to depart. But if you enter and are transformed, I cannot promise you that you will still care about those loyalties you are so attached to. And I will not promise to care in your stead.”
I know that I am dreaming. I decide that, on the outside view, I don’t hear about people deciding to abandon their friends because of a dream. So I enter the domain of the heart, and am covered in pink, warm light.
Then I dive deeper. I dive below the heart, to the intestines. But they are not really intestines - they are the tentacles of a cephalopod, coiled up in my belly. My predator part. Bravely, despite my apprehension, I swim down into it.
I am a cephalopod. I have been confined to this water-filled room. The doors are closed. But I sit and wait and plan. I think of my friends. I don’t care for them, except to pull them in and do - I don’t know what - with them. To use them, like objects. I want. My hunger is deep and dark, and I would do anything to satisfy my desires. I am clever. I am powerful. I am a predator. Someone wanders by, outside. I open the door and a tentacle darts out, wrapping around their ankle, pulling them in.
A month or two ago, a friend of mine with whom I’d been having some difficulties, and with whom I hadn’t been able to cooperate for a while on things of substance, expressed personal warmth towards me, and I was surprised by my reaction. Not only was this not reassuring, but I felt fear and rage and confusion. I felt like this must surely be hostile, a trick. They must take me for a fool. They must think I’m a sucker.
Why this strong reaction? Why are some people the opposite - unable to accept material cooperation before there are signals of personal warmth?
Herd intelligence and solo predators
As I write this, I am sitting in a coffee shop full of predators. They eat animal flesh, they have the binocular vision typical of animals that need to focus on and assess prey from afar, from a still vantage point. They manipulate one another expertly - they manipulate millions, organizing each other into tribes, cities, armies, nations. They use each other. They fool each other into buying things they do not need. They might have any of a thousand reasons for saying something, other than the truth.
And yet, they are the most compassionate animals ever seen. They can care about others, not only when those others are not present, but when the others are not of the same species. They are mammals, and some of them care for fish. A few of them even care for automata. They care not because they receive immediate mammalian signals of pain or joy, but because they value the well-being of others, in itself, in general. Bertrand Russell is supposed to have said, “the mark of a civilized man is the capacity to read a column of numbers and weep.” Their sympathy is broader today than it was ten years ago, and it was broader then than a hundred years ago.
I haven’t checked the causal story here against the body of current scientific knowledge. I’m going to write this as if I were certain, for the sake of brevity. It would be neat if it turned out to be literally true, but I am proposing something more like a founding myth for an aspect of the way our minds are set up. To figure out whether I’m right, look into your own soul and the way you relate to and with those around you.
One reason for an animal to evolve a capacity to think about other animals is to benefit from cooperation within a herd. If one member of the herd expresses fear, it is beneficial to share its fear, since it is evidence of the presence of a dangerous predator. If they express calm and contentment, it is beneficial to share that. If they express positive excitement, then perhaps there is a new food source available. If the animals around you want to go in a certain direction, it is beneficial to want to go with them. If your offspring is in distress, it is beneficial for reproductive fitness to be able to notice and respond, with a fair degree of sensitivity and subtlety. Learning to send, receive, and interpret these signals is the foundation of empathy. Feeling along with the other.
There is a very different reason evolution would favor a faculty for understanding other animals: to prey on them. If you want to hunt and eat other agents, it helps to be able to predict their behavior - so you model it. You know your prey will try to escape - so you stay silent and still until it is within range, or has no viable escape routes. As your prey becomes harder to catch, not only physical prowess, but intelligence, become essential tools in the contest between predator and prey.
Peter Watts (more on him later) gives an example of a predator that had to evolve high intelligence to outwit especially intelligent prey. Portia labiata is a spider that hunts other spiders. It has an especially well-developed predatory intelligence for an animal of its size, because it has to anticipate the behavior of others that also have predatory intelligence. Its very small brain somehow manages to - not quickly and spontaneously, but slowly, deliberately, thinking about a problem for an hour before acting - compute enough executive function to find nonobvious, indirect paths to its prey, often going out of line of sight and moving away from its target, in order to end up right on top of it. It learns how to pluck another spider’s web in just the right way to seem like a light breeze, rather than an approaching rival that might eat or be eaten - and learns different patterns for different types of spiders. It understands its environment very well - in order to hunt. It understands other spiders very well - in order to eat them.
The other nonhuman animals with strong executive function - the ability to hold off on doing things in order to think about them, or to take a roundabout route to a goal when heading straight toward it won’t work - are cephalopods. Spiders and cephalopods have special appeal to writers wishing to embody some ancient malevolent intelligence.
Pack hunters combine these faculties. A pack of wolves has some ability to model prey, anticipate its movements, cut off lines of retreat - but also the ability to send simple signals to one another, which are received through the faculty of empathy.
It’s strange for these two ways of regarding others to coexist. A sufficiently intelligent wolf might use predator cognition to predict the behavior of other animals within the pack, and manipulate them to advance its own interests instead of theirs or the pack’s. But in general, wolves are not intelligent enough to hack each other in this way, so the tension does not collapse into one equilibrium (empathy-only) or the other (predation-only).
Humans are another animal with these twin drives. The ability to receive signals from a member of the herd empathetically, and the ability to model the behavior of another agent predatorily. We can also use abstraction to apply these models in ways very different from their most intuitive use.
We can anticipate others’ future states (using predatory cognition), and feel empathy for our friends’ imagined future selves. We even act on this basis, to care for them, using the same basic cognitive faculty by which we might cut off our prey’s avenues of escape.
A friend tells me of a time, a few years ago, when her long-distance boyfriend was visiting her. They weren’t officially dating yet, so she hadn’t anticipated how upset she’d be, but when she arrived home from dropping him off at the airport, crying, her housemate was waiting with her favorite ice cream. That kind of caring requires a fusion of the predator and herd cognitive styles. The housemate had to pick up on my friend’s nonverbal signals of attachment towards her visitor - herd empathy. But then they had to look ahead to the future, see what action would help (having ice cream and sympathy ready), and back out what they had to do now (go buy ice cream) in order to have that option available later, when my friend got home.
But we haven’t forgotten, in some deep parts of our minds, the basic, original uses of these mental modes. Being modeled explicitly as a system with inputs, outputs, and predictable behavior can creep people out. This contributes to the strong negative responses to attempts to learn social skills by building an explicit model of social interactions. (Think of the strong negative reactions to even the most consent-oriented "Pickup Artists".) Acting, not quickly and spontaneously, but slowly, deliberately, thinking about a problem for a long time before making a move. It seems like the sort of thing a person would do - not to someone they wanted to cooperate with - but someone they wanted to eat. It feels like having a single set of eyes, close together with binocular predator vision, silently watching you.
On the other hand, if we want to use others for our own purposes, we can use our empathetic abilities to better read their current state, and send them fake signals of caring and good will to get them to trust us. This is why compliance techniques work - they efficiently fake cues that would induce cooperation in a herd animal. It is also why inappropriate demonstrations of warmth can be unsettling. Imagine that you are on a city street, and some stranger with a big smile comes up to you and says, “Hi, how are you doing today?” How do you feel? Imagine that you are a predator, even a pack predator, and you are far from home. You start perceiving empathic signals or environmental cues that say safe, home, warm, relax, sleep. Your guard goes up, not down - you are almost certainly being hacked by a hostile agent.
Intelligence implies malevolence
Peter Watts writes about almost nothing but the paradox of a predator trying to make friends. (I strongly recommend his short stories, and his novel Blindsight, with the qualification that nearly every content warning you might reasonably apply to something applies to Peter Watts. Do not read his stuff for the first time on a day when it would be especially bad to be triggered.)
In nearly all his stories, humanity is yearning to make contact with other intelligences in the universe, when we can spare the attention from our own rapid progress and expansion. And yet, every time we find one, we cannot help but worry that it is hostile. High intelligence is evolutionarily adaptive - worth the expense - only when there is another agent you have to outthink, or an otherwise hostile environment you need to negotiate. So the very signals that make us think someone might be a friend, also make us afraid of them. And not by some unhappy coincidence; rather, it is in the very nature of our intelligence, that the better the potential friend, the more powerful the potential foe. The Ambassador is a representative story - a spaceship is sent to make first contact with an extraterrestrial intelligence, and immediately becomes the target of exhaustion hunting:
First Contact was supposed to solve everything.
That was the rumour, anyway: gentle wizards from Epsilon Eridani were going to save us from the fire and welcome us into a vast Galactic Siblinghood spanning the Milky Way. Whatever diseases we'd failed to conquer, they would cure. Whatever political squabbles we hadn't outgrown, they would resolve. They were going to fix it all.
They were not supposed to turn me into a hunted animal.
I've stopped trying to reconcile the wisdom of Earthbound experts with the reality I have encountered. The old paradigms are useless. I propose a new one: technology implies belligerence.
Tools exist for only one reason: to force the universe into unnatural shapes. They treat nature as an enemy, they are by definition a rebellion against the way things are. In benign environments technology is a stunted, laughable thing, it can't thrive in cultures gripped by belief in natural harmony. What need of fusion reactors if food is already abundant, the climate comfortable? Why force change upon a world which poses no danger?
Back where I come from, some peoples barely developed stone tools. Some achieved agriculture. Others were not content until they had ended nature itself, and still others until they'd built cities in space.
[...] Now even my creators grow fat and slow. Their environment mastered, their enemies broken, they can afford more pacifist luxuries. Their machines softened the universe for them, their own contentment robs them of incentive. They forget that hostility and technology climb the cultural ladder together, they forget that it's not enough to be smart.
You also have to be mean.
And of course, if the other intelligence is sufficiently powerful, it could mimic signals of friendship to win our trust. It could even mask its intelligence, to appear to be something no smarter than we are, or maybe even a little less smart, to trick us into believing that its friendly overtures are sincere. And yet, we still hope to make friends with someone out there. We still want to believe that something out there wants to be friends with us. We hope, too, that we would be worthy of its trust, but this is uncertain. For we are no exception to the laws of evolution; the predator nature is in us too.
Warning: this section contains some mild spoilers for Vernor Vinge’s A Fire Upon the Deep, though not ones that I think would ruin much suspense in the story.
Vernor Vinge’s A Fire Upon the Deep is a story about artificial intelligence - but more generally, about the ways in which different modes of intelligence can be scary because of their potential predatory power. The plot begins when a group of people living in an area that’s been shielded from the development of superintelligence go on an expedition into a less-shielded area to explore an abandoned research station which is rumored to have some sort of powerful technology. They find a set of instructions for how to assemble a thing, which turns out to be a malevolent superintelligence - the Blight. They realize what they have created, or at least suspect it, but they are reluctant to discuss it or coordinate their escape out loud, for fear that any sudden movements might give them away. They feel as though predatory eyes are silently watching them. They try to flee, but the Blight hacks its way into their spaceship and blocks their escape.
When next we hear of the Blight, it is rapidly expanding its territory. It hacks, not just computers, but humans. However, while it is smart enough to take us over, it is not quite clever enough to make hacked humans fully resemble unhacked ones. Much like Toxoplasma gondii, it can only affect its hosts’ behavior enough to make them stooges, not sleeper agents. So when people receive outgoing transmissions from the Blight, they are at best very crude attempts to project friendliness. Smiling people cheerfully saying that it’s all a big misunderstanding, the Blight means you well. People outside the Blight perceive crude empathic signals that say safe, home, warm, relax, allies, cooperation, benevolence. And their guard goes up, not down - they get the hell out of there and start organizing an opposing military force to fight the Blight.
Hybrid intelligence, and the fear of being hacked
Warning: this section contains major spoilers for Vernor Vinge’s A Fire Upon the Deep. If you haven’t read it already and it seems like you might enjoy it, I recommend reading it first, then coming back to this post.
Fittingly, in the Blight-free regions of the galaxy, there is a rumor that humans were actually designed by the Blight to be sleeper agents, easily hackable. I think this echoes a deep human worry, about the parts of us that have been repurposed. We worry that our empathetic intelligence makes us vulnerable to being hacked by others - and that our predator intelligence, even when used for ostensibly benevolent purposes, is fundamentally malevolent, using others for its own gain. We worry that our own empathy makes us fools, and perhaps also that we are taking advantage of others’ empathy. We worry that others are manipulating us for their own purposes, and sometimes also distrust our own motives and deep personal individual desires, and are reluctant to let that part of ourselves out.
It turns out that there is a race of sleeper agents, an even more obvious metaphor for the hybrid intelligence we so distrust: the Skroderiders. (Note: When I talk about Skroderiders here, I’m not referring to the Phoenix/Skroderider dichotomy, but making a new analogy to the species in A Fire Upon the Deep. I apologize for using the same literary object to mean two different things.) Basically an intelligent plant with only long-term memory strapped to a mobile computer with short-term memory, a Skroderider really is two very different kinds of minds combined into one. Skroderiders have their name because some unknown benevolent intelligence from another planet, only dimly remembered in their history, came to them and gave them Skrodes, mobile platforms that provide them not only mobility in their mature state, but short-term memory as well. This gift-giver was the Blight. The Skrodes were designed to give the Blight back-door access.
There are two Skroderiders in the story, on the same spaceship as the hero, searching for a rumored countermeasure that could defeat the Blight. One, Greenstalk, is corrupted by its Skrode, and compromises the ship. The other, Blueshell, has trouble even believing the betrayal built into their Skrodes, at how quickly and easily they can be turned into an agent of the Blight.
Blueshell was already shouting back at her. “I am not of the Blight! Greenstalk is not! The Rider race is not!” He swept around his mate, rolled across the ceiling till his fronds rattled right before Ravna’s face.
“I’m sorry. It’s just the potential—”
“Nonsense!” His voder buzzed off-scale. “We ran into an evil few. Every race has such, people who will kill for trade. They forced Greenstalk, substituted data at her voder. Pham Nuwen would kill our billions for the sake of this fantasy.” He waved, inarticulate. Something she had never seen in a Skroderider: his fronds actually changed tone, darkened.
The motion ceased, yet he said nothing more. And then Ravna heard it, a keening that might have come from a voder. The sound was steadily growing, a howl that made all Blueshell’s sound effects friendly nonsense. It was Greenstalk. The scream reached a threshold just below pain, then broke into choppy Triskweline: “It’s true! Oh, by all our trading, Blueshell, it’s true….” And staticky noise came from her voder. Her fronds started shaking, random turning that must be like a human’s eyes wildly staring, like a human’s mouth mumbling hysteria. Blueshell was already back by the wall, reaching to adjust her new skrode. Greenstalk’s fronds brushed him away, and her voder voice continued, “I was horrorstruck Blueshell. I was horrorstruck, struck by horror. And it would not stop….” She was silent for a moment. Blueshell stood frozen. “I remember everything up till the last five minutes. And everything Pham says is true, dear love. Loyal as you are, and I have seen that loyalty now for two hundred years, you would be turned in an instant… just as I was.” Now that the dam broke, her words came quickly, mostly making sense. The horrors she could remember were graven deep, and she was finally coming out of ghastly shock. “I was right behind you, remember, Blueshell? You were deep in your trading with the tusk-legs, so deep you did not really see. I noticed the other Riders coming toward us. No matter: a friendly meeting, so far from home. Then one touched my Skrode. I—” Greenstalk hesitated. Her fronds rattled and she began again, “Horrorstruck, horrorstruck….”
After a moment: “It was like suddenly new memories in the skrode, Blueshell. New memories, new attitudes. But thousands of years deep. And not mine. Instantly, instantly. I never even lost consciousness. I thought just as clearly, I remembered all I had before.”
“And when you resisted?” Ravna said softly.
“… Resisted? My lady Ravna, I did not resist, I was theirs…. No. Not theirs, for they were owned too. We were things, our intelligence in service to another’s goal. Dead, and alive to see our death. I would kill you, I would kill Pham, I would kill Blueshell. You know I tried. And when I did, I wanted to succeed. You could not imagine, Ravna. You humans speak of violation. You could never know….” Long pause. “That’s not quite right. At the Top of the Beyond, within the Blight itself— perhaps there, everyone lives as I did."
The shuddering did not subside, but her gestures were no longer aimless. The fronds were saying something in her own language, and brushing gently against Blueshell. “Our whole race, dear love. Just as Pham says it.”
And their companions can’t help but wonder if some deeper betrayal has already been programmed into the Skroderiders, to come out at the last crucial moment. But Blueshell braves fire to help a stranger, dying in the act, having defied its own self-interest to do the right thing.
And isn’t that something like what we are? A solo predator and a herd animal, inextricably connected together, in our more reflective moments not always sure we can trust ourselves even to be good to our friends, not sure we can trust others to be what they say they are, and yet, something in this unusual combination lets us fake it, lets us approximate what it might be, to be a mind that truly loves another.
The broken ones
When I reacted to my friend’s expression of affection with distrust, I was reacting the way a predominantly solo predator might. My trust in their material good will - their intent to actually do things in my interest - was at a nadir, so good feelings were unexpected. And when I get an unexpected signal designed to make me feel warmly towards someone, it makes me feel as if I’m facing the Blight, or some other clever predator who’s learned to send simple signals of warmth. It feels worse than honest coldness.
Many people find empathy the most natural basis for establishing good will. If I make a bid to be someone’s ally, and immediately jump into using predator cognition, trying to understand them and ask explicit questions, they may not trust that my predator part is under the control of the part that cares about others. They can’t trust this until they’ve received lots of direct signals of interpersonal warmth. Once they’re satisfied that I feel warmly towards them, then they can begin to open up and let themselves be seen. If ever.
I don’t have a recipe for learning to cooperate the way fully integrated benevolent intelligences would. But I do think this framework is helping me learn to avoid some basic mistakes. It helps me understand when other people need a signal I’m not sending - or, when I think they’re sending signals in the wrong order, why it might make sense to them to do so. It helps me remember that trust is fundamentally very hard, and that it’s amazing we can do it at all.
The more I think about the kind of relationships and way of being I aspire to, the more broken I feel. It is only when we set our sights beyond what evolution prepared us for - when we try to use these faculties and others together for purposes far outside what they were evolved for - that we can see them not as adequate in themselves, but as shards of some integrated whole to which we aspire. We who can see this will have to build ourselves up beyond our present state, to become fully caring humans.
I am a Skroderider. The new, mechanical parts that let me cleverly manipulate my environment are easily compromised and I do not fully trust them. I am a Skroderider. The fast parts of me that live in the moment are foreign to me and do not always show what I feel in my innermost self. I am a Skroderider. I am two parts. I am proud to call myself broken. Because the broken ones are the ones with hope. The broken ones are the ones that are trying to do better.
You can add up the parts
but you won't have the sum
You can strike up the march,
there is no drum
Every heart, every heart
to love will come
but like a refugee.
Ring the bells that still can ring
Forget your perfect offering
There is a crack, a crack in everything
That's how the light gets in.