Monthly Archives: May 2014

Not Enough About Too Much

Existential risk is the risk of an event that would wipe out humanity. That means not just present lives, but all the future ones. Populations can recover from lots of things, but not from complete extinction. If you value future people at all, you might care a lot about even a small reduction in the probability of an extinction event for humanity.

There are two big problems with reasoning about existential risk:

1) By definition, we have never observed the extinction of humanity.

This means that we don't have an uncensored dataset to do statistics on - extrapolating from the past gives our estimates an anthropic bias: people only observe the events that don't wipe out humanity, but events that wipe out humanity are possible. Our past observations are therefore a biased sample that makes the universe look safer than it is.
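
To make that censoring concrete, here's a minimal sketch of the effect - my own illustration, not anything from the post, and the risk numbers are made up:

```python
# Minimal sketch of anthropic (survivorship) bias: observers only exist in
# worlds that were never wiped out, so the observed record always looks safe.
# TRUE_RISK and CENTURIES are illustrative numbers, not estimates.
import random

TRUE_RISK = 0.10   # hypothetical per-century chance of an extinction event
CENTURIES = 20     # length of the observed historical record
WORLDS = 100_000   # simulated possible histories

surviving = 0
for _ in range(WORLDS):
    if all(random.random() > TRUE_RISK for _ in range(CENTURIES)):
        surviving += 1

print(f"True per-century risk:        {TRUE_RISK:.2f}")
print(f"Fraction of worlds surviving: {surviving / WORLDS:.3f}")
# Every surviving world's record contains zero extinction events, so a naive
# frequency estimate made by its inhabitants is 0.00 - far below the true
# risk. That gap is the anthropic bias in the data.
```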

2) Our intuitions are terrible about this sort of thing.

To reason about existential risk, we have to tell stories about the future, think about probabilities (sometimes very small ones), and think about very large numbers of people. Our brains are terrible at all of these things. For some of these beliefs, there's just no way to build a "feedback loop" that tests them quickly in small batches - and that's the main way we know of to figure out when we're making a mistake.

Moreover, we have to do this on a topic that evokes strong emotions. We're not talking about the extinction of some random beetle here. We're talking about you, and me, and everyone we know, and their grandchildren. We're talking about scary things that we desperately don't want to believe in, and promising technologies we want to believe are safe. We're talking about things that sound a lot like religious eschatology. We're talking about something weird that the world as a whole hasn't quite yet decided is a normal thing to be worried about.

Can you see how rationality training might be helpful here?

I'm writing this on a flight to Oakland, on my way to CFAR's workshop to test out in-development epistemic rationality material. A few months ago, I expressed my excitement at what they've already done in the field of instrumental rationality, and my disappointment at the lack of progress on training to help people have accurate beliefs.

During conversations with CFAR staff on this topic, it became clear to me that I cared about this primarily because of existential risk. I strongly encouraged them to develop more epistemic rationality training: while it's very hard to do, it seems like the most important thing.

In the meantime, I've been trying to figure out the best way to train myself to make judgments about existential risks, and about the best thing I can do to help mitigate them.

It turns out that it's really hard to fix something without a very specific idea of how it's broken. So I'm going to use the inadequate tools I have, on this flight, without access to experts or the ability to look up data, to build the best bad estimate I can of:

Continue reading

Why I Am Not Celebrating International Tell Your Crush Day

Some sensible people have promoted International Tell Your Crush Day. Basically, the idea is to have a day where everyone reveals their secret crushes to the people they have crushes on. Here's a description from the source itself:

ITYCD was hatched out of a simple and honest desire to see more people be open and honest about the world around them. Specifically, the people part of that world. Everyone, even curmudgeons who will claim otherwise, get crushes on people. And, yes, everyone gets crushed on at different times. Wouldn’t it be rad to be able to tell people you had sparklies for that you had sparklies for them? Wouldn’t it be super-amazing if people would do the same for you?

….for those of you keeping score at home, the correct answer is “YES!”

Some of you who don't know me well yet will have the wrong idea of why this makes me feel incredibly sad, anxious, and scared. Here are some things I am not worried about:

  • That I will be the object of an unwanted or inconvenient crush.
  • That someone I have a crush on will not reciprocate my feelings.
  • That I would face some sort of adverse social consequences for telling someone I have a crush on them.
  • That I'm not allowed to play this game because I have a girlfriend.
  • That I am allowed to play this game (for many reasons, including the ones the ITYCD website lists) but people think I'm not.
  • That I don't know how to tell people how I feel about them.

None of these things is distressing to me. Here's why ITYCD makes me want to go into a dark room and curl up into a ball and hide:

Continue reading

Not even a real links post, just a blatant ad

Scott says:

There’s a charity thing going on today where whoever gets the most unique donors of $10 or more by midnight tonight wins a bundle of useful services supposedly valued at $250,000.

MIRI, the Machine Intelligence Research Institute, is a group that works on ensuring a positive singularity and friendly AI. They were part-founded by Eliezer Yudkowsky, they’re closely affiliated with Less Wrong, and they briefly employed me. They are currently only eight donors away from the top place, with only four hours left to go (I think they’re on California time).

If you are concerned about the far future, please consider giving them a quick $10 donation at this link to tip them over the edge.

EDIT1: Someone claimed anonymous donations don’t count, so you might want to donate under your real name.

EDIT2: Or if you’re interested in eye care for poor Indian children, you can donate to the current leader, Sankara Eye Foundation. Everyone else seems too far behind to catch up.

I am not totally sure that MIRI is always the most effective use of your charitable dollars, but I am pretty sure that under these advantageous circumstances, that first $10 has a darned high expected value.
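
Here's the back-of-the-envelope version of that claim. The $250,000 figure comes from Scott's description above; the one-percent "chance my donation is pivotal" is a number I'm assuming purely for illustration:

```python
# Illustrative only: p_pivotal is an assumed figure, not anything from the post.
bundle_value = 250_000   # stated value of the prize bundle (from Scott's description)
p_pivotal = 0.01         # assumed chance a marginal $10 donor swings the contest
donation = 10

expected_gain = p_pivotal * bundle_value
print(f"Expected prize value per marginal donation: ${expected_gain:,.0f} (cost: ${donation})")
# Under these assumptions: $2,500 of expected value for a $10 donation, even
# before discounting how useful the bundle actually turns out to be.
```

Even if you think the bundle is worth a fraction of its sticker price, or that the chance of being pivotal is much smaller than one percent, the asymmetry between a $10 cost and the possible payoff is hard to erase.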

[Offer retracted at MIRI's request. If you already gave because of what I said, email me and we'll work something out.]