|October 24th, 2011|
Just as I don't think it makes sense to value the happiness of people far away less than that of people close by, I can't see why I should value the happiness of future people less. If there might be a very large number of people in the future, it's possible that the most good I could be doing would be in making sure we don't wipe ourselves out before expanding off the earth.
 I need to update my internal stat for world population from 6.5 billion.
 I don't strictly mean 'humans' here. A person whose mind had been uploaded to a computer and was running there would count. As would, I think, a fully artificial intelligence.
 Or, at the even higher end: tiny self-replicating machines of human origin could colonize the universe.