{"items": [{"author": "Patrick", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=342872269120913", "anchor": "fb-342872269120913", "service": "fb", "text": "This postulates more of a mind/body duality than I accept. I think sensation is fundamental to being human -- which is not to say that a person who loses sensation would cease to be human, or that a partial loss of sensation amounts to a partial loss of humanity.<br>Still, the flow of information from the body shapes consciousness. And the expectation of death -- not just the risk of it at a specific point, but the general inevitability of it -- is also fundamental.", "timestamp": "1341856832"}, {"author": "David&nbsp;Chudzicki", "source_link": "https://plus.google.com/106120852580068301475", "anchor": "gp-1341857597395", "service": "gp", "text": "Another example pointing to preferences as more intuitively valuable than utility [edit: poor word choice, see Jeff below]?", "timestamp": 1341857597}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1341858459564", "service": "gp", "text": "@David&nbsp;Chudzicki\n\u00a0I don't see what about this suggests valuing preferences over happiness. \u00a0Explain?\n<br>\n<br>\n(Assuming you mean \"happiness\" instead of \"utility\". \u00a0Generally whatever you value gets called \"utility\", and could be happiness, preference satisfaction, or something more complex.)", "timestamp": 1341858459}, {"author": "David&nbsp;Chudzicki", "source_link": "https://plus.google.com/106120852580068301475", "anchor": "gp-1341858941129", "service": "gp", "text": "Re utility yeah, oops. Didn't mean utility. 
Maybe happiness, though if we're starting to generalize away from \"people\" to arbitrary computational processes that may be \"valuable\" in some way, 'happiness' starts to sound odd.\n<br>\n<br>\nRe preferences-- it seems relevant that that particular process (the uploaded secluded person) has preferences for remaining alive. Destroying it and replacing it with other new ones (even if they're happier and/or there are more of them) doesn't seem good.", "timestamp": 1341858941}, {"author": "Allison", "source_link": "https://plus.google.com/103741579182942078941", "anchor": "gp-1341858970362", "service": "gp", "text": "I could easily write a chatterbot that essentially says two things: \"leave me alone\" and \"don't shut me down.\" \u00a0If you try and cull an EM down, you might end up with something equivalent in value to that chatterbot. \u00a0How do you even begin to determine value? \u00a0\n<br>\n<br>\nRelatedly, if I were given the choice between reliving the happiest points in my life in emulation, or death, I'd pick death, because living on repeat isn't living, it's remembering.", "timestamp": 1341858970}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1341864379840", "service": "gp", "text": "@Allison\n\u00a0\"living on repeat isn't living, it's remembering\"\n<br>\n<br>\nTo be fair, though, remembering can be pleasant too.", "timestamp": 1341864379}, {"author": "Allison", "source_link": "https://plus.google.com/103741579182942078941", "anchor": "gp-1341866456075", "service": "gp", "text": "@Jeff&nbsp;Kaufman\n\u00a0I agree, remembering can be pleasant. \u00a0However, I'd rather make new memories most of the time; growth and change is important to me. \u00a0", "timestamp": 1341866456}, {"author": "David&nbsp;Chudzicki", "source_link": "https://plus.google.com/106120852580068301475", "anchor": "gp-1341867903311", "service": "gp", "text": "@Jeff&nbsp;Kaufman\n\u00a0(\n@Allison\n) exactly! 
preferences", "timestamp": 1341867903}, {"author": "Andrew", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=342987559109384", "anchor": "fb-342987559109384", "service": "fb", "text": "So, Jeff, let's say I package your happy fulfilled person-vm in a fork bomb:<br><br>int main()<br>{<br>  &lt;instantiate vperson image&gt;<br>  while(1) fork();  <br>}<br><br>So I kick off this thingy and in a few ticks I have an enormous glut of emulated people.  Do they have moral weight?", "timestamp": "1341878289"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=342991622442311", "anchor": "fb-342991622442311", "service": "fb", "text": "@Andrew: your system has some capacity.  Assume it's set up so that all the person images equally timeshare, so one person gets full capacity while N people get ~1/N capacity.  I would be inclined to say that the whole system, whether running one person at speed X or N people each at speed X/N, carries about the same moral weight.", "timestamp": "1341879465"}, {"author": "Andrew", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=342992152442258", "anchor": "fb-342992152442258", "service": "fb", "text": "Jeff, we live in a world, a system that has some capacity, and there are reasonable concerns about wasting it - earth day, ecology, global warming, zero population growth, all that.  Does our world have a constant global moral weight that is divided by the number of its inhabitants?", "timestamp": "1341879664"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=343003342441139", "anchor": "fb-343003342441139", "service": "fb", "text": "@Andrew: the computer example is relatively simple, the real world isn't.  
Especially important is that our emulated people weren't interacting; once people interact the total number and whether they are identical starts mattering, probably a lot.  But yes: at some point we have enough people on the earth that adding another person harms other people more than it benefits the person being added.", "timestamp": "1341883260"}, {"author": "Chris", "source_link": "https://plus.google.com/117346402173047680184", "anchor": "gp-1341940961252", "service": "gp", "text": "I find it interesting that you consider copying a human's personhood into a computer.\u00a0 It seems much more likely to me that we'll get artificial people that aren't copies of humans much sooner and I don't see any reason why they would have any less value than a copy of a human.", "timestamp": 1341940961}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1341946222384", "service": "gp", "text": "@Chris\n\u00a0Reasoning about a person copied onto a computer is much easier, and I would expect any conclusions to cross-apply to non-human-derived artificial people.", "timestamp": 1341946222}, {"author": "Jesper", "source_link": "https://www.facebook.com/jefftk/posts/342865409121599?comment_id=343311045743702", "anchor": "fb-343311045743702", "service": "fb", "text": "With space expansion (almost) unlimited amounts of people could be created without harming any existing people.", "timestamp": "1341953154"}, {"author": "Chris", "source_link": "https://plus.google.com/117346402173047680184", "anchor": "gp-1341955147344", "service": "gp", "text": "I think if your arguments depend on the fact that the artificial person is a copy of a human then they won't cross-apply by default.\u00a0 I guess I could see it.\u00a0 You need to argue two things.\u00a0 Firstly that someone made of silicon is not fundamentally different from someone made of carbon and then that someone who thinks differently than you.\n<br>\n<br>\nHowever, whether 
it makes the reasoning easier or not, I think that artificial people who aren't copies of humans will come earlier.\n<br>\n<br>\nI'm also tempted to point out that you said a copy of a person and not of a human.\u00a0 I'm not sure if this reveals a bias in our language that person = human or whether it was intentional since once they're copied, they won't quite be human any more.", "timestamp": 1341955147}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1342785629594", "service": "gp", "text": "@Lucas\n\u00a0My computer example included that you had some em that wanted to be cut off from the rest of the world and disable input and output. \u00a0Once you have this, you can run as many copies as you want at whatever speeds you want and I think simple\u00a0arithmetic\u00a0applies to their morality (2x speed means twice as good, half as many is half as good.) \u00a0Not having any communication is key for this simplification.", "timestamp": 1342785629}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1342873792306", "service": "gp", "text": "@Lucas\n\u00a0Right. \u00a0They might not be happy, but 2x speed means whatever they are it matters twice as much.", "timestamp": 1342873792}]}