{"items": [{"author": "Amelia", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341039889297935", "anchor": "fb-341039889297935", "service": "fb", "text": "Perhaps it depends on context and what it is you want to say?", "timestamp": "1339076113"}, {"author": "Jesse", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341047992630458", "anchor": "fb-341047992630458", "service": "fb", "text": "Hard to quantify is not the same thing as failing to capture.  All those events had probabilities - we just have no simple way of measuring them.  And in any complex model there should be chances of unknown/unpredictable events occurring.  <br><br>From there, expected value is the expected benefit from occurrence times the probability it will come to pass... now you get to deal with public goods and social welfare being hard to quantify... but again not impossible.  EV also changes dramatically on time of observation.<br><br>Change takes hold when the expected value of a positive outcome reaches a noticeably non-zero range.  At this point people who had held off getting involved because they saw it as impossible decide \"that would be worth taking a risk\".  That is called a groundswell.", "timestamp": "1339077418"}, {"author": "Gabe", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341068479295076", "anchor": "fb-341068479295076", "service": "fb", "text": "I was expecting something much less coherent.  In layman's terms he's saying \"Big changes are hard to predict. And I'm smarter than you.\"", "timestamp": "1339080338"}, {"author": "Sasha", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341090992626158", "anchor": "fb-341090992626158", "service": "fb", "text": "I think the short answer is 'you don't'. There's always going to be someone more receptive to your ideas. <br><br>Our (well, my) natural instinct when faced with naked ideology is to try to argue against it (to resolve my cognitive dissonance? Not honestly sure why), so I really have to force myself to find something more sensible to do with my time, like finding more receptive people. <br><br>It will be interesting to see whether the OLR piece yields any new members; presumably the average reader is more likely to be persuaded than the average debater (the age-old justification for fruitless internet debate: 'I might persuade observers'), but I don't know of any evidence for the strength of this effect.", "timestamp": "1339083228"}, {"author": "Sasha", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341108202624437", "anchor": "fb-341108202624437", "service": "fb", "text": "A further reason to avoid it is the tendency of ideologues to make grandiose sounding semifactual claims that might be provably false or just ill-supported by evidence (where the grander they are, the less plausible they become). <br><br>Sometimes these might not even relate to the real logic of the argument, but if you don't address them, given that they raised the point, they/their audience are likely to feel you've weaseled out. <br><br>But if you want to address them properly you need to put in a *lot* of research, often not only on the best references on the subject but on their favourite ones. <br><br>So now the opportunity cost of successfully persuading someone gets even higher...", "timestamp": "1339085491"}, {"author": "Sasha", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341110245957566", "anchor": "fb-341110245957566", "service": "fb", "text": "Mat Nazarian, because I wrote this with the above debate in mind, but I just realised it characterises the exchange we had on libertarianism pretty well/my reasons for not wanting to continue it - as a utilitarian I think fighting against political ideologies is a poor use of time, but fighting *for* them seems even worse.", "timestamp": "1339085737"}, {"author": "Chris", "source_link": "https://www.facebook.com/jefftk/posts/341037182631539?comment_id=341135732621684", "anchor": "fb-341135732621684", "service": "fb", "text": "The thing I see as missing from that discussion is the fact that utility has to be based on current knowledge and that as knowledge changes, understood utility changes and that's okay.  And also that there's a cost to gaining knowledge, so you have to decide whether to gain more knowledge based on your current knowledge of how likely it is that gaining the new knowledge will change your beliefs.<br><br>On the other hand, I've been leaning toward a belief that we all have built-in beliefs that are more emotional than we want to believe, and that we tend to come up with rational explanations to justify our beliefs more than we use rationality to affect our beliefs.  I will say though that Jeff's utilitarian giving system is one of the better counterarguments I've seen to this idea.", "timestamp": "1339089182"}]}