{"items": [{"author": "Topher", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666184828142", "anchor": "fb-666184828142", "service": "fb", "text": "This seems overlook the possibility that data-collection itself is ethically problematic\u2014there are some things we might legitimately not want a big corporation finding out about us. Yeah, that horse mostly left the barn a long time ago, but this can be seen as a way to covertly collect information people didn't think they were sharing.", "timestamp": "1404138925"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666185287222", "anchor": "fb-666185287222", "service": "fb", "text": "@Chris: If facebook had instead posted an article saying \"we calculated people's posts indicate 1% more happiness than this time last year\" I don't think people would say that was unethical.  You're right that collecting data could be ethically problematic, and I should have said more about that, but I don't think that's what's going on with this fb study.", "timestamp": "1404139200"}, {"author": "Ralph", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666186439912", "anchor": "fb-666186439912", "service": "fb", "text": "Two thoughts:<br>1.  A very big problem with informed consent in social experiments it that knowing you're being studied may well skew the results.<br>2.  Using the results from a study like this presumes that the algorithm, writers are correct in their assumptions about our \"best\" emotional state.  (Are we really better off when we're \"happy\"?)", "timestamp": "1404140261"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187028732", "anchor": "fb-666187028732", "service": "fb", "text": "@Ralph: Normally you deal with (1) by being very vague about what you're studying.  
The people who think facebook should have gotten consent for this study are mostly arguing that they should have put a banner up saying something like \"we're going to be running an experiment on mood, click here to opt out\".  Show this to both the experimental and control groups and you likely have no effect on people's actions.<br><br>As for (2), the study was partly to understand whether the people who say \"using fb makes you sad because you see all these happy posts from other people\" are correct.  But the algorithm they used to determine the level of positivity in the posts isn't designed for snippets of text this short and isn't very accurate.  Though as long as it's picking up something, their sample is probably large enough to make up for imprecision here.", "timestamp": "1404140760"}, {"author": "Jim", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187203382", "anchor": "fb-666187203382", "service": "fb", "text": "I agree with the overall moral framework in which randomizing between acceptable options is acceptable. That said, it is unethical to use sentiment analysis to bias someone's reading without their knowledge, full stop. It's a direct attack on their ability to build true beliefs. If the sentiment of newsfeeds is decoupled from the sentiment of the actual news, then people lose the ability to detect a very important type of change in the world. If negative articles are filtered out, it impedes their ability to act. If negative articles are favored, it threatens their mental health.<br><br>By the way, you gave an example of favoring posts that mention one political candidate over another. *China actually does that and has been doing it for years*. Furthermore, Facebook is sufficiently opaque that it could do that, and no one would be able to prove it.<br><br>I reached this post through an RSS subscription to your blog. 
I'm commenting through the Facebook post because that maximizes the number of people who will see it, and they'll be mirrored onto your blog, but I'm going to be avoiding the Facebook feed in the future.", "timestamp": "1404140861"}, {"author": "Daniel", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187338112", "anchor": "fb-666187338112", "service": "fb", "text": "i think you're saying that the purpose or reason behind treating someone a certain way is irrelevant to the ethics of the act. i don't think that would be very hard to disprove.", "timestamp": "1404140937"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187692402", "anchor": "fb-666187692402", "service": "fb", "text": "@Daniel: \"I think you're saying that the purpose or reason behind treating someone a certain way is irrelevant to the ethics of the act\"<br><br>I don't think that.  In the post, among other things, I say that fb favoring sad posts simply because they enjoy making people miserable would not be ok.", "timestamp": "1404141109"}, {"author": "Daniel", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187867052", "anchor": "fb-666187867052", "service": "fb", "text": "ok. a friend of mine (and others) made the most interesting point that such manipulations seem more acceptable for commercial purposes than for research purposes. i wonder why that is! i definitely feel that way.", "timestamp": "1404141248"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666187991802", "anchor": "fb-666187991802", "service": "fb", "text": "@Jim: \"it is unethical to use sentiment analysis to bias someone's reading without their knowledge\"<br><br>This is a strong argument, and a good reason to be upset about the study.  Though I think http://www.jefftk.com/.../like-and-the-suppression-of... 
was already having this effect, and without the knowledge of most readers.<br><br>\"mirrored onto your blog\"<br><br>If you want to avoid fb more, you can also get comments onto my blog by posting on g+ or sending me an email asking me to post them for you.", "timestamp": "1404141327"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666188321142", "anchor": "fb-666188321142", "service": "fb", "text": "@Daniel: \"such manipulations seem more acceptable for commercial purposes than for research purposes\"<br><br>Say fb were to run an experiment like this because they were trying to figure out if they should apply sentiment analysis to posts, suppressing sad ones in order to make people use facebook more.  I don't think the people upset about this study would be much happier knowing it was done for purely commercial purposes.", "timestamp": "1404141502"}, {"author": "Daniel", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666188391002", "anchor": "fb-666188391002", "service": "fb", "text": "i'm not sure. also, who says they're not?", "timestamp": "1404141546"}, {"author": "Jim", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666190726322", "anchor": "fb-666190726322", "service": "fb", "text": "@Jeff: \"Though I think http://www.jefftk.com/.../like-and-the-suppression-of... was already having this effect, and without the knowledge of most readers.\"<br><br>I think this is different. The concept of upvoting is simple, common, and shared among many sites, so if someone doesn't know about it, it's their own problem. 
Likes/upvotes are delegating control of the feed to people that users are more inclined to trust, and their bias is mainly on the interestingness axis, rather than the sentiment axis.<br><br>\"If you want to avoid fb more, you can also get comments onto my blog by posting on g+ or sending me an email asking me to post them for you.\"<br><br>The issue is that I strongly suspect comments posted off-Facebook (i.e., through G+) won't be seen by the majority of people reading the post, because they're reading comments... in their Facebook feeds.", "timestamp": "1404142485"}, {"author": "Alan", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666215446782", "anchor": "fb-666215446782", "service": "fb", "text": "I believe predictable, plausible (side) effects of the experimentation are also important, not just motivations. In this case, the threat to mental health. But in the sandwich example, if the company were to use a known toxin at sufficiently high levels to have negative effects, this would be clearly immoral even if their intention was simply to figure out which sandwich people liked more.", "timestamp": "1404153977"}, {"author": "Kelly", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666215631412", "anchor": "fb-666215631412", "service": "fb", "text": "Jim, you seem to be saying that facebook's feed promotion algorithm causes you to not have true beliefs about the world. it is already the case that facebook does not show you every post that is available to you. what Jeff is trying to say is that if Facebook tweaked their engine to show *everyone* more sad posts because maybe they found out that people who see more sad posts feel less left out, and then are more likely to use Facebook, this would not be unethical. and if Facebook did not modify their engine, this would not be unethical. both options are not unethical; therefore, what Facebook did is not unethical. 
<br><br> if you are using the selection of posts in your feed to try and build true beliefs about the world based on some frequencies of posts, without knowing how their engine works, then you are reasoning from unrepresentative samples.", "timestamp": "1404154096"}, {"author": "Alan", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666218106452", "anchor": "fb-666218106452", "service": "fb", "text": "@Daniel it's definitely not my personal feelings that commercial purposes are inherently more acceptable though it is my impression that this is true for many people (or at least people I interact most with, I wouldn't be surprised if this is less true in countries other than the US). In particular people seem to think of things like \"make the product get used more\" as synonymous with \"make the product better from the point of view of the user\"... a pretty glaring example of a situation where this wouldn't be the case would be figuring out how to increase the severity of withdrawal symptoms for nicotine products.", "timestamp": "1404155342"}, {"author": "Jos\u00e9", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666222792062", "anchor": "fb-666222792062", "service": "fb", "text": "Jeff, did you read this? :)  http://psychcentral.com/.../emotional-contagion-on.../", "timestamp": "1404157640"}, {"author": "David", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666228670282", "anchor": "fb-666228670282", "service": "fb", "text": "Jeff - read Cicero's \"On Duties\" (Latin: De Officiis) for a lengthy discussion of the idea you postulated.", "timestamp": "1404159024"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666234039522", "anchor": "fb-666234039522", "service": "fb", "text": "@Jos\u00e9: That article makes good points, but it's way too negative and snarky.  
Yes, the measure of sentiment they used is designed for larger works, but I'm not aware of one designed for shorter posts and they didn't reference one. Also see the second half of my response to Ralph above.<br><br>As for the size of the effect, the change they made was also small. It's still a small effect, and without such a large sample they wouldn't have been able to pick it up, but it's not irrelevantly small.", "timestamp": "1404161650"}, {"author": "Mad", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666271519412", "anchor": "fb-666271519412", "service": "fb", "text": "I think it's important to separate the ethics of the study itself from concerns about the process they followed in performing it. Michelle Meyer has noted that studies like this one are worth having, but she has also stated that this would be considered \"human subjects research\" according to the Common Rule.<br><br>Facebook doesn't have to follow the Common Rule, so the process they followed was to pay no mind to academic standards for conduct of human subjects research, justifying it as \"internal review\" and citing the ToS as \"informed consent\". (Basically everyone familiar with human subjects research finds the ToS = \"informed consent\" claim to be ridiculous.) They also told the Cornell IRB that this was \"pre-existing data\" (generated in response to their \"advice\"!), thus getting that IRB to dismiss their involvement as \"not human subjects research\" (because Facebook was doing the \"human subjects research\"). They also cited that IRB determination of exemption for the publisher (PNAS). Some have characterized this approach as \"IRB laundering\" (i.e. 
suggest a study, then help analyze data = not getting your hands dirty with \"human subjects research\" and thereby circumventing the IRB process).<br><br>I don't consider informed consent to have been necessary (this could have been performed as \"deceptive research\", but there are rules for how to conduct that as well!) BUT I think the cavalier ignorance of their approach and justifications should concern us a lot. Even if we conclude the outcome was okay in this case, and even if we think it's possible to ensure ethical research in the absence of an IRB, they clearly don't understand the existing processes for assuring research is conducted in an ethical manner.", "timestamp": "1404177122"}, {"author": "Christian", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666325411412", "anchor": "fb-666325411412", "service": "fb", "text": "Facebook is almost certainly running studies that see whether reducing sad stories will increase user engagement with facebook, and the critics of this paper aren't expressing concern about those.<br><br>Modern websites like Facebook run a lot of A/B split tests on various algorithms. The idea of not participating in experiments that do A/B split tests while being on facebook shows a misunderstanding of Facebook's nature.<br><br>The thing that's special about this episode is that the results of the experiment get publicly released as an academic paper and are open to public scrutiny. Facebook gets punished for sharing their research with the public. The result of this criticism won't be fewer experiments, but less openness. Facebook will put less research in the public domain and keep more research results in house.", "timestamp": "1404218181"}, {"author": "Sam", "source_link": "https://plus.google.com/107810223191541179750", "anchor": "gp-1404244136198", "service": "gp", "text": "Does your statement have context?   
I'm sure I could think of some contexts where I'd disagree, but it might not matter", "timestamp": 1404244136}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1404304737737", "service": "gp", "text": "@Sam\n\u00a0See the linked post:\u00a0\nhttp://www.jefftk.com/p/ethics-of-experimentation", "timestamp": 1404304737}, {"author": "Sam", "source_link": "https://plus.google.com/107810223191541179750", "anchor": "gp-1404338056064", "service": "gp", "text": "I think I do disagree here.\n<br>\nI'm assuming that A is showing users positive posts and B is showing users negative posts.\n<br>\nThe issue is that users expected FB to show them both positive and negative posts roughly in proportion to the positiveness of posts their friends made. \u00a0The expectation was that how positive/negative the posts on my feed are is some indication of how positive/negative the posts of all of my friends are. \u00a0FB was of course under no obligation to fulfill this expectation but by deviating from this expectation they manipulated their users in a way that is IMO unethical.\n<br>\n<br>\nI'd like to use an unrelated example though to try to debunk your initial claim. \u00a0Let's assume that A is giving a child access to food and B is giving a child access to water. \u00a0Doing both A and B is ethical, but doing only one of them at random is not. \u00a0The randomness doesn't help anything.\n<br>\n<br>\nWhat FB did was nowhere near as severe as my example of course and it should go without saying that I am significantly less bothered by this than I am by starving children. \u00a0I just don't buy your argument for it being ethical.", "timestamp": 1404338056}, {"author": "Josh", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666555485342", "anchor": "fb-666555485342", "service": "fb", "text": "I would love it if I got everything my friends posted. 
I set up e-mail specifically so that I do, because Facebook's web UI can't be trusted. I guess I don't know if I can trust their e-mail notification interface, because how the fuck would I even know.<br><br>I really, really hate that Facebook has become a way for people to communicate about important life events, because it is explicitly and intentionally and systematically interfering with your ability to communicate with your friends, for their own benefit and to your obvious detriment.<br><br>All that said, of course this is ethical; you agreed to the terms of service.", "timestamp": "1404354193"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666580605002", "anchor": "fb-666580605002", "service": "fb", "text": "@Josh: I think for most people fb is an improvement. Filtering by 'like's lets people keep up with many more people than email does, once you consider that most people have much lower email tolerance than we do.", "timestamp": "1404374831"}, {"author": "Josh", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666630100812", "anchor": "fb-666630100812", "service": "fb", "text": "Sure, but there's no reason they couldn't let you configure how the web UI looks. If their UI worked like LJ/DW, where every N posts show up on an actual different page, and you can filter whose posts do and don't show up however you'd like, I'd have no issue with their web site. And if they want to have a \"let us help you find the most interesting posts\" option, and even make it the default, that's great, as long as there's another option.<br><br>They aren't interested in that; they want you to accept that it's good for them to be filtering their posts for you, so you won't complain when they do it in ways that turn out not to be so good for you. 
\"Oh yeah, I guess they sort of blew it that time, but this feature is still totally worth it to me!\"", "timestamp": "1404414491"}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1404467665427", "service": "gp", "text": "@Sam\n\u00a0Sorry, I'm not claiming that if it is ethical to do both A and B together than it is ethical to do them both on their own. \u00a0I agree that's not true, and your food-and-water example shows that. \u00a0I'm claiming that if it's ethical to choose either A or B and just do that, then it is also ethical to randomize your choice doing either A or B on a per-person basis.", "timestamp": 1404467665}, {"author": "Christian", "source_link": "https://www.facebook.com/jefftk/posts/666168964932?comment_id=666744257042", "anchor": "fb-666744257042", "service": "fb", "text": "Josh There a reason why facebook outperform networks like diaspora. Facebook is highly optimised to be engaging for users. It wents through lots and lots of A/B testing to get to the stage it's at. <br><br>Google famously A/B tested the shades of the color blue in their logo. 
Do you want to have a query that lets you specify the shade of blue that the logo has when you use Google?", "timestamp": "1404495860"}, {"author": "Sam", "source_link": "https://plus.google.com/107810223191541179750", "anchor": "gp-1404705143628", "service": "gp", "text": "I agree, but I don't think that adequately describes the circumstances of the recent Facebook experiment because of the expectations involved.", "timestamp": 1404705143}, {"author": "Jeff&nbsp;Kaufman", "source_link": "https://plus.google.com/103013777355236494008", "anchor": "gp-1404757755034", "service": "gp", "text": "@Sam\n\u00a0I'm saying it's fine to be angry at facebook for intentionally manipulating emotions, but (via this argument) not for running an experiment.", "timestamp": 1404757755}, {"author": "Sam", "source_link": "https://plus.google.com/107810223191541179750", "anchor": "gp-1404950869852", "service": "gp", "text": "I missed the part of your argument where you explain how participants needn't consent to being the subjects of a scientific study.", "timestamp": 1404950869}]}