
Facebook emotion manipulation study: an explanation

72 points by mrmaddog almost 11 years ago

18 comments

yazinsai almost 11 years ago

This comment on the piece, by Sean Tucker, aptly sums up my response:

Adam, I don't know you -- I came here from the Buzzfeed article criticizing the ethics of the study that linked to this post, but it appears we do have a friend in common.

I just have to ask: you honestly had a hypothesis that amounted to 'perhaps we can make people more depressed,' and decided to test it on a group that hadn't consented to the experiment, with no way to track its impact on their actual lives, only on the language they used in their Facebook posts? And you ask us to trust that this passed an internal review, so it's ethical?

Please take a moment to step back and consider that. That appears to have been the train of thought that led to this.

That's appalling. Completely appalling. The Atlantic piece is right -- there's absolutely no way this passes APA deceptive-research standards.

Beyond that, you'll never know what impact this actually had on depressed people. You can only measure what they posted to Facebook, which isn't a particularly meaningful or realistic indicator of their emotional state.

If this passed an internal review board, that's only proof that Facebook's internal review standards aren't what they need to be.

You're in a position of extraordinary power, with access to more subjects than any other field study in history, a larger population than most nations, and subject only to how you review yourselves. You could deceive yourself into believing you have informed consent because everyone clicked 'accept' on the Terms of Service years ago, but there's no way even you think that's a meaningful standard.

I trust you're a reasonable person who doesn't set out to cross ethical boundaries. But on this one, I think Facebook needs to admit it did, and make some changes. This study was unethical by any reasonable standard. There's nothing wrong with admitting that and figuring out a way to do better.

There's a lot wrong with going ahead with anything like this, ever again.
r0h1n almost 11 years ago

Wow, what an amazingly tone-deaf post. It demeans the Facebook users who were offended by the company's actions as simpletons who did not really understand what the study was about. Sample this:

> *Nobody's posts were "hidden," they just didn't show up on some loads of Feed.*

How is hiding any different from not showing up?

> *And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it.*

Not what your own study claimed.

> *I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused.*

We are not sorry about the research, only for "the way the paper described it".

> *In hindsight, the research benefits of the paper may not have justified all of this anxiety.*

In hindsight, our users are hyperventilating frogs. They should learn how to relax in the nice warm(ing) Facebook waters.
gemma almost 11 years ago

*I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.*

Ignoring the classic "I'm sorry you were freaked out" non-apology here, this response completely misses the point, and tries to re-frame informed consent as an inconsequential piece of red tape that anxious people worry about.

People were made subjects of a scientific experiment without their knowledge or consent. It doesn't matter that Facebook took steps to make the changes as small as possible; it doesn't matter that the effect on the individual subjects may have been minor; it doesn't matter that the results were interesting. It was an ethical breach, and this tone-deaf response is fairly unsettling.
OrwellianChild almost 11 years ago

Am I alone in being completely unsurprised by, and accepting of, Facebook doing research like this on its users? The News Feed as an interface for using FB is, and has always been, an evolving product under FB's control. They've been constantly messing with it, A/B testing different combinations of user/advertiser content and post/comment structure. They do this to test outcomes and achieve goals that most likely include (and are certainly not limited to):

    - Optimizing ad targeting
    - Maximizing click-through
    - Maximizing engagement
    - Minimizing abandonment/bounce rates

The News Feed already doesn't show you *all* your friends' posts, and hasn't for quite some time. How they choose to "curate" what they do show is going to be dictated by their incentives/needs.

Getting outraged about any of this seems akin to getting pissed that the new season of your favorite TV show sucked...

Edited for formatting...
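The "constantly messing with it" part is mechanically simple. One common way to run feed experiments like those above is to assign each user to an arm by hashing their ID, so assignment is stable across sessions without any stored state. A minimal sketch (the function and experiment names here are illustrative, not anything Facebook actually uses):

```python
import hashlib

def assign_arm(user_id: str, experiment: str, n_arms: int = 2) -> int:
    """Deterministically map a user to one arm of an experiment.

    Hashing the experiment name together with the user ID keeps each
    user's assignment stable across sessions, and makes assignments
    independent between different experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_arms

# The same user always lands in the same arm of a given experiment.
arm = assign_arm("user42", "feed_ranking_test")
```

Nothing about this mechanism distinguishes "optimizing click-through" from "testing emotional contagion"; the ethical question lives entirely in what the arms do, not in how users are bucketed.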
gammarator almost 11 years ago

When addressing criticism, it's not great to start with "Ok so." (Or "Calm Down. Breathe. We Hear You" [1], to dig farther back in time.)

[1] http://www.facebook.com/notes/facebook/calm-down-breathe-we-hear-you/2208197130
potatolicious almost 11 years ago

I'm going to try to avoid the minefield that is this research-vs.-ethics debate.

One thought I've had is that the blowback against this incident is less about the research itself and how ethical it is, and more about the perception of Facebook in general. My suspicion is that a lot of the opposition at this point comes from long-simmering distrust of Facebook and the increasingly negative perception of its brand; this incident is merely the straw that broke the camel's back, for some.

And if the popular response to this revelation reflects people's general views on Facebook, it's not good for the company.
nness almost 11 years ago

Not a single point in that response directly addresses why there was no informed consent in the study. There are reasons research goes through ethical review before it is conducted, and this is certainly going to be a tough lesson for Facebook.
brainsareneat almost 11 years ago

I wonder where the line is between A/B testing and 'psychological experimentation', and when it was crossed here. Was it crossed just because the results were published in PNAS? The outraged don't seem to think so.

What if I'm Amazon or Yelp, and I want to choose review snippets? Is looking for emotionally charged ones and testing to see how they impact users wrong?

What if it's more direct psychological manipulation? What if I run a productivity app, and I want to see how giving people encouraging tips, like 'Try starting the day by doing one thing really well,' impacts their item-completion rate? I'm doing psychological experimentation. I'm not getting my users' permission. But I am helping them. And it's a valid question: maybe these helpful tips actually end up hurting users. I should test this behavior, not just implement it wholesale.

It seems like Facebook had a valid question, and they didn't know the answer. Did they go wrong when they published it in PNAS? Or was it wrong to implement the algorithm in the first place? I don't think it was.
natural219 almost 11 years ago

For posterity's sake, I want to clarify what is going on here.

What's been happening over the last five years is that American society has become more trigger-happy in deducing "accurate" moral conclusions from following online media outlets.

I outlined some of this in cjohnson.io/2014/context, although I didn't appreciate the full power of this conclusion at the time, so the essay mostly falls short of explaining the entirety of what's happening currently.

In a nutshell: the Web has broken down barriers between contexts that used to live in harmony, ignorant of each other. Now, as the incompatibility of these contexts comes into full focus, society has no choice but to accept the fluidity of context in the information age, or tear itself apart at the seams.

All that was needed to precipitate the decline of Facebook (oh yes, Facebook is going down; short now while supplies last) was some combination of words and contexts that fully elucidates the power of online advertising / data aggregation to have real impact on people's lives. Put in terms that the "average person" can understand, the impact of this story will be devastating. I feel so bad for the Facebook PR team -- they're simply out of their league here.

The reason *this* scandal will be the one we read about in the history books is that it provides the link between two separate but very powerful contexts: 1) the context of Nazi-esque social experimentation, and 2) the run-of-the-mill SaaS-style marketing that has come to characterize pretty much every large startup in the valley.

We've reached a point where nobody knows what is going to happen next. Best of luck, people.
smokeyj almost 11 years ago

> The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product.

That... or getting users to click more ads.

Maybe ad inventory can be optimized for people in a certain emotional state, and the user's wall can be used to induce the most profitable state of mind for FB's available ad inventory. That would be awesome, in an evil-villain kind of way.
zaroth almost 11 years ago

It seems that Facebook did something like an A/B test of their ranking algorithm, and then did a cohort study to see if there was any longer-term impact.

If what they did requires informed consent, what about when the end goal is maximizing click-through rate, e.g. by inciting an emotional response?

Let's say FB finds that certain types of posts in the feed cause a nearby ad to be clicked on more. They determine this through pervasive testing of everything that you see and do on their site. They could then adjust their algorithm to account for this behavior and increase clicks/profit.

I think the actions FB takes to monetize the user base are not only far more intrusive; they are actively searching for and exploiting these effects for profit. If informed consent for TFA is not ridiculous, then I think we have much bigger problems on our hands. What am I missing about the informed consent issue?
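The "minimal amount to statistically detect it" angle mentioned upthread is worth making concrete: at Facebook's scale, a vanishingly small effect clears statistical significance. A sketch of the standard two-proportion z-test on made-up click-through numbers:

```python
from math import sqrt

def two_proportion_z(clicks_a: int, n_a: int, clicks_b: int, n_b: int) -> float:
    """z statistic for the difference in click-through rate between two arms."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# A 0.05-percentage-point CTR lift, with a million users per arm, already
# clears the conventional |z| > 1.96 significance threshold (z ~ 3.5).
z = two_proportion_z(10_500, 1_000_000, 10_000, 1_000_000)
```

Which is why "statistically detectable" says almost nothing about whether an effect was meaningful to any individual user.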
gouggoug almost 11 years ago

For those who, like me, have been living under a rock, here's the study: http://www.pnas.org/content/early/2014/05/29/1320040111
xenadu02 almost 11 years ago

Simple fix:

"We're conducting a scientific study on mood for the next three months. Would you mind participating? The study won't require you to do anything special or take any further action. Y/N?"

There we go, kids, easy as pie; it's called consent to participate! Pretty easy to do without spilling the beans on the study.
akerl_ almost 11 years ago

Based on the other comments, it appears that the difference between kosher A/B testing and unethical experimentation with dire mental-health consequences is either "did you publish a paper when you were done" or "are you a big, noteworthy company".
cwal37 almost 11 years ago

Universities and think tanks have institutional review boards for a reason. The fact that Facebook can make use of its users' data in other (business-related) avenues does not excuse it from the most basic research ethics.

It did pass muster with Cornell's review board, but only after the data had actually been collected, which amounts to ex post facto approval.
datakid almost 11 years ago

There's not really any excuse you can give at all, FB. I'm astounded that you didn't apologise without reservation and implement an end-user agreement change specifying you wouldn't do it again.

Messing with people's mental health is outrageous.
onewaystreet almost 11 years ago

People calling this unethical have to explain why it is, but A/B testing is not.
alx almost 11 years ago

Does anyone remember the name of the NSA program concerning psychological manipulation on social networks?