TechEcho
Facebook's unethical experiment manipulated users' emotions

130 points by sculpture · almost 11 years ago

24 comments

mabbo · almost 11 years ago
155,000 users for each treatment of the experiment, says the paper. Let's presume random selection, and that the occurrence rate of mental disorders is the same for Facebook users as the general public (probably not too far off). Then Facebook intentionally downgraded the emotional state of:

10,000 sufferers of Major Depressive Disorder

5,000 people with PTSD

4,000 bipolar disorder sufferers

1,700 schizophrenics

and a plethora of others with various mental disorders.

11/100,000 people commit suicide each year in America. How many were part of that treatment of the experiment, without consent or knowledge?

As a scientist, I'm fascinated by the research. As a human being, I'm horrified it was ever done.

http://www.nimh.nih.gov/health/publications/the-numbers-count-mental-disorders-in-america/index.shtml
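The counts in the comment above are simple prevalence arithmetic. A quick sketch reproduces them; the rates used here are assumptions, my reading of approximate US 12-month prevalence figures from the linked NIMH page, not numbers taken from the Facebook paper itself:

```python
# Rough check of the estimates above: apply assumed NIMH-style prevalence
# rates to one 155,000-user treatment arm of the experiment.
ARM_SIZE = 155_000

prevalence = {
    "major depressive disorder": 0.067,  # ~6.7% of US adults (assumed)
    "PTSD": 0.035,
    "bipolar disorder": 0.026,
    "schizophrenia": 0.011,
}

for disorder, rate in prevalence.items():
    print(f"{disorder}: ~{round(ARM_SIZE * rate):,}")

# The comment's suicide figure: 11 per 100,000 per year implies roughly
# 17 expected suicides per year among 155,000 randomly selected people.
print(f"expected suicides/year: ~{ARM_SIZE * 11 / 100_000:.0f}")
```

These land within rounding distance of the comment's figures (10,385 vs. 10,000; 4,030 vs. 4,000; and so on), which suggests the author applied 12-month prevalence rates directly to the arm size.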
ajays · almost 11 years ago
I'm puzzled about the outrage.

FB _already_ filters out updates based on some blackbox algorithm. So they tweaked the parameters of that algorithm to filter out the "happier" updates, and observed what happens. How is this unethical? The updates were posted by the users' friends! FB didn't manufacture the news items; they were always there.

I detest FB as much as the next guy, but this is ridiculous.
TeMPOraL · almost 11 years ago
I strongly hope that they won't care about any of this "outrage" and continue to do more and more experiments. Maybe even open it up as a platform for scientists to conduct studies.

Facebook is in the unique position of possessing data that can be orders of magnitude more useful for social studies than surveys of randomly picked college students who happened to pass through your hallway. There's a lot of good to be made from it.

But the bigger issue I see here is why it's unethical to "manipulate user emotions" for research, when every salesman, every ad agency, every news portal and every politician does this to a much bigger extent and it's considered fair? It doesn't make much sense to me (OTOH I have this attitude, constantly backed by experience, that everything a salesman says is a malicious lie until proven otherwise).
espeed · almost 11 years ago
*Facebook's Unethical Experiment: It intentionally manipulated users' emotions without their knowledge.*

I'm not defending Facebook or the experiment, but if you're going to call them out for "manipulating users' emotions without their knowledge", then you need to call out every advertising, marketing, and PR firm on the planet, along with every political talkshow, campaign, sales letter, and half-time speech...
kyro · almost 11 years ago
To say this response is unnecessary and unfounded is disingenuous. Marc Andreessen (whom I respect) tweeted *"Run a web site, measure anything, make any changes based on measurements? Congratulations, you're running a psychology experiment!"* I could not disagree more.

This isn't simply a matter of looking at metrics and making changes to increase conversion rates. The problem is that the whole of users have come to expect Facebook to be a place where they can see any and all of their friends' updates. When I look at an ad, I know I am being manipulated. I know I'm being sold something. There is no such expectation of manipulative intent of Facebook, or that they're curating your social feed beyond "most recent" and "most popular", which seemingly have little to do with post content and are filters they let you toggle.

What FB has done is misrepresent people and the lives they've chosen to portray, having a hand in shaping their online image. I want to see the good *and* the bad that my friends post. I want to know that whatever my mom or brother or friend posts, I'll be able to see. Someone's having a bad day? I want to see it and support that person. That's what's so great about social media, that whatever I post can reach everyone in my circle, the way I posted it, unedited, unfiltered.

To me this is a disagreement between what people perceive FB to be and how FB views itself. What if Twitter started filtering out tweets that were negative or critical of others?
masnick · almost 11 years ago
I just wrote a blog post about the ethical/professional obligations of the researchers associated with this study: http://www.maxmasnick.com/2014/06/28/facebook/

When you publish a paper, you are supposed to state in the body of the manuscript whether it's been approved by an IRB and what their ruling was. I'm surprised it was published without this, even though it apparently was?

It's also appropriate to address ethical issues head-on in a paper about a study that may be controversial from an ethical perspective.

If it really was approved by an IRB, then the researchers are ethically in the clear but totally botched the PR.

If not, then I think the study was not ethical.
mryall · almost 11 years ago
The difference between this experiment and advertising or A/B testing is _intent_. With A/B testing and advertising, the publisher is attempting to sway user behaviour toward purchasing or some other goal which is usually obvious to the user.

With this experiment, Facebook are modifying the news feeds of their users specifically to affect their emotions, and then measuring the impact of that emotional change. The intention is to modify the feelings of users on the system, some negatively, some positively.

Intentionally messing with human moods like this purely for experimentation is the reason why ethics committees exist at research organisations, and why informed consent is required from participants in experiments.

Informed consent in this case could have involved popping up a dialog to all users who were to be involved in the experiment, informing them that the presentation of information in Facebook would be changed in a way that might affect their emotions or mood. That is what you would expect of doctors and researchers when dealing with substances or activities that could adversely affect people's moods. We should expect no less from pervasive social networks like Facebook.
azakai · almost 11 years ago
Oh, please.

Every single time Facebook changes anything on their site it "manipulates users' emotions". Show more content from their friends? Show less? Show more from some friends? Show one type of content more, another less? Change the font? Enlarge/shrink thumbnail images? All these things affect users on all levels, including emotionally, and Facebook makes such changes every day.

Talking about "informed consent" in the context of a "psychological experiment" here is bizarre. The "subjects" of the "experiment" here are users of Facebook. They decided to use Facebook, and Facebook tweaks the content it shows them every single day. They expect that. That is how Facebook and every other site on the web (that is sophisticated enough to do studies on user behavior) works.

If this is "immoral", then a website outage - which frustrates users hugely - should be outright evil. And shutting down a service would be an atrocity. Of course all of these are ludicrous.

The only reason we are talking about this is because it was published, so all of a sudden it's "psychological research", which is a context rife with ethical limitations. But make no mistake - Facebook and all other sophisticated websites do such "psychological research" ALL THE TIME. It's how they optimize their content to get people to spend more time on their sites, or spend more money, or whatever they want.

If anyone objects to this, they object to basically the entire modern web.
ameza · almost 11 years ago
I'm torn about this. In some ways, I can see how mental health issues could be detected, which can hopefully help us avoid horrifying events (mass shootings, off the top of my head). But then again, I can see how the Army or the government in general could control any type of popular uprising. FB, Twitter, etc. have given us tools to connect and join in efforts to fix what is wrong (I'm thinking of the Middle East, though that could be said about the Tea Party or even the Occupy movement). If the price is right, FB can hand over that power (i.e. to the NSA), or through these secret courts the Army/government can have direct control of FB. It's crazy to think that this only occurs in countries like Russia and China, but wake up, America! This is happening here as well!
ianstallings · almost 11 years ago
You know why I think they are doing this? Because there have been studies showing that people are miserable on Facebook (see below) and I think people are starting to pick up on it. So FB feels some pressure to lighten the mood a bit. But as usual they do it with the subtlety of a drunken fool.

Also, the comparison to an A/B test is a false one. This is specifically to alter the moods of the user and test the results in a study, not to improve the user's experience or determine which app version works better.

Regarding the study mentioned above: http://www.newyorker.com/online/blogs/elements/2013/09/the-real-reason-facebook-makes-us-unhappy.html
resdirector · almost 11 years ago
> Facebook intentionally made thousands upon thousands of people sad.

Hang on. Wasn't the experiment to see *whether* users would post gloomier or happier messages respectively? This is very different from *intentionally* making people sad.
mullingitover · almost 11 years ago
This study really makes me feel vindicated for unfollowing all of my friends along with every brand on Facebook. I could've been part of the study but I'd never know, since the only way I see my friends' posts is to visit their pages directly, where I can see them all unfiltered. I've been doing this for the past six months and it has dramatically improved the way I interact with the site. I can still get party invites and keep in touch with people, but I'm immune to the groupthink.
mullingitover · almost 11 years ago
I have a feeling a lot of college courses on research methods are going to use this as an example of a grave ethics breach for years to come. With an experiment group as large as they used, statistically it's almost inevitable that someone in that group will commit suicide in the near future. If that person is in the group that was targeted for negative messages, even a rookie lawyer could make a sound case before a jury that Facebook's researchers have blood on their hands.
danso · almost 11 years ago
FWIW, the HN discussion on the study published in PNAS is here: https://news.ycombinator.com/item?id=7956470
ispolin · almost 11 years ago
So does this mean that people can increase their happiness by using plugins that hide negative posts from their social media?
deepsun · almost 11 years ago
The author falsely assumes that people change their sharing behavior due to changes in their mood. More likely they just feel like "everyone's posting cats on Facebook, so that's a place for sharing cats, let me do that too", or something similar.
nichodges · almost 11 years ago
Before and/or after the fact, research participants are made aware that they were part of a psychology experiment.

I wonder if Facebook plans on alerting subjects of this experiment to their participation?
jevgeni · almost 11 years ago
Isn't Slate in the business of exactly that: manipulating their readers' emotions?
falconfunction · almost 11 years ago
I just use Facebook to bookmark youporn at this point
onewaystreet · almost 11 years ago
Been kind of surprised there hasn&#x27;t been more of a reaction to this. I guess the Internet has reached peak Facebook outrage.
pvdm · almost 11 years ago
Another nail in the FB coffin.

Edit: for me at least.
xyclos · almost 11 years ago
people still use facebook?
dreamfactory2 · almost 11 years ago
> "If you are exposing people to something that causes changes in psychological status, that's experimentation"

Or art, or journalism, or advertising, or football etc.
hawkice · almost 11 years ago
Every business that makes sense will try to make its customers happier.

Showing people bad news to get more engagement has roughly the same moral standing as the evening news.

I guess I don't get it.

[It must be wrong because they learned something from it, I guess?]