TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


How Facebook got addicted to spreading misinformation

128 points · by 7d7n · about 4 years ago

16 comments

contemporary343 · about 4 years ago
"In 2017, Chris Cox, Facebook's longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were 'antigrowth.' Most of the proposals didn't move forward, and the task force disbanded.

Since then, other employees have corroborated these findings. A former Facebook AI researcher who joined in 2018 says he and his team conducted 'study after study' confirming the same basic idea: models that maximize engagement increase polarization. They could easily track how strongly users agreed or disagreed on different issues, what content they liked to engage with, and how their stances changed as a result. Regardless of the issue, the models learned to feed users increasingly extreme viewpoints. 'Over time they measurably become more polarized,' he says."

Facebook has repeatedly shown that it will maximize its bottom line and growth (in particular, as measured by engagement metrics) above everything else. Fair enough, they're a for-profit corporation! I just can't take seriously their claims of 'self-regulation'. They can never be trusted to self-regulate because it's simply not in their interest, and they've never shown meaningful action in this regard. They will only respond to the legal stick, and it is most certainly coming.
thitcanh · about 4 years ago
Has anyone ever looked at history and figured out that humans are pretty damn awful? Facebook is just a catalyst, just like Twitter is. Every platform has its own disgusting self-centered bubbles, and that's not because of the platform.

The bar to get banned from any online platform is pretty high. Just look at how long it took for the most prominent Twitter offender to get ejected.

You can ban all the public forums you want; people will just continue sharing BS on WhatsApp without you even noticing.

The problem isn't Facebook but people. Governments should punish people, not outsource the policing to companies.
sodality2 · about 4 years ago
♪ We know it's not right ♪
♪ We know it's not funny ♪
♪ But we'll stop beating this dead horse ♪
♪ when it stops spitting out money ♪
coldcode · about 4 years ago
When your founder and CEO makes growth the only thing that matters, nothing else is important. If selling people lies makes more money, sell more lies, and so on. If Facebook charged people $1 a month per account and deleted all the ads, the data selling, the lie promotion, etc., 95% of the company could be laid off. They would also make less money for sure, and that's something Zuckerberg could never support.
syamilmj · about 4 years ago
You could go on Facebook with no opinion on anything and leave an extremist.
mooneater · about 4 years ago
Killer article here by Karen Hao, with a hard-hitting ending.

Wonder what fb thought they were getting into here with MTR!
strangeloops85 · about 4 years ago
One takeaway I had from reading this excellent and detailed article is the fundamental tension in Facebook's approach to moderating content:

They rely on a hammer to catch what is objectionable or misinformation, but optimize maximally for engagement with everything else. However, the hammer misses plenty of newly emerging inflammatory content (for example, anti-Rohingya content would look very different from anything prior), and this content gets maximally amplified because the overall newsfeed algorithms optimize for engagement.

An alternate approach would be to sacrifice engagement overall (perhaps only a little!) to reduce the very real negative consequences of undesirable content that slips past the detection algorithms (which it always will). I suspect some of the fixes that were proposed, but never implemented, effectively did this. But they were shot down because, well, the top-line engagement numbers would dip.
adjkant · about 4 years ago
Despite the title, this is much more than fluff; I would highly recommend reading it in full.
ctocoder · about 4 years ago
When the first Facebook API was released, I helped build the largest application on it, SuperWall. The way to increase virality and activity on the wall was to allow a copy-paste virus to propagate throughout the network. The virus, which claimed "Zuckerberg is going to delete your account if you do not log in," kept DAU and viral growth at peak levels.

Facebook could do this too and get the same results. More DAU, more money.
blendo · about 4 years ago
“I think what happens to most people who work at Facebook—and definitely has been my story—is that there’s no boundary between Facebook and me,” he says. “It’s extremely personal.”
threesmegiste · about 4 years ago
Facebook news again. Tricky as always. There is only one evil in big tech; the others are angels. Don't say they're all the same. These kinds of frequent stories will guide you, and you start to say "X is also evil, I know, but not as much as Facebook." And that's what they want us to think. On safari you don't need to run faster than the lion; you just need to not be last among the runners. When somebody talks about privacy, the sentence starts with Facebook and the rest is unimportant. Basic media manipulation and psychology.
sneak · about 4 years ago
Why does everyone seem to be convinced that the spread of misinformation is suddenly a problem? It has been widespread in our societies for centuries, and with the exception of a few cases (say, Galileo, the crusades, etc.) it hasn't consistently caused any major issues.

This feels very much to me like a case of "something must be done!"
ldbooth · about 4 years ago
Amazing how the salary changes the employee's perspective.

It has been said elsewhere: engagement = addiction.

Dial up the addiction, dial up the profit. "Regulators! Mount up."
ilamont · about 4 years ago
"The algorithms that underpin Facebook's business weren't created to filter out what was false or inflammatory; they were designed to make people share and engage with as much content as possible by showing them things they were most likely to be outraged or titillated by."

I find it interesting that this article doesn't even mention some of the other platforms that have grown off this same strategy and have contributed to the societal problems the article names.

I joke with my wife that you can innocently watch a YouTube press conference featuring our moderate Republican governor discussing COVID policy one day, and before you know it you're being recommended videos by rabid anti-vaxxers and QAnon deep-state conspiracy theorists.

It's not just a Facebook problem or an "AI" problem; it's a platform problem, which requires wider solutions than the broken internal approaches described in this article.
throwitaway1235 · about 4 years ago
Stop trying to police what people think. It's disgusting.
andyxor · about 4 years ago
A better question is how the MSM got addicted to spreading misinformation.

"Hate speech" these days means "speech I hate and want silenced." Freedom-of-speech protections exist for this exact reason.

Why should a social media app engage in political censorship, deciding which opinions are right and which are wrong?