"In 2017, Chris Cox, Facebook’s longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were “antigrowth.” Most of the proposals didn’t move forward, and the task force disbanded.<p>Since then, other employees have corroborated these findings. A former Facebook AI researcher who joined in 2018 says he and his team conducted “study after study” confirming the same basic idea: models that maximize engagement increase polarization. They could easily track how strongly users agreed or disagreed on different issues, what content they liked to engage with, and how their stances changed as a result. Regardless of the issue, the models learned to feed users increasingly extreme viewpoints. “Over time they measurably become more polarized,” he says."<p>Facebook has repeatedly shown that it will maximize its bottom line and growth (in particular, as measured by engagement metrics) above everything else. Fair enough, they're a for-profit corporation! I just can't really take seriously their claims of 'self-regulation'. They simply can never be trusted to self-regulate because it's just not in their interest, and they've never shown meaningful actions in this regard. They will only respond to the legal stick - and it is most certainly coming.
Has anyone ever looked at history and figured out that humans are pretty damn awful? Facebook is just a catalyst, like Twitter. Every platform has its own disgusting, self-centered bubbles, and that’s not because of the platform.<p>The bar to get banned from any online platform is pretty high. Just look at how long it took for the most prominent Twitter offender to get ejected.<p>You can ban all the public forums you want; people will just continue sharing BS on WhatsApp without you even noticing.<p>The problem isn’t Facebook but people. Governments should punish people, not outsource the policing to companies.
♪<i>We know it's not right</i>♪<p>♪<i>We know it's not funny</i>♪<p>♪<i>But we'll stop beating this dead horse</i>♪<p>♪<i>when it stops spitting out money</i>♪
When your founder and CEO makes growth the only thing that matters, everything else becomes unimportant. If selling people lies makes more money, sell more lies, and so on. If Facebook charged people $1 a month per account and dropped all the ads, the data selling, the lie promotion, etc., 95% of the company could be laid off. They would also make less money for sure, and that's something Zuckerberg could never support.
One takeaway I had from reading this excellent and detailed article is the fundamental tension in Facebook's approach to moderating content:<p>They rely on a hammer to catch what is objectionable or misinformation, but optimize maximally for engagement with everything else. However, the hammer misses plenty of newly emerging inflammatory content (anti-Rohingya content, for example, would look very different from anything that came before), and that content gets maximally amplified because the overall newsfeed algorithms optimize for engagement.<p>An alternate approach would be to sacrifice some overall engagement (perhaps only a little!) to reduce the very real negative consequences of undesirable content that slips past the detection algorithms (which some always will). I suspect some of the fixes that were proposed, but never implemented, effectively did this. But they were shot down because, well, the top-line engagement numbers would dip.
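To make the parent's point concrete, here is a minimal sketch of a classifier-gated, engagement-ranked feed (hypothetical names and values, not Facebook's actual pipeline). Anything the classifier misses is ranked as if it were benign, and because inflammatory content tends to predict high engagement, the misses float to the top:

```python
# Minimal sketch of "hammer plus engagement ranking". Hypothetical code.
from dataclasses import dataclass

@dataclass
class Item:
    text: str
    predicted_engagement: float  # from an engagement model
    flagged: bool                # from a misinformation/hate classifier

def rank_feed(items: list[Item]) -> list[Item]:
    visible = [it for it in items if not it.flagged]  # the "hammer"
    # Everything that survives is ordered by predicted engagement alone.
    return sorted(visible, key=lambda it: it.predicted_engagement, reverse=True)

feed = rank_feed([
    Item("benign cooking video", 0.40, flagged=False),
    Item("known hoax (caught)", 0.90, flagged=True),
    Item("novel inflammatory post (missed)", 0.95, flagged=False),
])
print([it.text for it in feed])
# ['novel inflammatory post (missed)', 'benign cooking video']
```

A "sacrifice a little engagement" fix would change the sort key rather than the filter, e.g. discounting predicted engagement for borderline content instead of relying on a binary flag. That costs some top-line engagement on every ranking, which would explain why such proposals were reportedly shot down.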
When the first Facebook API was released, I helped build the largest application, SuperWall. The way to increase virality and activity on the wall was to let a copy-paste virus propagate through the network. A virus claiming "Zuckerberg is going to delete your account if you do not log in" kept DAU and viral growth at peak levels.<p>Facebook could do this too and get the same results. More DAU, more money.
“I think what happens to most people who work at Facebook—and definitely has been my story—is that there’s no boundary between Facebook and me,” he says. “It’s extremely personal.”
Facebook news again. Tricky as always. The message is that there is only one evil in big tech and the others are angels; don't say they're all the same. This kind of frequent news guides you, and you start to say, "X is also evil, I know, but not as much as Facebook." And that's what they want us to think. On a safari you don't need to run faster than the lion; you just need to not be the last among the runners. When somebody talks about privacy, sentences start with Facebook and the rest is unimportant. Basic media manipulation and psychology.
Why does everyone seem to be convinced that the spread of misinformation is suddenly a problem? It has been widespread in our societies for centuries, and with the exception of a few cases (say, Galileo, the Crusades, etc.) it hasn't consistently caused any major issues.<p>This feels very much to me like a "something must be done!" reaction.
Amazing how a salary changes an employee's perspective.<p>It has been said elsewhere:
engagement = addiction<p>dial up the addiction, dial up the profit.
"Regulators! mount up"
<i>The algorithms that underpin Facebook’s business weren’t created to filter out what was false or inflammatory; they were designed to make people share and engage with as much content as possible by showing them things they were most likely to be outraged or titillated by.</i><p>I find it interesting that this article does not even mention some of the other platforms that have also grown off this strategy, and have contributed to the societal problems named in the article.<p>I joke with my wife that you can innocently watch a YouTube press conference featuring our moderate Republican governor discussing COVID policy one day, and before you know it you're being recommended videos by rabid anti-vaxxers and QAnon deep state conspiracy theorists.<p>It's not just a Facebook problem or an "AI" problem, it's a platform problem which requires wider solutions than the broken internal approaches described in this article.
A better question is how the MSM got addicted to spreading misinformation.<p>'Hate speech' these days means 'speech I hate and want silenced'. Freedom of speech protections exist for this exact reason.<p>Why should a social media app engage in political censorship, deciding which opinions are right and which are wrong?