"In 2017, Chris Cox, Facebook’s longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were “antigrowth.” Most of the proposals didn’t move forward, and the task force disbanded.<p>Since then, other employees have corroborated these findings. A former Facebook AI researcher who joined in 2018 says he and his team conducted “study after study” confirming the same basic idea: models that maximize engagement increase polarization. They could easily track how strongly users agreed or disagreed on different issues, what content they liked to engage with, and how their stances changed as a result. Regardless of the issue, the models learned to feed users increasingly extreme viewpoints. “Over time they measurably become more polarized,” he says."<p>Facebook has repeatedly shown that it will prioritize its bottom line and growth (measured in particular by engagement metrics) above everything else. Fair enough: they're a for-profit corporation. But that's exactly why I can't take their claims of 'self-regulation' seriously. Self-regulation runs directly against their interests, and they've never taken meaningful action in that direction. They will only respond to the legal stick, and it is most certainly coming.