We have. It's just not lucrative enough to widely <i>implement</i> the solution.<p>Controversial content breeds engagement, which is, I would say, one of the most important metrics companies like Meta, Google, or Twitter (sorry, X) rely on.<p>X will let a literal Nazi post freely on the site, with the occasional post deletion, as long as he generates impressions. They could remove the content; they choose not to.
My experience with ravers back in the '90s was that events that were mostly social and under 200 people would self-moderate. The less social and the larger they were, the more explicit moderation you needed.<p>The problem with social media is that they try hard to create false/non-communities at scale to sell to advertisers. The social part is just manipulation to get people to look at ads. True social is problematic for them, so they suppress it.
Because it's incredibly difficult and no one can agree on what the "right" solution is.<p><a href="https://www.techdirt.com/2019/11/20/masnicks-impossibility-theorem-content-moderation-scale-is-impossible-to-do-well/" rel="nofollow noreferrer">https://www.techdirt.com/2019/11/20/masnicks-impossibility-t...</a>