I've worked on content moderation systems for some Really Big Platforms, and it is definitely not uncommon to have a list of "super popular" content that is exempt from the automated banning systems; instead, flags on that content get escalated to a human for review. The article implies this, but for more context, in my experience it usually comes down to a few reasons (a rough sketch of the resulting routing logic follows the list):

1) Automated systems usually have thresholds trained/set on lower-popularity content, and they often fall apart when popular content gets flagged, so the precision of automated bans in those cases is not always good.

2) Brigading of popular content is a thing, and a giant pain in the ass to detect reliably.

3) And probably most important: the business impact of a false-positive automated ban on popular content is very high, so a slight delay in issuing the ban pending human review is usually not a big deal.

The last one implies "rules for thee and not for me", and that is 100% the case - it's not pretty, but these services are businesses, and unfortunately they will always bias towards their bottom line. The reality is that popular content gets special privileges. It sucks and is unfair to up-and-comers, but that is sadly the name of the game.
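To make that concrete, here's a minimal sketch of the kind of decision logic I'm describing. Every name, threshold, and signal here is made up for illustration - real pipelines use far richer features and aren't this clean:

```python
from dataclasses import dataclass

# Hypothetical routing sketch: the popularity cutoff and classifier
# threshold are invented numbers, not any real platform's values.
POPULARITY_CUTOFF = 100_000   # e.g. views or followers (assumed signal)
AUTO_BAN_THRESHOLD = 0.9      # classifier confidence needed to act

@dataclass
class FlaggedContent:
    content_id: str
    popularity: int    # e.g. view count at time of flag
    abuse_score: float # automated classifier output, 0.0 to 1.0

def route_flag(item: FlaggedContent) -> str:
    """Decide what happens when the classifier flags a piece of content."""
    if item.abuse_score < AUTO_BAN_THRESHOLD:
        return "no_action"  # below the automated action threshold
    if item.popularity >= POPULARITY_CUTOFF:
        # Thresholds were tuned on long-tail content, and brigading
        # inflates flag volume on popular items, so a human reviews
        # these before any ban lands.
        return "human_review_queue"
    return "auto_ban"  # long-tail content: automation acts immediately
```

The point of the split is exactly reasons 1-3 above: the classifier is least trustworthy, and a mistake most expensive, on the content above the cutoff.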