> "there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place"<p>> Facebook's systems detected what was going on and stopped the messages from going through<p>Wwwwhaaat?! Some people may have just seen that message and interpreted it as "shit hit the fan, let's hide my family in a safe place until this cools down", even if it was intended as a "call to violence". Censoring messages like those could have just as well <i>caused deaths</i>, because innocent people just didn't get the heads up.<p>Corporations should clearly define themselves as either <i>"medium companies"</i>, which stay completely neutral to whatever flows through their platform as long as it's not "explicit content" (yes, this includes allowing "hate speech" as long as it's toned down, because that "hate speech" can also contain useful information, and it's not something clearly identifiable), or <i>"message companies"</i>, which can openly take sides in conflicts but are also legally responsible for their actions.<p>This muddy "middle ground" position that some companies take is "the root of all evil". Either <i>let anything happen</i> (including bad things), or <i>pick a side,</i> so that you can later be judged according to the side you picked. It's condescending to imagine that you're actually smart enough to "properly filter" information. You're not, or you're a tyrant imposing your value system on others.<p>I have more sympathy for a corporation that does evil deeds in the service of profit than for one that interferes in "muddy" ways in social issues and prevents clarity and the free flow of information. Sometimes this flow of information causes blood to be spilled, but sometimes problems get solved this way, if a society is not evolved enough to solve them in more peaceful ways. Toning down discussions and letting tensions accumulate is worse.