Just imagine the harm seeing a breast can do to a vulnerable young child. /s<p>Facebook needs to literally grow up. Removing porn is one thing, but clinical images, mothers feeding their children and the like should not even be up for discussion.<p>There's a fine line between protecting your users from offensive content and outright censorship. Good to see them doing the right thing in this case; a pity it is still on a 'case-by-case' basis instead of a healthy review of their policies.<p>The main criterion seems to be 'is the internet raising a large enough stink?' If yes, restore the image.
I find it surprising that this is newsworthy.<p>It appears that Facebook never argued that the image was in breach of its policies, just that some software it runs had a bug that misclassified this image.<p>Then, when challenged, they apologized and approved the ad.<p>So to me the summary appears to be "Software company has a bug that affected one customer, apologizes and fixes the issue", which must happen every hour of every day...<p>Am I missing something?
"Breast cancer awareness" is a thing only because it is a form of signaling that privileges female bodies. That's why it has such tremendous buy-in despite already saturated "awareness" going back 20 years, or the fact that there are a half-dozen causes of death with greater preventability and lethality even just considering women.<p>I'm really tired of people engaging in pointless signaling campaigns and expecting to get points for being So Brave in the face of near-universal consensus that they are correct, or taking minor bureaucratic snafus like this as evidence that they are somehow not in a position of complete victory.
Every time I see these I think the same thing: this shouldn't be an issue. If we weren't allowing so few players to define so much of our experience of the internet, it wouldn't matter that much what any single one of them decides to censor. Hopefully it will be that way again someday.
This reminds me of the incident with the Norwegian newspaper posting the "Napalm Girl" photo [1].<p>Facebook is trying to automate the detection of illegal/unwanted images, and it seems extremely difficult to detect the context of an image to the extent that you can differentiate between acceptable images of human bodies and unacceptable ones (which would be, I assume, the vast, vast majority of such images posted).<p>I wonder how they could proceed with this - maybe with some sort of two-stage detection, where you do a first pass to detect all images containing the unwanted features (e.g. naked bodies), and then a second pass to try to detect the activity that's going on, or to detect if the image is famous (e.g. a picture of David, the famous Italian statue, would be acceptable, while a photo of a naked man in the same position presumably would not).<p>[1] <a href="http://www.siliconbeat.com/2016/09/12/sheryl-sandberg-responds-to-norway-pm-over-facebook-photo-censorship/" rel="nofollow">http://www.siliconbeat.com/2016/09/12/sheryl-sandberg-respon...</a>
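The two-pass idea above could be sketched roughly like this. This is purely illustrative: the "classifiers" here are stand-in functions operating on hypothetical feature tags, not real models - a production system would run trained image classifiers at each stage, and the tag names are assumptions for the sake of the example.

```python
# Hypothetical two-pass moderation sketch. Each image is represented by
# a set of feature tags that a (hypothetical) upstream classifier would
# have produced, e.g. {"nudity", "famous_artwork"}.

def first_pass_flags(tags):
    """Pass 1: flag any image whose detected features include nudity."""
    return "nudity" in tags

def second_pass_acceptable(tags):
    """Pass 2: allow flagged images whose detected context suggests an
    acceptable category, e.g. recognized artworks or medical content."""
    return bool({"famous_artwork", "medical"} & tags)

def moderate(tags):
    """Return 'allow' or 'remove' for an image's feature-tag set."""
    if not first_pass_flags(tags):
        return "allow"          # nothing flagged in the first pass
    if second_pass_acceptable(tags):
        return "allow"          # flagged, but context makes it acceptable
    return "remove"             # flagged, no mitigating context found
```

Under this sketch, `moderate({"nudity", "famous_artwork"})` returns `"allow"` (the statue-of-David case), while `moderate({"nudity"})` returns `"remove"`. The hard part, of course, is the accuracy of the real classifiers behind each pass, not the control flow.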
As long as they remove breast pictures while leaving up pictures of beheadings (I reported several as they appeared in my newsfeed, but they were always deemed "ok"), our world is not getting anywhere in terms of peace.
I understand why it is this way. "Perfect is the enemy of good" and all that. It's probably much cheaper to run a system that is imperfect.<p>But it bothers me that we leave so much of our discourse to such imperfect systems.
I do agree this is ridiculous and they should be embarrassed about this kind of stuff, but I do want to remind people that Facebook is an advertising product selling "you" to advertisers. They have to please their advertisers, not their users. I am glad to see people complaining. I hope people continue to complain and news outlets like this one continue to ridicule them, but until we stop giving our information away for free to these corporations, they are going to continue to engage in immoral and dangerous activities. Facebook is not the problem here. We are!
Facebook is eventually going to get into a more systematic mess over this sort of rubbish.<p>Have one single clear principle and apply it consistently. If needed, change the principle; don't make exceptions. "Educational videos won't be removed" could have been a good policy, like the one Google has had for YouTube.<p>Or even "No breasts" can be a good policy too. If you want to show breast cancer videos, do it on YouTube, shoot it with a prop, or link to another page. I don't see why that wouldn't work.