> <i>All the images remain hidden until they are found to be NSFW or not</i><p>The demo screenshot/page seems to only show NSFW images, so the demo doesn't convey how quickly this classifier can operate. The demo page should have some non-NSFW images to show off how quickly SFW images are revealed, since the default is to block all images.
I’ve jokingly / seriously been waiting for things like this. I’m the type to go overboard with filtering and mute lists on twitter, but images are harder to deal with. I did have a 60,000 line pastebin that was used in conjunction with a Chrome extension to block Wojak and Pepe memes on 4chan using md5 hashes, but something that doesn't rely on specific hashes is obviously superior.<p>One day someone will release the ‘detect and block anything resembling kpop’ extension for Twitter and I’ll be happy.
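For anyone curious, the hash approach in a content script is roughly this shape (just a sketch: it uses SHA-256 because the Web Crypto API doesn't expose MD5, and it naively re-fetches the image bytes, which the real extension may not have done):
<pre><code>
// Content-script sketch: hide any image whose hash appears on a blocklist.
// Simplified; re-fetching img.src may not return the exact bytes that rendered.
const blocklist = new Set<string>([
  /* hex digests loaded from the old pastebin would go here */
]);

async function sha256Hex(buf: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", buf);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

async function hideIfBlocked(img: HTMLImageElement): Promise<void> {
  const res = await fetch(img.src);
  const hash = await sha256Hex(await res.arrayBuffer());
  if (blocklist.has(hash)) img.style.display = "none";
}

document.querySelectorAll("img").forEach((img) => void hideIfBlocked(img));
</code></pre>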
In my experience, 99.9% of the time I see no NSFW content unless I purposely visit a porn site, land on some warez site by accident, or browse questionable subreddits. However, I do see some benefit if this were aimed at kids.
Shameless plug - I've been working on something similar, but with a different aim.<p><a href="https://github.com/nmurray1984/porn-blocker-chrome-extension" rel="nofollow">https://github.com/nmurray1984/porn-blocker-chrome-extension</a><p>I found that the hardest problem to solve was the prevalence of false positives. Even with a low false positive rate, images will still be blocked regularly just due to volume.<p>For that reason, the focus of my plugin is - for users who are OK with it - contributing URLs that are not NSFW but have been blocked. The extension includes a right-click menu option to do so.
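The right-click reporting flow is roughly this shape (an illustrative sketch; the menu id and reporting endpoint here are made up, not the extension's actual ones):
<pre><code>
// Background-script sketch: add a context-menu entry for reporting false positives.
// Requires the "contextMenus" permission in the manifest.
chrome.contextMenus.create({
  id: "report-safe-image",
  title: "Report image as safe (false positive)",
  contexts: ["image"],
});

chrome.contextMenus.onClicked.addListener((info) => {
  if (info.menuItemId !== "report-safe-image" || !info.srcUrl) return;
  // Hypothetical endpoint; the real extension may submit reports differently.
  void fetch("https://example.com/api/report-safe", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ url: info.srcUrl }),
  });
});
</code></pre>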
Why would I care about not seeing NSFW images?<p>What I really <i>do</i> care about is not seeing worthless clickbait. That makes me endlessly angry. Could you build a filter like that on top of this?
This extension reminded me of that Black Mirror episode where a mother filtered what her daughter could see and hear.<p>Great work with TensorFlow by the way, can't wait to see this technology maturing over the years.
Missed opportunity - where is the browser extension that makes all of the images NSFW?<p>You leave your laptop unlocked and suddenly it looks like the nude bomb went off (<a href="https://m.imdb.com/title/tt0081249/" rel="nofollow">https://m.imdb.com/title/tt0081249/</a>).
I've been wanting something like this ever since we accidentally looked at our 9-year-old daughter's screen. She likes to draw characters from her favorite cartoon and searches for images on the web. Apparently there's a porn actress with the same name as one of the characters... DDG was set to "strict", so no full nudity or explicit acts, but still <i>very</i> NSFW.
This would be too late for my workplace, as it would still be logged that I downloaded the images. The fact that I never saw them in my browser would be irrelevant.
I was curious about the classifier, and as expected it appears to be based on crawling to collect the unsafe images - I doubt it's in compliance with either copyright or the GDPR.<p>The data scraper:<p><a href="https://github.com/alex000kim/nsfw_data_scraper" rel="nofollow">https://github.com/alex000kim/nsfw_data_scraper</a><p>The classifier lib the announcement points to (its readme hints at a "premium" classifier):<p><a href="https://github.com/infinitered/nsfwjs" rel="nofollow">https://github.com/infinitered/nsfwjs</a>
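For reference, the classification itself is a small API; per the nsfwjs readme it looks roughly like this (the 0.7 cut-off below is my own illustrative choice, not the extension's):
<pre><code>
import * as nsfwjs from "nsfwjs";

// Load the model once, then classify any <img> element on the page.
const model = await nsfwjs.load();

async function isProbablyNsfw(img: HTMLImageElement): Promise<boolean> {
  // classify() returns [{ className, probability }, ...] over classes like
  // "Drawing", "Neutral", "Sexy", "Hentai", "Porn" (per the readme).
  const predictions = await model.classify(img);
  return predictions.some(
    (p) => ["Porn", "Hentai", "Sexy"].includes(p.className) && p.probability > 0.7
  );
}
</code></pre>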
Anyone know of a tool which can process videos (movies) and black/blur out nude scenes and make the film "family friendly"? The processing doesn't have to be realtime...<p>Just to expand: there are many excellent films which are not "family friendly" only because of 1 or 2 nude scenes which aren't even germane to the overall plot in many cases. I often wished there was a tool or SaaS that could detect such scenes and cut or blur them out. Would save a lot of manual processing.
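To sketch what I mean (entirely untested; the one-frame-per-second sampling, class names, and 0.7 threshold are just illustrative), something like ffmpeg plus the classifier discussed here could at least flag the timestamps to cut or blur:
<pre><code>
// Node sketch: sample one frame per second, classify each frame, and print
// the timestamps that would need blurring or cutting. Paths and thresholds
// are placeholders.
import { execSync } from "node:child_process";
import { mkdirSync, readdirSync, readFileSync } from "node:fs";
import * as tf from "@tensorflow/tfjs-node";
import * as nsfwjs from "nsfwjs";

mkdirSync("frames", { recursive: true });
execSync("ffmpeg -i movie.mp4 -vf fps=1 frames/%06d.jpg"); // one frame per second

const model = await nsfwjs.load();
for (const file of readdirSync("frames").sort()) {
  const image = tf.node.decodeImage(readFileSync(`frames/${file}`), 3);
  const predictions = await model.classify(image as tf.Tensor3D);
  image.dispose();
  const hit = predictions.find(
    (p) => (p.className === "Porn" || p.className === "Sexy") && p.probability > 0.7
  );
  if (hit) {
    const second = parseInt(file, 10) - 1; // 000001.jpg is roughly second 0
    console.log(`review around ${second}s: ${hit.className} ${hit.probability.toFixed(2)}`);
  }
}
</code></pre>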
If you want to make videos family friendly, it's probably better to aim at removing violence than nudity, as one is clearly worse for the psyche than the other.<p>Expansion: how are nude scenes not family friendly? Not trying to start any flame wars here, just trying to understand how one or two scenes with nudity suddenly mean the video is not family friendly. We all go through life seeing nudity, so it should be fine. Violence, however, is something we both strive to avoid and should avoid as much as possible.
Now just make a version that whites out NSFW text in e-books and market it to Utahns as a way to read PG-13 versions of the Game of Thrones books, and you'll be rich.<p>However, you'll probably also be sued into oblivion by angry authors/publishers who don't like people modifying what they read in any way.
I happen not to be afraid of boobs, so ever since my days on Reddit I've wished for blocking of random gore instead. This is weirdly pertinent sometimes on DDG's image search.<p>Edit: to clarify, I'm not afraid of some killing either, thanks to the pop culture of the past seventy years or so. But why eye-hurting images of bodily damage pop up on rather innocent searches is a haunting mystery. On Reddit, the ‘NSFW’ label is used equally for a vaguely sexually suggestive shape and for a close-up more suitable for a surgical journal. As if I didn't get plenty of suggestiveness just from music videos anyway! So my long-standing wish has been for an ‘NSFL’ filter instead.
Has anyone confirmed it isn't tagging non-pornographic images? Breastfeeding, breast exam tutorials, sand dunes, etc.<p>Has anyone tested whether it correctly handles PoC? The demo site it points to shows almost exclusively light-skinned folk.<p>These are the basic things I’d want confirmation of.