Now if only they would make one that detected violent and hateful content.

You know, things people should *actually* be concerned about children seeing.
Even a really close-up picture of a face will be seen as nudity. Trying to detect a body (or parts of a body) and determine how much skin is visible would probably be a better approach (sketched below).

Also quite relevant is this site: http://www.yangsky.com/researches/physicallinguistics/PLUnderstand/humanbody/humanbody.htm
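One way to sketch that suggestion (this is not how nude.js works, just an illustration): run a whole-body detector first and measure the skin fraction only inside the detected boxes, so a face filling the frame contributes little. The choice of people detector, the HSV skin range, and the input file name below are all assumptions made for the example.

```python
# Sketch only: measure skin inside detected person boxes instead of over the
# whole frame, so a close-up face cannot dominate the score on its own.
# The HSV skin range and the detector choice are rough assumptions.
import cv2
import numpy as np

def max_skin_ratio_in_people(image_bgr):
    # OpenCV's stock HOG pedestrian detector stands in for "detect a body".
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    boxes, _ = hog.detectMultiScale(image_bgr, winStride=(8, 8))

    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 48, 80), (20, 255, 255))  # rough skin-tone band

    ratios = [np.count_nonzero(skin[y:y + h, x:x + w]) / float(w * h)
              for (x, y, w, h) in boxes]
    return max(ratios, default=0.0)  # 0.0 when no body is detected

if __name__ == "__main__":
    img = cv2.imread("photo.jpg")  # hypothetical input file
    if img is not None:
        print("max skin ratio inside detected bodies:",
              max_skin_ratio_in_people(img))
```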
Besides a "breast detector/nipple detector" he also created a couple of weird detectors like a "cowgirl sex position detector" among others.
(The link is SFW; the individual detectors are not.)
The algorithm is mostly based on this paper:
<a href="http://www.math.admu.edu.ph/~raf/pcsc05/proceedings/AI4.pdf" rel="nofollow">http://www.math.admu.edu.ph/~raf/pcsc05/proceedings/AI4.pdf</a>
but some steps are left open. I've implemented this algorithm because it isn't as hardware-intensive as the usual nude-detection algorithms (such as searching for specific body parts).
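For context on why that's cheap: skin-colour classifiers of the kind the paper builds on usually come down to a handful of per-pixel colour-space rules, so the cost is a single pass over the image. Here's a minimal sketch using the widely cited RGB rule of Kovac et al.; these thresholds are that rule's, not necessarily the ones used in the paper or in nude.js, and photo.jpg is a stand-in file name.

```python
# Per-pixel skin test using a commonly cited RGB rule (Kovac et al.).
# These thresholds are illustrative; the paper / nude.js may differ.
def is_skin(r, g, b):
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15
            and r > g and r > b)

def skin_fraction(pixels):
    """pixels: a sequence of (r, g, b) tuples from one image."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    return sum(is_skin(r, g, b) for r, g, b in pixels) / len(pixels)

if __name__ == "__main__":
    from PIL import Image  # Pillow, used here only to read pixel values
    img = Image.open("photo.jpg").convert("RGB")
    print("skin fraction:", skin_fraction(img.getdata()))
```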
Interesting concept. Scanning through the code and reading just the comments, it seems to base the nude/not-nude decision on the amount of skin shown.

Hmm... I guess it depends on what your definition of nudity is.
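For what it's worth, the final decision in skin-amount detectors typically reduces to a couple of thresholds on those skin fractions, which is exactly where the "definition of nudity" gets baked in. A hypothetical rule of that shape (the cut-off values are invented, not read from nude.js or the paper):

```python
# Hypothetical skin-amount decision rule; 0.15 and 0.35 are made-up cut-offs,
# not values taken from nude.js or the paper.
def classify(total_skin_fraction, largest_skin_region_fraction):
    if total_skin_fraction < 0.15:
        return "not nude"       # too little skin overall
    if largest_skin_region_fraction > 0.35:
        return "nude"           # one large contiguous skin area
    return "borderline"

print(classify(0.50, 0.40))  # -> nude
print(classify(0.10, 0.05))  # -> not nude
```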