Can you consider giving a score instead of a binary decision?
Or differentiate between, say, beach pics and true 18+ content.

Edit: The accuracy leaves a lot to be desired - you can paste any image from a clothing store of a model in a dress or tank top and it will flag it as NSFW.

The "AI" seems to replicate an Islamic fundamentalist - a woman in a burqa did pass as SFW ;)

Google offers a hosted SafeSearch version [1] which has a lot more nuance.

[1] https://cloud.google.com/vision/docs/reference/rpc/google.cloud.vision.v1#google.cloud.vision.v1.SafeSearchAnnotation
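To illustrate the suggestion of graded output rather than a binary flag, here is a minimal sketch. The function name, thresholds, and category labels are all hypothetical - a real model would expose its own calibrated score, and where the cutoffs sit is a product decision:

```python
# Hypothetical sketch: mapping a raw NSFW probability to graded
# categories instead of a single yes/no flag. Thresholds are
# illustrative only, not taken from any real classifier.

def classify_nsfw(score: float) -> str:
    """Map a model's NSFW probability (0.0-1.0) to a coarse category."""
    if score < 0.2:
        return "safe"          # e.g. everyday clothing-store photos
    if score < 0.5:
        return "suggestive"    # e.g. beach / swimwear pictures
    if score < 0.8:
        return "racy"
    return "explicit"          # true 18+ content

print(classify_nsfw(0.1))   # safe
print(classify_nsfw(0.4))   # suggestive
print(classify_nsfw(0.95))  # explicit
```

Exposing the raw score alongside a category like this lets each consumer pick their own cutoff, which is roughly what Google's SafeSearch does with its likelihood levels.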
It's a good idea in theory, but what if it shuts down? And what about the privacy implications of sending all images to some server across the world?