TechEcho
A tech news platform built with Next.js, providing global tech news and discussions.


© 2025 TechEcho. All rights reserved.

Stable Diffusion Public Release

6 points by madmax108, over 2 years ago

1 comment

O__________O, over 2 years ago
>> "The safety checker: Following the model authors' guidelines and code, the Stable Diffusion inference results will now be filtered to exclude NSFW content. Any images classified as NSFW will be returned as blank. To check if the safety module is triggered programmatically, check the nsfw_content_detected flag like so: Potential NSFW content was detected in one or more images. Try again with a different prompt and/or seed."

— That's disappointing, and it appears to conflict with my understanding of the claim by Stability.AI's founder that limits like these would not be injected into the code; this is based on an interview he did here:

https://m.youtube.com/watch?v=YQ2QtKcK2dA

If I am correct, this makes me question all the other claims he made, which is unfortunate.

___

Edit — Here is a direct link to the point in the interview I was referring to above:

https://youtu.be/YQ2QtKcK2dA?t=701
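The quoted release notes describe a per-image `nsfw_content_detected` flag on the pipeline output. A minimal sketch of that checking pattern is below; note that `FakeResult` and `blanked_indices` are hypothetical stand-ins for illustration, not the actual diffusers API, which requires downloading the model to run.

```python
from dataclasses import dataclass, field

@dataclass
class FakeResult:
    # Stand-in for a pipeline output: generated images plus one
    # boolean per image indicating the safety checker fired.
    images: list = field(default_factory=list)
    nsfw_content_detected: list = field(default_factory=list)

def blanked_indices(result):
    """Return indices of images the safety checker replaced with blanks."""
    return [i for i, flagged in enumerate(result.nsfw_content_detected) if flagged]

result = FakeResult(images=["img0", "img1"], nsfw_content_detected=[False, True])
print(blanked_indices(result))  # [1]
```

With a real pipeline, the caller would inspect this list after each generation and retry with a different prompt or seed, as the release notes suggest.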
Comment #32554804 not loaded.