The title and story are rather misleading: he's been charged with “exposing a child to harmful material” and “sexual contact with a child under age 13”, not for generating the images per se.
With an image of a real scene, I suppose it would be a defense against a child porn charge to prove that the participants were not minors. But there can be no such defense if the porn is generated: it becomes a matter for judges and juries to decide whether a given picture depicts a person over or under a particular age. I wonder if it would be a defense to prove that the *prompt* did not designate an illegal age.

It may be a useful liability hedge to provably include the prompt in the generated content, such as in a watermark. But since existing generation isn't repeatable, and the same prompt won't reliably produce the same image, is that even possible?
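One low-tech version of that hedge is recording the prompt and sampling parameters as metadata in the output file. Here is a minimal sketch, assuming PNG output and Pillow; the embed_provenance helper and the field names are illustrative, not any standard:

  from PIL import Image
  from PIL.PngImagePlugin import PngInfo

  def embed_provenance(src, dst, prompt, seed, model):
      # Copy a PNG, attaching the claimed generation parameters
      # as PNG text chunks. Field names are made up for this sketch.
      img = Image.open(src)
      meta = PngInfo()
      meta.add_text("prompt", prompt)
      meta.add_text("seed", str(seed))
      meta.add_text("model", model)
      img.save(dst, pnginfo=meta)

  # Reading the claimed parameters back:
  # Image.open("out.png").text -> {'prompt': ..., 'seed': ..., 'model': ...}

Plain text chunks are trivially stripped or forged, though, so "provably" would need the metadata to be cryptographically signed or carried in a robust watermark. And on repeatability: with a fixed seed and identical model, sampler, and library versions, diffusion output generally is reproducible, so claimed parameters could in principle be re-verified.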
It seems to me like we're missing an opportunity to give these people a victim-free outlet for their urges.

I'm not really educated on what treatment looks like for these people, and harms to society from AI CSAM should obviously be considered too.

But some kind of "prescription" or otherwise regulated generation of these images should at least be considered, I think.
"Today’s announcement sends a clear message: using AI to produce sexually explicit depictions of children is illegal"<p>Not with the charges given. Prosecutors seem keen to push the normalisation angle, but would then have to explain why FPS games aren't also illegal. Unless they can prove real CSAM was used to train the model so they can draw a line from the material to the defendant's production.<p>Let's see if there are more charges to come.
I guess there are two cases: a Wisconsin state case (filed in March; he's out on bail and it's still pending) and this federal case. Apparently it isn't double jeopardy to be charged again for the same conduct (under different statutes, in this case) when one prosecution is state and the other is federal, per the dual-sovereignty doctrine.