More precisely, she wants the distribution of nudes without consent, real or faked, to be considered an offense. This isn't about the tool, but what's being done with it.
I think outlawing this tool would be counter-productive.

One of the reasons a leaked nude is so damaging is that it's a rare and noteworthy event. If, because of this tool, everyone had fake nudes of themselves floating around, it would become a normal thing and would actually remove most of the damage from *real* nudes leaking, by providing plausible deniability (assuming anyone even cares at that point: if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).

Outlawing the tool wouldn't actually stop malicious use of it, but because only criminals would use it, its (rarer) use would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.
In America, the federal child pornography law applies only to depictions of an actual child (and you have to know it, for possession offenses, though that's another matter). But the Justice Department has long taken the position that an image of a clothed child that's altered to make the child look nude (they used to call these "morphed" images) counts. I don't think it's ever been definitively resolved by the Supreme Court, and I don't know what the courts of appeals have said, but tools like DeepSukebe have made that argument far more appealing. I'd bet that this is where regulation will begin: images of children. That has always been a domain where American courts have been extremely reluctant to intervene; for example, *any* visual depiction of a seventeen-year-old engaged in sex is proscribable without resort to the ordinary inquiry into whether the work as a whole is "obscene," etc.

But under prevailing American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it's not clear to me how one could proscribe the distribution of a fabricated visual depiction of a nude adult. You could call it defamatory, I suppose, but if it's concededly fictional… I don't know.
It will be interesting to see how this kind of thing plays out. I'm sure it's quite distressing if a tool like this is used on your photo and the result is then shared within your friend group. Hopefully we very quickly get to the point where nobody can tell whether a photo is real or fake, and so it's just not considered an issue. Policing it seems like it would be extremely difficult. Maybe we police the intent? In other words, you can produce the images, but if you use them maliciously against a person, then there is a crime.
Looking forward to seeing how this resolves.

I remember when deepfakes were first released, there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (around $100), as long as the target had a few hundred public photos.

This is without consent as well, but it's also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it's happening, but not truly a violation of my privacy.
There is a continuum there, from harmless to deeply offensive. The exact location on that continuum will at the very least depend on the person being subjected to this treatment and the cultural context.

The "AI" aspect will amplify the offense because of how life-like the end result can be.
So... is the Streisand effect here an accident that makes this capability more prevalent and disturbing?

OR

Is the Streisand effect intentional, to make this so common that it no longer draws attention?