Fake porn of celebrities has been prevalent on the Internet for a long time. I don't have precise statistics, but I think it might predate the web. For the most part it hasn't been particularly controversial. Now that advances in technology have significantly improved the quality, everybody suddenly has a problem with it.<p>Let's back this up a few steps. You're an actor, you appear on television. By being filmed and accepting the pay offered to you, you're agreeing to allow these images of you to be disseminated to the general public, for their enjoyment. But what if somebody finds you attractive, and looks at your picture whilst... you know. Can you sue? No, they're well within their rights to do so. What if they cut your face out and paste it over a Playboy centerfold? Same deal. Several technological innovations later, here we are. Fundamentally, nothing has changed. Fundamentally, people are still 100% within their rights to combine images legally obtained in this way, and to post them online. This may not be what the Internet was created for, but it was always what it was used for.
There's no way of putting this tech back in the box. And it's only going to get more powerful.<p>What if someone combines it with some kind of "deep ageing", to create artificial child porn? Sexual representations of children, even if completely artificial and involving no victims, are illegal in many jurisdictions, but not all.<p>We're only scratching the surface of what semantic editing of video is going to be capable of. It's a very big barrel of worms.
I'm on the fence on this one. If I draw a stick figure and put your name above it, is that unethical? What if I add a speech bubble saying something risqué? What if I'm a talented artist and I draw something pretty life-like? I don't see the lines here being clear without clamping down on all expression.
What scares me the most isn't the porn, but the fact that this kind of tech is available to anyone with a GPU and a few hours of learning and training.<p>The only thing keeping this from being a threat to regular folks is that building the model requires a lot of reference images. But imagine a politician or activist: they have plenty of images on the net, so this could take fake news to another level. Sure, if that happens, the news media and legal system will probably stop taking shady videos seriously without verification (especially now, while the algorithm is still in the middle of the uncanny valley, so for the moment fakes are easy to recognize without needing experts). But what about your friend, uncle or cousin who shares his "echo chamber" posts on Facebook?<p>Or who knows, maybe I shouldn't binge Black Mirror, and this will stay limited to porn and good uses, like a new era for stunts in movies.
So, subreddits that do "safe for work" deepfakes are still around and allowed. This tells me that the technology will just keep getting better and better, and NSFW deepfakes will likewise keep improving (while existing underground?) until it really is impossible even for an expert to tell that they are fakes. That is what I assume will happen, anyway.
These bans, and the process making the news, will definitely have a Streisand effect. The community might move on from Reddit, but so many people have been made aware of the possibilities that it's impossible to keep a lid on it.
So glad that they banned this crap. This is actually a fun project to get started with deep learning, as you paste Nicolas Cage into every movie you can imagine. Unfortunately it's been dominated by all this porn creepiness. Hopefully the discussion surrounding this will get friendlier from now on.
It seems to have gone unnoticed, but for the past week or so GitHub has required sign-in for the deepfakes repos (<a href="https://github.com/deepfakes" rel="nofollow">https://github.com/deepfakes</a>). Note that these are public repositories.