TechEcho

Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence

226 points · by LearnerHerzog · over 7 years ago

26 comments

procedural_love · over 7 years ago

> We assume, too, that face swapping is the end game, but it's clearly just the beginning.

Isn't the end game an endless stream of personalized content for everyone? Wherein the entire corpus of human-created media becomes a training set for our fantasies.

It is interesting how entertainment is again pushing the boundary of technology. Soon enough this push to make face-editing tools for porn more accessible to everyone will allow anyone to:

1) Replace their ex-husband's face in their old family videos with their new husband's face.

2) Create a viral video of Donald Trump murdering someone.

3) Be the star of their favourite movie, porn or otherwise. (What's the effect this would have on people's memories, when they actively see themselves doing everything James Bond does, for instance? Shooting people, being generally powerful, and "getting the girl"?)
hirundo · over 7 years ago

Technology is degrading the value of photo and video evidence (and probably audio too) asymptotically toward that of famously unreliable testimony from memory. Criminality becomes less risky and/or innocence becomes less protective. Law becomes less effective. A bad result, to the extent that the law isn't an ass.

On the plus side, artistic tools that help materialize internal life become more effective. We can interact with our dreams and fantasies more readily, to potentially therapeutic benefit.

It's hard to say whether this trend holds more danger or promise.
r3bl · over 7 years ago

There was a pretty good discussion yesterday in /r/Cyberpunk about the possible consequences of this: https://www.reddit.com/r/Cyberpunk/comments/7sexm6/deepfakes_porn_fakery_realistically_pasting/

> The subreddit /r/Deepfakes became very active very fast and new deepfakes are submitted every day with varying degrees of realism. The most scary part is that ANYONE can be deepfaked, not just celebrities. Provided you have the right hardware (because neural networks demand beefy video cards for training) you could train a model of your friend and paste her face onto a porn video and boom. All you have to do is download a browser extension that downloads all photos from someone's Instagram and work from there. Nobody is safe from this.

> I think this here is as cyberpunk as it gets. The technology is 4 months old and has already yielded extremely realistic results. Think of what we will have one year from now. Something like this matches both the high tech and the low life aspects of the cyberpunk genre.

EDIT: Pasted the wrong link.
no1youknowz · over 7 years ago

When I look at the future, I think back to these videos.

Hells Club:

Part 1: https://www.youtube.com/watch?v=QajyNRnyPMs

Part 2: https://www.youtube.com/watch?v=wfYlTtA7-ks

Where I would like this to go is either being able to take scenes from different films and create mashups like this.

Or perhaps getting a whole bunch of extras, narrating lines and acting in front of basic sets with green screens, then putting the faces of recognizable actors on them and using something like Lyrebird for the voices, where actors have sold the rights to their faces, voices, and personalities for cheap.

Now you have a $100m movie for the cost of $100k.

A similar premise to the film The Congress.

-----

I really think in about 5 years, when the software is there and the dedicated IaaS to train the sets is commonly available, we'll start to see some really cool stuff.
ryanmarsh · over 7 years ago

So I clicked the link in the article (for science, so you don't have to) and I'm blown away. People are doing this on home computer rigs? I thought I was going to find some really crappy paste jobs, but instead I found myself having to completely second-guess what I was seeing. Some of the videos of course suffer from odd minor defects that give up their authenticity, but others were flat out as real as anything else I've ever seen.

Now I'm concerned about the implications of this. We already know any image can be faked, and almost any video, but we also laugh at people who say the moon landings were faked. Given this, though, how could anyone believe video evidence of, say, the president with Stormy Daniels, which is a matter of unfortunate import with real consequences?

How hard would it be to fake an international incident from multiple vantage points?
Raphmedia · over 7 years ago

I've been thinking about this for a long while. I think this is good.

With face recognition, old pictures you might have posted online are very easy to find. Some ex-boyfriend shared a naked picture of you? You are screwed.

Now you can simply say that it is a deepfake. Everyone will have naked pictures of "themselves" online, even if they are fake.
tomaskafka · over 7 years ago

Non-NSFW sample: https://www.reddit.com/r/deepfakes/comments/7sjkw5/ilm_fixed_that_for_you_edition/

> Top is original footage from Rogue One with a strange CGI Carrie Fisher. Movie budget: $200m.

> Bottom is a 20-minute fake that could have been done in essentially the same way with a visually similar actress. My budget: $0 and some Fleetwood Mac tunes.
dictum · over 7 years ago

First step towards this: https://news.ycombinator.com/item?id=6272626#6272744
gitgud · over 7 years ago

A quote that struck me from a previous discussion on the topic:

@ekimekim, 44 days ago:

"We've already seen this with images and Photoshop. Society and their heuristics of belief will adjust as these new capabilities become widespread.

What's more troubling is that as media becomes falsifiable, solid evidence of... well, anything, becomes hard to have.

The ultimate loser there is the truth, sadly."
GuiA · over 7 years ago

The inevitable outcome is that no recorded media will be taken at face value unless there is immense proof in some way of its veracity.

In the short term, this will probably lead to all kinds of terrible things (kids getting bullied through computer-generated imagery of them, people being fired for videos of them saying things they never said, jealous spurned lovers attempting to break apart marriages with fake videos, etc.)

In the long term, it might actually be a good thing, instilling a strong sense of caution for anything that claims to be recorded from the real world.
wasx · over 7 years ago

This is... scary. The potential applications of this technology extend far beyond porn. How long until intelligence agencies are using this sort of technology to sabotage political opponents?
braindongle · over 7 years ago

An interesting angle here is the arms race between manipulation and forensics. In image forensics, clever people are using clever techniques to keep us tethered to some notion of authenticity in digital media. Like this guy: http://www.cs.dartmouth.edu/farid/downloads/publications/wifs17.pdf

These emerging video manipulation tools open new frontiers for related forensics research. In 10 years, when we see a video of someone doing something horrible, these people are perhaps our only hope of knowing whether or not what we're seeing ever happened.
bob_theslob646 · over 7 years ago

This seems like the beginning of a giant problem. First we had Adobe able to replicate a user's voice after listening to it for 20 to 40 minutes; now this.

I guess vein scanning is going to happen sooner or later for personal verification.
TaylorAlexander · over 7 years ago

Interesting.

It seems /r/deepfakes (NSFW) is where the content is at (I assume the article doesn't link to that, but I haven't checked).
herogreen · over 7 years ago

Maybe one day everyone will be using cameras with a combination of digital-signature and watermarking technologies.

That said, not being able to definitively classify videos as "faked" or "original" could help people suffering from revenge porn (or political manipulation).
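The signed-camera idea above can be made concrete. The sketch below is a minimal illustration, not any vendor's actual scheme: it uses an HMAC over the pixel bytes and metadata as a stand-in for a signature (a real camera would use an asymmetric signature, so verifiers never hold the device key), and the key and field names are hypothetical.

```python
# Toy model of a camera that tags each capture so later edits are detectable.
# HMAC is a stand-in here; a real design would use asymmetric signatures.
import hashlib
import hmac
import json

CAMERA_KEY = b"secret-key-burned-into-camera"  # hypothetical device key

def sign_capture(pixels: bytes, metadata: dict) -> dict:
    """Return a 'signed capture': payload plus a tag over its contents."""
    blob = pixels + json.dumps(metadata, sort_keys=True).encode()
    tag = hmac.new(CAMERA_KEY, blob, hashlib.sha256).hexdigest()
    return {"pixels": pixels, "metadata": metadata, "tag": tag}

def verify_capture(capture: dict) -> bool:
    """Recompute the tag; any edit to pixels or metadata breaks it."""
    blob = capture["pixels"] + json.dumps(
        capture["metadata"], sort_keys=True).encode()
    expected = hmac.new(CAMERA_KEY, blob, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, capture["tag"])

if __name__ == "__main__":
    shot = sign_capture(b"\x00\x01\x02", {"ts": "2018-01-25T12:00:00Z"})
    print(verify_capture(shot))       # untampered capture verifies
    shot["pixels"] = b"\xff\x01\x02"  # simulate a face-swap edit
    print(verify_capture(shot))       # tampered capture fails
```

Note this only proves a capture came out of the camera unmodified; it cannot prove the scene in front of the lens was real, which is why it complements rather than replaces forensics.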
dontreact · over 7 years ago

Between VR and the possibility of training up person-specific porn generators, porn is only going to become more and more of a superstimulus that is too hard to resist. Best to try and quit fully now.
Koshkin · over 7 years ago

John C. Wright, in his book The Golden Age, has worked out a pretty complete picture of the ability to mold one's perception of reality to one's liking.
ProxCoques · over 7 years ago

It's interesting that the issue of ensuring authenticity might also swing into censorship: if politicians only issued certificates for images of themselves doing good things, they could then dismiss images of themselves doing bad things as fake, because the creator couldn't produce the cert.
Faaak · over 7 years ago

This technology makes me afraid because it deconstructs many core values I have.

You can do plenty of things with it, creating fake porn among others. But is it ethical? Is it ethical to create fake porn with a celebrity's face? With your ex-girlfriend's? What if you keep it for your own personal use?

But then where are the limits? What about a "teen"? What about fake child pornography, made from pictures of children found on the internet, that lets someone relieve themselves without causing any harm to others?

I would at first think it's okay if you keep it for private, personal use, but then that becomes scary, because we've been "trained" to think that child porn is not okay.

Don't know why this has been downvoted?
jondubois · over 7 years ago

This is great for celebrities... From now on, if they get their phones hacked and their private videos stolen and shared publicly, they can (plausibly) claim that it's not real.
paul7986 · over 7 years ago

And this is why we need a verified identity system on the Internet.

With such a system in place, you would know who created a video, because their verified identity is attached to it. If there's no verified identity attached, the video won't hold any weight. The same goes for everything done on the Internet: you can use it anonymously, where what you say or do doesn't hold much or any weight, versus commenting, posting, etc. under your verified ID.

This is just one solution that could help with all the fakery on the Internet and the mayhem it brings and will continue to bring, but worse.
JetSpiegel · over 7 years ago

After over 100 years of having a massive professional industry dedicated to creating fake sequences of images for fun and profit, another decrease in the capital needed to create films doesn't seem like such a revolutionary thing.

Just as people watched that first film of a train moving towards the camera and freaked out, while today that seems quaint, humans, as pattern-matchers extraordinaires, will find a way to discern the fakes.
peterjlee · over 7 years ago

Finding the origin of content is already a challenge these days. A random idea I have is creating a blockchain where content creators can register their creations to prove their origin. Also, camera manufacturers could get involved and build in hardware that signs every picture and video captured by that camera, which could then be registered to the blockchain.
评论 #16227706 未加载
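The registry idea above reduces to logging content digests in a tamper-evident, append-only structure. The following is a toy sketch under my own naming (`ContentRegistry`, `register`, `lookup` are all hypothetical); a real deployment would replace the in-memory list with an actual blockchain and add consensus and identity.

```python
# Minimal hash-chained registry: each entry commits to the previous one,
# so rewriting history invalidates every later entry's hash.
import hashlib

class ContentRegistry:
    def __init__(self):
        self.chain = []  # append-only list of registration entries

    def register(self, content: bytes, creator: str) -> dict:
        """Record a digest of the content, chained to the prior entry."""
        prev = self.chain[-1]["entry_hash"] if self.chain else "0" * 64
        digest = hashlib.sha256(content).hexdigest()
        entry = {"creator": creator, "content_hash": digest, "prev": prev}
        entry["entry_hash"] = hashlib.sha256(
            (digest + creator + prev).encode()).hexdigest()
        self.chain.append(entry)
        return entry

    def lookup(self, content: bytes):
        """Return the earliest registration of this exact content, if any."""
        digest = hashlib.sha256(content).hexdigest()
        return next(
            (e for e in self.chain if e["content_hash"] == digest), None)

if __name__ == "__main__":
    reg = ContentRegistry()
    reg.register(b"original video bytes", creator="alice")
    print(reg.lookup(b"original video bytes")["creator"])  # registered
    print(reg.lookup(b"tampered video bytes"))             # no match
```

One caveat the comment glosses over: such a registry proves priority (who published a digest first), not authenticity; a deepfake can be registered just as easily as an original.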
sjg007 · over 7 years ago

Definitely Black Mirror, but you can now virtually recreate a loved one who has passed.
grinsekatze · over 7 years ago

...and the "Nothing" is claiming all of Phantastica.
cobbzilla · over 7 years ago

"DNA or it didn't happen"