This entire debacle sounds like the inhabitants of the Amazon rainforest brainstorming "ways to fight photographic cameras that steal your soul".

Pretty soon anyone will be able to download a pornifier for personal use, and it will be completely legal to use as long as they don't publish the results. The non-consensual pornography Armageddon is greatly exaggerated: nobody will care about such "creations", and it will become just another dual-use technology.
From the article: "With current techniques, it will be hard for victims to identify who has assaulted them and build a case against that person."

The author is trying to claim this is as serious as assault. That is madness. It's defamation at most.

Some very primitive instinct is involved in anything to do with sex, which explains why we are so irrational about it.
The only thing about this that I’m concerned about is what happens when ransomware levels up and starts creating plausible scenarios of impropriety, complete with photorealistic evidence.

Imagine you get a text one day:

“This is Sharon. I need you to send me $1000 via crypto, or I’ll send these photos of us at the hotel to your wife.” It’s followed by sexually explicit photos of you and a woman who works at your company, someone you’ve never even met in person, only ever in Zoom meetings.

You call the number, and the person who answers sounds just like Sharon. She ignores everything you say, tells you that you have an hour to send the money or she’s going to call your wife, tell her “everything”, and send the photos, then hangs up.
The whole controversy about Taylor Swift having AI-generated images made of her seems astroturfed. How is this any different from the fake celebrity nudes that have been floating around the internet since the Usenet days? If anything, this is an improvement, since people can be much more creative than simply putting someone's head on another person's body.