i am sympathetic to this concern and the fact that real harms are already being ignored. much technological violence becomes normalized this way. but along other lines it's interesting how the rejection of respectability politics insulates a large segment of the American radical scene from deepfake attacks and other lies.<p>why worry about enemies faking some kind of sexual scandal when you're already doing porn on the side? why worry about drug scandals when prohibition is a crime and it's cool to party? why avoid destruction when the tv is already saying you burned down a whole city? why avoid fighting the cops when you know they're just going to beat the shit out of you and hold a press conference saying you deserved it? it's funny how these kinds of media attacks eventually just admit the slanders into acceptability.<p>edit: and of course now hn's reactionary squad have flagged the post. cool
I understand the sentiment behind the post, but what exactly is the author proposing? The tech is out there, most people seem aware such a thing exists, and yet she wants us all to be angry at something...?<p>> millions of women who will reconsider whether they want to be in a public facing role at all — is an existential problem for representative democracy.<p>Wow, really? Seems like needless fear-mongering to me
I'm not convinced this will be as big of a problem for national security as suggested. Pornographic images can't be broadcast and can't be shared on Facebook. Twitter can already label deep fakes iirc.<p>The focus on women politicians is a bit off too, to my mind; a deep fake of a male politician performing some kind of taboo or salacious act could also be damaging.
The article title is an odd frame around this issue: if anything, the viral Tom Cruise deep fakes <i>revitalized</i> attention toward potential misuse of the technology (and the current use cases of deepfake porn, which was covered by mainstream media, e.g. <a href="https://www.buzzfeednews.com/article/janelytvynenko/telegram-deepfake-nude-women-images-bot" rel="nofollow">https://www.buzzfeednews.com/article/janelytvynenko/telegram...</a> )<p>A good Twitter thread on the Tom Cruise deepfakes by the author of that article: <a href="https://twitter.com/JaneLytv/status/1365362169827762184" rel="nofollow">https://twitter.com/JaneLytv/status/1365362169827762184</a>
Yep, it's a problem. The major porn sites have decided to block this for celebrities at least. Not sure whether they also detect / fight deepfakes of non-celebrities. afaik, deepfakes are against their ToS and there seems to be decent enforcement. I think the porn sites have done decently here.<p>That doesn't do much about the Telegram groups, or any dedicated and decentralized channels. Thing is, I am not sure what could be done there. Revenge porn seems like a worse problem, with roughly the same distribution channels. Hence, anything that could tackle revenge porn should also be able to tackle deepfake porn. And afaik revenge porn is far from a solved problem. So this seems like a hard problem to solve, and one that is already indirectly being worked on.<p>I guess that is how this will be resolved, as a side-effect of the solution to revenge porn. I can't really imagine much except for making it (more?) illegal. Thing is, you gotta get enforcement, and you probably have to do it without entrapment.<p>I guess my point is "Issue is real, some things are being done. A solution for what remains seems difficult, and might come from a solution to revenge porn".
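<p>For what it's worth, I have no idea what the sites actually run internally, but one common building block for blocking re-uploads of already-removed images is perceptual hashing. A minimal sketch using the Python imagehash library (the filenames here are hypothetical):<p><pre><code>  # Sketch of known-image matching with perceptual hashes: the kind of
  # technique a site could use to block re-uploads of images its
  # moderators have already taken down. Not any site's documented method.
  from PIL import Image
  import imagehash

  # Hashes of images that were previously removed.
  blocklist = [imagehash.phash(Image.open("removed_fake.jpg"))]

  def is_blocked(path, max_distance=8):
      # Perceptual hashes survive re-encoding and small edits, so a
      # small Hamming distance suggests the same underlying image.
      h = imagehash.phash(Image.open(path))
      return any(h - b <= max_distance for b in blocklist)

  print(is_blocked("new_upload.jpg"))
</code></pre><p>Of course, this only catches copies of images you already know about; it does nothing against freshly generated fakes, which is part of why the remaining problem is hard.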
What the hell. I feel bad for any celebrities who are targets of deep fakes, especially in a pornographic context. But I think reasonable people can agree that state-sponsored deep fake propaganda and mass manipulation is far more dangerous to society than a fake sex tape.
What could "the law" possibly do about this?<p>I mean, you can basically run deep fake algos on your machine at home with pictures you find freely available online. This way you don't even have to share that fake stuff online and still everyone who wants it has it.
When I got online in the late 90s the internet was already filled with fake images of celebrities. This isn't a new issue. Sure, there are images that can be identified as fake based on the content, like Vladimir Putin riding a bear, but for 20 years the internet has been filled with images that the average person cannot determine are real or fake.<p>Yes, technology will improve, we'll get realistic fake video and audio, and it'll become easier and easier to generate. If anything, that'll simply make people more skeptical of fakes, since they'll be able to generate their own in a few minutes on their phone. The US government isn't going to see a YouTube deep fake of Putin declaring nuclear war and take action.<p>As the saying goes, "extraordinary claims require extraordinary evidence". Audio and video will just no longer be extraordinary evidence in the future.