MP Maria Miller wants AI 'nudifying' tool banned

38 points by unklefolk almost 4 years ago

11 comments

JulianMorrison almost 4 years ago
More precisely, she wants the distribution of nudes without consent, real or faked, to be considered an offense. This isn't about the tool, but what's being done with it.
Nextgrid almost 4 years ago
I think outlawing this tool would be counter-productive.

One of the reasons a leaked nude is so damaging is that it's a rare and noteworthy event. If, because of this tool, everyone had nudes of them floating around, it would become a normal thing and would actually remove most of the damage from *real* nudes leaking by providing plausible deniability (assuming anyone even cares at that point - if the world is drowning in nudes of everyone, the real thing will probably go unnoticed anyway).

Outlawing the tool wouldn't actually stop malicious use of it, but because only criminals would use it, its (rarer) use would be more damaging than if anyone could legally use such a tool and nudes stopped being a noteworthy event.
isamuel almost 4 years ago
In America, the federal child pornography law applies only to depictions of an actual child (and, for possession offenses, you have to know it, though that's another matter). But the Justice Department has long taken the position that an image of a clothed child that's altered to make the child look nude (they used to call these "morphed" images) counts. I don't think it's ever been definitively resolved by the Supreme Court, and I don't know what the courts of appeals have said, but tools like DeepSukebe have made that argument way more appealing. I'd bet that this is where regulation will begin: images of children. That has always been a domain where American courts have been extremely reluctant to intervene; for example, *any* visual depiction of a seventeen-year-old engaged in sex is proscribable without resort to the ordinary inquiry into whether the work as a whole is "obscene," etc.

But under reigning American First Amendment law, it gets a lot harder to explain why a law like the one being proposed here would be acceptable. The Supreme Court has, for example, held that the distribution of animal-cruelty videos cannot be forbidden. And it's not clear to me how one could proscribe the distribution of an imaginary visual depiction of an adult who was nude. You could call it defamatory, I suppose, but if it's concededly fictional... I don't know.
basisword almost 4 years ago
It will be interesting to see how this kind of thing plays out. I'm sure it's quite distressing if a tool like this is used on your photo and the result is then shared around your friend group. Hopefully we very quickly get to the point where nobody can ever know whether a photo is real or fake, and it's therefore just not considered an issue. Policing it seems like it would be extremely difficult. Maybe we police the intent? In other words, you can produce the images, but if you use them maliciously against a person, then there is a crime.
prepend almost 4 years ago
Looking forward to seeing how this resolves.

I remember when deepfakes were first released, there was a group who would deepfake coworkers, Facebook friends, etc. for a really low cost (like $100), as long as the target had a few hundred public photos.

This is without consent as well, but it's also not real. It seems like the equivalent of imagining people nude. Kind of creepy if I know it's happening, but not truly a violation of my privacy.
Joakal almost 4 years ago
Wow, talk about technology making the hijab and similar clothing obsolete, which will be a massive culture shock.
sorokod almost 4 years ago
There is a continuum here, from harmless to deeply offensive. The exact location on that continuum will at the very least depend on the person being subjected to this treatment and on the cultural context.

The "AI" aspect will amplify the offense because of how life-like the end result can be.
knipster almost 4 years ago
So... is the Streisand effect here an accident that makes this capability more prevalent and disturbing?

OR

Is the Streisand effect intentional, to make this so common that it no longer draws attention?
Tycho almost 4 years ago
Just wait till the AI visual paternity tests get here.
dannyw almost 4 years ago
How effective could it be, short of an international convention?

Reality is, unless most countries ban it, it's gonna be on the internet.
villgax almost 4 years ago
Wow, next up Photoshop filters/brushes too?