Revealed: The names linked to ClothOff, the deepfake pornography app

67 points by hboon about 1 year ago

10 comments

Silasdev about 1 year ago
It's inevitable. We are in a normalization phase right now. Soon, kids will just shrug at stuff like that, because they will all know it's likely fake and therefore not care much about it.

However, I feel really bad for those who have to be the ones to go through the pain of this phase.
ein0p about 1 year ago
You can do this with software and models off GitHub quite easily. My wife and I tried it, and it inpaints a body that, while seamless and realistic, looks nothing like her own. In fact, changing clothes looks much more realistic (you can do that too, with the same software). It’s gotten to the point where 4chan folks are trying to start a trend they call “DignifAI”, and put clothes onto naked pictures of women from OnlyFans and inpaint cute kids in their arms. Would that be called non-consensual dignity, I wonder?
Nevermark about 1 year ago
I have to think that soon nobody is going to care, because that’s the only real solution.

The law should forbid/punish harassment, and children need to learn not to harass.

But the cost of generic-task photo manipulation, like this, is going to zero. We are only one browser “no cloth” plugin away from being able to surf a world of nude people.

Most people are going to take this stuff as seriously as someone drawing horns or a mustache on a picture of them.
ako about 1 year ago
Maybe we should just all flood the internet with fake pictures/videos of ourselves, faces on top of the most beautiful bodies that can be imagined. This way everyone seeing it will know it’s fake, and start to ignore it. I don’t really see the harm of a nude picture…
neom about 1 year ago
AI-generated fake nude photos of girls from Winnipeg school posted online:

https://www.cbc.ca/news/canada/manitoba/artificial-intelligence-nude-doctored-photos-students-high-school-winnipeg-1.7060569
spaceman_2020 about 1 year ago
As the father of a baby girl, the future scares me sometimes.

I don’t want the internet to be an over-regulated, bureaucratic mess, but the way AI is going, it might just have to be.

Deepfakes, voice clones, extremely sophisticated hacks and phishing… it looks scarier than ever.
Animats about 1 year ago
This is just annoying. It's fake evidence that's a big worry. Not much from a camera can be trusted any more.
mkoryak about 1 year ago
When we had kids, my partner and I decided that we would never upload their pictures to any social media sites, or any other semi-public sites.

It was a good idea, and now it's becoming an even better idea.
apantel about 1 year ago
We are going down the rabbit hole big time.
verisimi about 1 year ago
The thing is, is this story even true?

We know that media outlets want some sort of legislative process that will designate their content as truth (tm), to create a sort of moat against unsanctioned information. To get support for that, you need the stories.

This story reads like investigative journalism, but is it really? The journalism seems self-serving. Real, free investigators would research even the 'good guys' (tm), e.g. they would have connected Zelensky to the Panama Papers or whatever other scandals there are.

It seems the media only research when it is in their favour, and yet we are meant to think they are appropriate handlers of truth. The chutzpah is quite funny.