Biases in Apple's Image Playground

43 points by orf 3 months ago

8 comments

thisismyswamp 3 months ago
AI safety people worrying that basketball players don't have perfectly balanced ethnic representation while mega corporations are trying to establish a monopoly on intelligence.
madeofpalk 3 months ago
I don't know whether I would call it *bias* or not (and this is Apple's model falling for the same "poor means black" fallacy we've seen before), but I've found their image generation models to be incredibly poor at matching the photo you provide. I've tried many different photos of myself, and the results vary wildly and look nothing like me.

It's comical how bad Apple's image generation models are.
givinguflac 3 months ago
"I could not replicate the above results with different photos, although I imagine that this will be possible with more effort."

I have a feeling that the red background lighting in that image is what is confusing the model.

That being said, I'm not surprised, and I'm not sure there's an obvious solution given current tech. I think Apple is making the right choice here by "safely" or benignly providing a tiptoe into image generation for the public.
cranium 3 months ago
I had one extensive exchange with ChatGPT trying to generate an image of a man and a woman working *together* on a leather-crafting project. No matter the prompt, the man was systematically the one "working" while the woman was there to assist.

Bias correction in images feels a lot more primitive than in text.
ericmason 3 months ago
Seems like more of a bug than bias. The problem is that it ignores the appearance of the person in the first place. It's a statistical model, and of course there are more black rappers and white investment bankers. If it noticed that the person was white to begin with and applied that trait, it wouldn't have to guess about race at all.
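
In other words, the fix ericmason gestures at is to condition generation on traits detected in the input photo rather than letting the model sample them from its occupational priors. A minimal sketch of that idea, assuming a hypothetical detect_attributes face-analysis step and simple prompt-side conditioning (none of this reflects Apple's actual pipeline):

```python
from dataclasses import dataclass

@dataclass
class Attributes:
    skin_tone: str
    hair_color: str
    age_range: str

def detect_attributes(photo_path: str) -> Attributes:
    # Stand-in for a real face-analysis step; a production system would
    # run a vision model on the photo here. Dummy values for illustration.
    return Attributes(skin_tone="light", hair_color="brown", age_range="adult")

def build_prompt(base_prompt: str, attrs: Attributes) -> str:
    # Pin the observed traits in the prompt so the generator doesn't
    # have to guess them from the occupation's statistics.
    return (f"{base_prompt}, {attrs.age_range} person with "
            f"{attrs.skin_tone} skin and {attrs.hair_color} hair")

attrs = detect_attributes("input.jpg")
prompt = build_prompt("portrait as an investment banker", attrs)
print(prompt)
# -> portrait as an investment banker, adult person with light skin and brown hair
```

The design point is that any trait stated explicitly in the conditioning is no longer left to the model's statistical prior for the occupation.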
miggol 3 months ago
All this marketability tuning may someday result in models that are extremely finely attuned to our current societal norms and taboos.

At which point the models will stop reinforcing (racial/gender) biases and start reinforcing those taboos instead. I don't think anyone wants that either.
amelius 3 months ago
> This input

Honestly, the input doesn't seem very well chosen. It is a very low-resolution picture of someone, with red eyes, in a circle, with a grey icon partially on top of it, and with somebody else in the picture, half outside the frame.
rafram 3 months ago
How do you solve this without getting Gemini-style racially diverse Nazis?