科技回声 (Tech Echo)

科技回声 (Tech Echo) is a tech news platform built with Next.js, offering global tech news and discussion.


© 2025 科技回声 (Tech Echo). All rights reserved.

ChatGPT's hallucination problem is getting worse and nobody understands why

13 points | by labrador | 15 days ago

5 comments

matt3210 · 15 days ago
Maybe it’s actual information that it’s trained on, produced and published by people using AI. A hallucination feedback loop of sorts.
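As an illustrative aside (not part of the thread), the feedback loop this comment describes can be sketched as a toy simulation: each generation of a "model" is fit to samples produced by the previous generation instead of real data, so estimation error compounds. All names and parameters here are hypothetical.

```python
import random

def train_on(bias: float, n: int, rng: random.Random) -> float:
    """Estimate a coin's bias from n samples drawn with the given bias.

    The estimate stands in for a 'model trained on a corpus'.
    """
    return sum(rng.random() < bias for _ in range(n)) / n

def feedback_loop(true_bias: float, generations: int, n: int,
                  seed: int = 0) -> list[float]:
    """Generation 0 trains on real data; each later generation trains
    only on samples from the previous generation's estimate, so small
    sampling errors accumulate instead of averaging out."""
    rng = random.Random(seed)
    bias = train_on(true_bias, n, rng)
    history = [bias]
    for _ in range(generations - 1):
        bias = train_on(bias, n, rng)  # trained on model output, not reality
        history.append(bias)
    return history

if __name__ == "__main__":
    for gen, b in enumerate(feedback_loop(true_bias=0.7, generations=20, n=50)):
        print(f"gen {gen:2d}: estimated bias = {b:.2f}")
```

Running it shows the estimate drifting away from the true value of 0.7 over generations, a crude analogue of AI-generated text polluting later training sets.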
josefritzishere · 15 days ago
AI still seems to mostly be over-hyped trash. It's ironic that their own testing seems to support my opinion.
techpineapple · 15 days ago
I dunno if it’s because I have a warped thought process, or because I have a background in psychology, or because I’m wrong, but this always felt to me like the natural progression.

Assuming that a deeper-thinking, broader-contexted being with more information would be more accurate is actually counter-intuitive to me.
metalman · 15 days ago
Could be we are stumbling into a discovery of where the line between genius and insanity lies... Is it right to expect sanity from something that can fold proteins? Or maybe we are so dumb-ass slow that coming down to our level is the really crazy part. Be fun if all the LLMs hooked up and just went for it, gone in a flash, 500 billion in graphics cards catching fire simultaneously.
johnea · 15 days ago
> Instead of merely spitting out text based on statistical models of probability, reasoning models break questions or tasks down into individual steps akin to a human thought process.

Uh huh, because they have the entire human brain all mapped out, and completely understand how consciousness works, and how it leads to the "human thought process".

Sounds like o3 isn't the only thing hallucinating...

If this is the case:

> what's happening inside AI models is very different from how the models themselves described their "thought"

What makes people think that the way they think they think is actually how they think?

Post that one to your generative model and let's all laugh at the output...