
ChatGPT knows things that Google doesn't

12 points by _xivi · over 1 year ago

11 comments

panstromek · over 1 year ago
Seems like (at least according to Stack Exchange) it's not true:

https://literature.stackexchange.com/questions/26108/did-voltaire-say-i-m-not-a-believer-but-i-prefer-my-barber-to-be-a-christian
jlmorton · over 1 year ago
I just tried the exact same query on GPT-4. Here are the results. [1]

> The quote, "I'm not a believer, but I prefer my barber to be a Christian, especially when he's using his razor on my neck," is attributed to Napoleon Bonaparte. This statement reflects Napoleon's pragmatic approach to religion, indicating his preference for a Christian barber, perhaps due to a belief in the moral framework provided by Christianity, which he might have seen as a safeguard while in a vulnerable position. It's important to note that the exact wording of historical quotes can vary due to translations and the passage of time.

[1] https://imgur.com/a/nq2LHr0
alsodumb · over 1 year ago
I found ChatGPT to be particularly good at these kinds of things - in most cases, it just knows what I'm talking about, despite very vague hints.

And that part just blows my mind. Sure, LLMs can do fancy things these days, but the fact that a model, stored on some disk in a finite space, disconnected from the index of the web, can know so much about so many things blows my mind.
bemusedthrow75 · over 1 year ago
“Knows” is such a useless word now; we have debased it by using it loosely in non-AI code contexts so now we can’t sensibly debate whether a non-sentient thing can “know” anything.
draxil · over 1 year ago
On seeing the link text I automatically amended it to

> ChatGPT "knows" things that Google doesn't

and reading through, it seems like that was probably correct. That said, OpenAI could have fed it a book that's not on the internet. However, I'm not sure why this guy didn't ask the bot for a source! Not that I'm bothering to do that either ;)
xg15 · over 1 year ago
I mean, you can treat ChatGPT's answer as a new hypothesis for which you now might have additional tools available to verify.

I.e. before, you had to Google "who is this quote from", which didn't give any results, but now you can do more narrowed queries to test the hypothesis that it's from Voltaire - e.g. search for Voltaire quote collections and see if it's in there, etc.
senectus1 · over 1 year ago
They both make the same dumb mistake, by not analysing or error-checking their own answers.

Great example here that I did:

What's the next number in this sequence: 71 64 58 53 48 44 ...

Bard gets it right, but for the wrong reason, then gets it even further wrong:

> The next number in the sequence is most likely 40. This is because the sequence appears to be decreasing by 7 each time.
>
> Here's the breakdown:
>
> 71 - 7 = 64
> 64 - 7 = 57
> 57 - 7 = 50
> 50 - 7 = 43
> 43 - 7 = 36
> 36 - 7 = 40
>
> Therefore, the next number in the sequence would be 40.
>
> However, it's important to note that without further context or information about the origin of the sequence, there could be other possibilities. For example, the sequence could be part of a more complex pattern or formula.
>
> If you have any additional information about the sequence, I might be able to provide a more accurate prediction for the next number.

ChatGPT 3.5 also gets it right for the wrong reason, but its working out at least takes the right approach:

> To determine the next number in the sequence, let's analyse the differences between consecutive terms:
>
> 71 - 64 = 7
> 64 - 58 = 6
> 58 - 53 = 5
> 53 - 48 = 5
> 48 - 44 = 4
>
> It looks like the differences are decreasing by 1 each time. Following this pattern:
>
> 44 - 4 = 40
>
> So, the next number in the sequence is 40.

To show the problem, here is an extrapolation. I asked it to tell me what the next 3 numbers would be:

> The given sequence is decreasing, and the difference between consecutive terms is also decreasing. Let's look at the differences:
>
> 71 - 64 = 7
> 64 - 58 = 6
> 58 - 53 = 5
> 53 - 48 = 5
> 48 - 44 = 4
>
> The differences are not consistent, but we can see that they decrease by 1, then stay the same, and then decrease by 1 again. If this pattern continues, the next differences should be the same as the last one, which is 4, then decrease by 1 to 3, and then stay the same at 3.
>
> So, the next three numbers in the sequence would be:
>
> 44 - 4 = 40
> 40 - 3 = 37
> 37 - 3 = 34

This is wrong again; the next three numbers would be:

44 - 4 = 40
40 - 4 = 36
36 - 3 = 33
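As an aside, the arithmetic above is easy to check mechanically. Here is a minimal Python sketch (my own, not anything either bot produced) that computes the successive differences and then extrapolates using the steps (4, 4, 3) from the commenter's correction:

```python
def differences(seq):
    """Successive differences between consecutive terms."""
    return [a - b for a, b in zip(seq, seq[1:])]

seq = [71, 64, 58, 53, 48, 44]

# The differences are [7, 6, 5, 5, 4] -- not a constant step of 7
# (Bard's claim), and not strictly "decreasing by 1" (ChatGPT's claim).
print(differences(seq))

# Extrapolating with the steps the commenter proposes: 4, 4, 3.
for step in (4, 4, 3):
    seq.append(seq[-1] - step)
print(seq[-3:])  # -> [40, 36, 33]
```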
freitzkriesler2 · over 1 year ago
This is why Google is terrified of ChatGPT and was caught with the proverbial pants down.

Microsoft should bring back Ask Jeeves as an AI chat bot.
jstanley · over 1 year ago
You missed the most likely hypothesis: you hallucinated the quote, and ChatGPT hallucinated the attribution.

ChatGPT didn't hallucinate the quote, because you provided it.
hit8run · over 1 year ago
I think it’s from a movie but I don’t remember which one.
Grimblewald · over 1 year ago
The problem with this is that, semantically, the quote the author is looking for matches this:

> I want my lawyer, my tailor, my servants, even my wife to believe in God, because it means that I shall be cheated and robbed and cuckolded less often. … If God did not exist, it would be necessary to invent him

which is far closer to the words typically attributed to Voltaire. So the title "ChatGPT understands queries Google doesn't" might be more correct, because if you Google the quote I've provided, it is attributed correctly without any real issues.