Microsoft's AI shopping announcement contains hallucinations in the demo

90 points, by craigts, almost 2 years ago

11 comments

cryptozeus, almost 2 years ago
Is it just me, or does everyone trust AI opinions less and less? Every time I ask it to find the top 5 of something, I go and double-check myself and almost always find it to be wrong. For example, try searching for the top 5 restaurants around you in Bard. Some of them don't even exist, lol, and some are just random if you cross-verify with actual popularity from Yelp etc.
phyzome, almost 2 years ago
There is information here, in the observation that all these "AI" demos contain blatant inaccuracies, with apparently no fact-checking having taken place. It's clear that these companies (Microsoft, Google, OpenAI) do not care about accuracy, correctness, or the truth. It is not part of their business model.

There is no respect for your time, your safety, your reputation. Your role as a customer is to be conned into using the products for long enough that a return on investment can be made; the companies will pivot to a new product as soon as the untrustworthiness of the old one becomes common knowledge.

Short-term thinking. Desperation.
imchillyb, almost 2 years ago
A hallucination is an unexpected emergence.

The "making up" of facts, because the model cannot distinguish fact from fiction, is entirely expected behavior.

There is no "hallucination," as the behavior is anticipated, expected, and entirely within normal operating parameters.

The bullshit comes from there being no model of trust these AIs subscribe to. I'd love-love-love to see these AI producers held to some responsibility for verifying truth and ethics.

These companies/universities/groups allowing their applications to bald-faced lie (misrepresent data with authority) to citizens should be a top priority for legislators around the world to bash in the face.
pmontra, almost 2 years ago
Bing works for Microsoft, and basically that's an ad. Wouldn't any human paid by Microsoft say in an ad that Surface Headphones 2 are the best ANC headphones?
aeirjtaweraew, almost 2 years ago
Pretty soon some LLM owner is going to use the argument "Everyone is allowed to have their own opinions, and LLMs are too; their responses don't have to line up with someone else's preferences."
siva7, almost 2 years ago
Opinion pieces like shopping recommendations are quite hard for current LLMs. Either it is a hard fact, or pure creative work — that's where AI shines. Anything in between, and things get tricky.
sporadicallyjoe, almost 2 years ago
Is anyone shipping AI products that DO NOT contain hallucinations? I thought that was pretty much a given.
yonatron, almost 2 years ago
Yeah. These "lies" are just artifacts of the way that LLMs work. They're meant to predict likely text given a prompt. And they do. If tasked with "write some marketing or a buying guide for product X," they will simulate likely marketing blurbs — nothing to do with truth; that's not their wheelhouse. Prediction is a very different function, algorithm, and problem set than something like "accurately summarize existing reviews." This is a feature, not a bug. If you use something off-label, you'll get off-label results. MSFT should know better.
predictabl3, almost 2 years ago
I'm sorry, but watching people talk about the vast majority of the AI landscape is like watching people talk about FSD. Have fun on the hype treadmill.
fizwhiz, almost 2 years ago
Why hasn&#x27;t their stock plummeted like Google&#x27;s?
barbariangrunge, almost 2 years ago
Stop calling them hallucinations. If we're going to anthropomorphize AIs, let's just call it bullshitting and lying. If we're not going to anthropomorphize AIs, then we need a different term.