The biggest risk of large language models is that they may bury us in lies

44 points · by garymarcus · over 2 years ago

10 comments

vineyardmike · over 2 years ago

I think this will rapidly accelerate a set of closely related trends that were already growing on the internet: curation, trust, and reputation.

In a world with the massive aggregating ability of Google, Amazon, Facebook, etc., we've already been deprived of truth. Whether it's the algo-junk filling up search results, fake reviews on cheap knockoffs on Amazon, or the latest misinformation campaign on Facebook, it becomes hard to sort through the endless options (curation), trust the reviews, and find reliable sources.

People already search on reddit for reviews because they don't trust Amazon reviews, but what happens when reddit becomes a GPT playground? You'll see a further rise in organizations that sell curation (Target, Costco), reputation (Apple App Store), or trust (Consumer Reports, Wirecutter). People flock to influencer YouTubers for product reviews because they're harder to fake, and people will build new businesses around such solutions.
spritefs · over 2 years ago

Something to keep in mind with this article (and the myriad of other articles like it): the author has a psych background, not an engineering background. He's worked as a consultant, not an engineer. He has a lot of things to say about AI.

He calls for changes in AI, but he never talks about concrete implementation (in this article or elsewhere). People like the author of this article, let's call them "thought leaders" — what are they actually good for? What do they build? It seems like all they can do is make noise and spend money.
Biologist123 · over 2 years ago

> Systems like ChatGPT are enormously entertaining and even mind-bogglingly human-sounding, but they are also unreliable and could create an avalanche of misinformation.

It sounds like it could be used as a "flood the zone" machine in some ways, which led me to wonder whether this capacity is a feature or a bug.
naveen99 · over 2 years ago
Lies are just a special case of an uncountable infinity of impossibilities. It will only make humans who lie have an existential crisis. People who believe in a consistent world can carry on normally.
janandonly · over 2 years ago

3,500 years ago, the Assyrians and Babylonians made it policy to always lie (propagandize) about their victories. They had no official losses; they only won closer and closer to home...

There is nothing new under the sun.
mharig · over 2 years ago
A cheap way to replace politicians.
garymarcus · over 2 years ago
How models like GPT-3 could accelerate the production of misinformation, and what we might do about it
schiffern · over 2 years ago

> they may bury us in [the wrong people's] lies

Fixed the headline.
simple-thoughts · over 2 years ago
Truth itself is an imaginary construct. All animals are evolved to survive not to seek truth. “Truth seekers” are performing in a status game with the objective to prove genetic strength to prospective mates (however unsuccessful that quest might be). Even if we could know the truth, it would be a waste of valuable resources that could be better spent elsewhere.
maria2 · over 2 years ago

We've been in a post-truth society for some time now. Anything that makes it easier to lie is a positive in my book. Society needs to accept that we're well past the time when information can be trusted. Either we adapt or we die.

One thing that helps me is to understand that I really don't need to know the truth. I wake up. I see my family. I watch my daughter smile. I enjoy the company of friends. I drink a nice glass of wine. What value does the "truth" have in my daily life? Most of the time, what is true and what is a lie only comes up in political discussions, and it is almost always unpleasant.