Changing a single number among billions can destroy an AI model

3 points by cjrd 6 months ago

1 comment

nis0s 6 months ago
Interesting observation. I read a similar paper recently which reported LLMs degrading under compression. To me, these observations indicate that the neural-net model isn't sufficient for mimicking human learning or memory.

My view is that computer intelligence can surpass human intelligence, but it needs to run on flexible hardware, i.e., hardware where new connections don't depend on software alone for their formation and maintenance. The current set of solutions for programmable hardware, as seen in FPGAs or ASICs, is also incomplete, because the interconnect isn't flexible. I think we need wetware for training energy-efficient and memory-resilient AI. If AI is trained on quantum computers, then I don't think the semiconductor medium used for training will be an issue, but that's just a guess.
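
The headline's failure mode is easy to reproduce in miniature: flip one high-order exponent bit in a single float32 weight and everything downstream of it is swamped. A minimal sketch in NumPy (the matrix size, init scale, and flipped index are illustrative assumptions, not details from the article):

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for a model: one 256x256 float32 weight matrix and an input.
    W = rng.normal(0.0, 0.02, size=(256, 256)).astype(np.float32)
    x = rng.normal(size=256).astype(np.float32)

    print(float(np.abs(W @ x).max()))    # healthy output, roughly O(1)

    # Flip the most significant exponent bit of one weight in place. For a
    # weight near the 0.02 init scale this turns it into a number on the
    # order of 1e36.
    bits = W.view(np.uint32)             # reinterpret the same memory as integers
    bits[123, 45] ^= np.uint32(1 << 30)  # index chosen arbitrarily for illustration

    print(float(np.abs(W @ x).max()))    # astronomically large: the one corrupted
                                         # weight now dominates the output

One weight out of 65,536 changes, yet the output it feeds becomes useless; in a deep network the corruption propagates through every subsequent layer, and any exp or softmax applied afterwards overflows or collapses.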