
“Scaling Laws” for AI and Some Implications

3 points by rememberlenny about 2 years ago

3 comments

ftxbro about 2 years ago
This is an excellent kind of analysis. I was curious about this kind of thing, and I don't know why I never saw it before.

Conclusions are that by 2030 there will be at most 1000x the available compute and data for AI models, unless some super weird things happen.

EDIT: oh, it's submitted by someone at OpenAI, no wonder lol
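As a rough illustration of what the thread means by "scaling laws" (loss falling roughly as a power law of training compute), here is a minimal Python sketch. The constants and the 1000x compute figure are illustrative assumptions drawn from the comment above, not values from the submitted article.

```python
# Minimal sketch of the power-law form usually meant by "scaling laws":
# loss falls roughly as a power of training compute, L(C) ~ a * C**(-b).
# The constants a and b below are illustrative placeholders, not fitted values.

def power_law_loss(compute: float, a: float = 10.0, b: float = 0.05) -> float:
    """Illustrative loss as a function of training compute (arbitrary units)."""
    return a * compute ** (-b)

# Illustrative extrapolation matching the comment's "at most 1000x by 2030":
# a 1000x increase in compute shrinks loss by a factor of 1000**(-b).
baseline = power_law_loss(1.0)
scaled = power_law_loss(1000.0)
print(f"loss at 1x compute:    {baseline:.3f}")
print(f"loss at 1000x compute: {scaled:.3f}")  # ~ baseline * 1000**(-0.05), about 0.71x
```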
ftxbro about 2 years ago
Imagine writing one of the most insightful takes on GPT and it gets only 2 upvotes and 3 dumb comments, all by the same account. Something must be wrong with the title; probably it needs more clickbait. I had expected that by now it would have 300 upthumbs and fifty comments.
ftxbro about 2 years ago
If anyone is interested in this kind of scaling law, there is a subreddit (oh no!) for tracking it: https://old.reddit.com/r/mlscaling/