科技回声


GPT inputs outgrow world chip and electricity capacity

19 points, by quirkot, over 1 year ago

5 comments

fulafel · over 1 year ago

Badly editorialized title.

Regarding the substance of the article, the curve through the 3 data points (1: 100x, 2: 25x, 3: 25x) could be fit lots of different ways besides "growing 30x per generation".
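A quick sketch of the point being made: the same three growth factors yield different per-generation extrapolations depending on the fit you choose. The two fits below are illustrative choices, not the article's method:

```python
# The three generation-to-generation growth factors cited in the comment.
factors = [100, 25, 25]

# Fit 1: geometric mean across all three generations.
geo_mean = (factors[0] * factors[1] * factors[2]) ** (1 / 3)

# Fit 2: average of only the two most recent generations.
recent_mean = sum(factors[1:]) / len(factors[1:])

print(round(geo_mean, 1))   # ~39.7x per generation
print(recent_mean)          # 25.0x per generation
```

Neither fit reproduces the article's 30x, which is the commenter's complaint: with three points, the headline growth rate is largely a modeling choice.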
danielscrubs · about 1 year ago

Really wish AMD could hurry up and implement a transparent module for GPU computing in LLVM.

It's not good that GPUs are this opaque.
dekhn · over 1 year ago

These are all the same arguments that wrongly predicted DNA sequencing would overtake hard-drive storage capacity.

People aren't going to do truly uneconomic things just to scale language models exponentially.
marsissippi · over 1 year ago

Interesting pull quote:

"GPT-4 needed about 50 gigawatt-hours of energy to train. Using our scaling factor of 30x, we expect GPT-5 to need 1,500, GPT-6 to need 45,000, and GPT-7 to need 1.3 million."
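The quoted numbers are plain geometric compounding; a minimal sketch, assuming the article's 30x per-generation factor and its ~50 GWh figure for GPT-4:

```python
gpt4_gwh = 50   # training energy for GPT-4, per the quoted article (GWh)
scale = 30      # the article's assumed per-generation scaling factor

# Compound the factor forward three generations.
projections = {f"GPT-{4 + n}": gpt4_gwh * scale**n for n in range(1, 4)}
print(projections)  # {'GPT-5': 1500, 'GPT-6': 45000, 'GPT-7': 1350000}
```

The exact product for GPT-7 is 1,350,000 GWh, which the article rounds to "1.3 million".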
RecycledEle · about 1 year ago

New technologies experience very rapid exponential growth for a while.

This should not be a surprise.