
GPT inputs outgrow world chip and electricity capacity

19 points by quirkot over 1 year ago

5 comments

fulafel over 1 year ago
Badly editorialized title.

Regarding the substance of the article, the curve from the 3 data points (1: 100x, 2: 25x, 3: 25x) could fit lots of different ways besides "growing 30x per generation".
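To illustrate the point, here is a minimal Python sketch (not from the article or the comment; the particular fits are assumptions) showing how the same three ratios support different extrapolations:

```python
import math

# Three observed generation-over-generation growth factors,
# per the article's data points cited in the comment above.
ratios = [100, 25, 25]

# One fit: a single constant growth rate (geometric mean of all three points).
constant_rate = math.prod(ratios) ** (1 / len(ratios))  # ~39.7x

# Another fit: weight recent generations only, dropping the
# possibly anomalous first point.
recent_rate = math.prod(ratios[1:]) ** (1 / len(ratios[1:]))  # 25x

print(f"constant-rate fit: ~{constant_rate:.1f}x per generation")
print(f"recent-points fit: ~{recent_rate:.1f}x per generation")
```

Either fit is consistent with the data, yet they compound to very different projections a few generations out.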
danielscrubs over 1 year ago
Really wish AMD could hurry up and implement a transparent module for GPU computing into LLVM.

It's not good that GPUs are this opaque.
dekhn over 1 year ago
These are all the same arguments that wrongly predicted that DNA sequencing would overtake hard drive storage capacity.

People aren't going to do truly uneconomic things just to scale language models exponentially.
marsissippi over 1 year ago
Interesting pull quote:

GPT-4 needed about 50 gigawatt-hours of energy to train. Using our scaling factor of 30x, we expect GPT-5 to need 1,500, GPT-6 to need 45,000, and GPT-7 to need 1.3 million.
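For reference, a quick sketch (hypothetical, not from the article) reproducing the quoted arithmetic by compounding GPT-4's ~50 GWh by the article's 30x-per-generation factor:

```python
# Compound GPT-4's ~50 GWh training energy by 30x per generation,
# as in the pull quote above.
energy_gwh = 50.0  # GPT-4, per the quote
for model in ("GPT-5", "GPT-6", "GPT-7"):
    energy_gwh *= 30
    print(f"{model}: ~{energy_gwh:,.0f} GWh")
# GPT-5: ~1,500 GWh; GPT-6: ~45,000 GWh; GPT-7: ~1,350,000 GWh (~1.3 million)
```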
RecycledEle over 1 year ago
New technologies experience very rapid exponential growth for a while.

This should not be a surprise.