
World’s biggest chip created to meet demands of AI

10 points by banjo_milkman over 5 years ago

5 comments

pjc50 over 5 years ago
The general approach is called "wafer scale", and it's not new: https://www.extremetech.com/extreme/286073-building-gpus-out-of-entire-wafers-could-turbocharge-performance-efficiency

However, one of the longstanding problems is yield. A whole wafer will have a number of defects on it; this is simply unavoidable. It follows that a wafer-scale system must be able to disable or disconnect faulty subsystems.

The use of this for AI raises the interesting possibility of "learning around" some kinds of defect, although it will still be necessary to disconnect bits with short circuits in them.

It's also quite expensive simply to buy all that area: at least $10k per wafer. You save a bit on packaging and building a carrier PCB for it, but not a great deal.
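A quick sketch of the yield problem pjc50 describes, using the standard Poisson yield model; every number below is an illustrative assumption, not a figure from the comment:

```python
import math

# Poisson yield model: probability that a die of area A (cm^2) contains
# zero defects, given an average defect density D0 (defects per cm^2).
def poisson_yield(area_cm2: float, d0_per_cm2: float) -> float:
    return math.exp(-area_cm2 * d0_per_cm2)

# Illustrative assumptions (not from the comment):
D0 = 0.1            # defects/cm^2, plausible for a mature process
large_die = 8.0     # cm^2, roughly a big GPU-class die
full_wafer = 625.0  # cm^2, rough usable area of a 300 mm wafer

print(f"Large-die yield:   {poisson_yield(large_die, D0):.1%}")   # ~44.9%
print(f"Whole-wafer yield: {poisson_yield(full_wafer, D0):.1e}")  # ~7e-28

# With D0 * area expected defects (~62 here), a monolithic wafer-scale
# part that cannot route around faulty subsystems would essentially
# never come out working, which is pjc50's point.
print(f"Expected defects per wafer: {D0 * full_wafer:.0f}")
```

The whole-wafer yield being effectively zero under these assumptions is why redundancy and the ability to fuse off bad blocks are table stakes for wafer-scale parts.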
AdamJacobMuller over 5 years ago
> It also eats up as much electricity as all the servers contained in one and a half racks

Seems like such a large chip is going to pose issues? 1.5 racks (let's generously say they mean lower-power racks and are perhaps talking about 7.5 kW) in a single chip? Seems like it would require some kind of water block with sub-zero cooling...
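Making the arithmetic in that estimate explicit; the per-server and per-rack figures are assumptions chosen for illustration, not numbers from the article:

```python
# Back-of-the-envelope check of the "1.5 racks of servers" power claim.
# All inputs are illustrative assumptions.
watts_per_server = 500  # assumed draw for a modest 1U server
servers_per_rack = 10   # assumed lightly populated, lower-power rack
racks = 1.5

chip_watts = watts_per_server * servers_per_rack * racks
print(f"Implied chip power: {chip_watts / 1000:.1f} kW")  # 7.5 kW

# ~7.5 kW in a single package is an order of magnitude beyond a high-end
# GPU (~0.3-0.5 kW), which is why the comment reaches for liquid cooling.
```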
rwmj over 5 years ago
Does anyone remember when "wafer-scale integration" was big - in the 1980s? https://en.wikipedia.org/wiki/Wafer-scale_integration
givinguflac over 5 years ago
This title would also make a good sci-fi movie subtitle.
bufferoverflow over 5 years ago
Paywalled.

Try this: http://archive.li/2SOos