
Nvidia Grace CPU

418 points by intull about 3 years ago

17 comments

oofbey about 3 years ago
NVIDIA continues to vertically integrate their datacenter offerings. They bought Mellanox to get InfiniBand. They tried to buy ARM - that didn't work. But they're building & bundling CPUs anyway. I guess when you're so far ahead on the compute side, it's all the peripherals that hold you back, so they're putting together a complete solution.
andrewstuart about 3 years ago
This leads me to wonder about the microprocessor shortage.
So many computing devices such as Nvidia Jetson and Raspberry Pi are simply not available anywhere. I wonder what's the point of bringing out new products when existing products can't be purchased? Won't the new products also simply not be available?
ksec about 3 years ago
This is interesting. Without actually targeting a specific cloud / server market for their CPU, which often ends in a chicken-and-egg problem with hyperscalers making their own designs or chips, Nvidia managed to enter the server CPU market by leveraging their GPU and AI workloads.
All of a sudden there is a real choice of ARM CPUs on servers. (What will happen to Ampere?) The LPDDR5X used here will also be the first to come with ECC. And they can cross-sell those with Nvidia's ConnectX-7 SmartNICs.
Hopefully it will be price competitive.
Edit: Rather than downvoting, maybe explain why or what you disagree with?
luxuryballs about 3 years ago
Reading this makes a veteran software developer want to become a scientific researcher.
donatj about 3 years ago
Maybe it's just me, but it's just cool to see the CPU market competitive again for the first time since the late 90s.
20220322-beans about 3 years ago
What are people's experiences of developing with NVIDIA? I know what Linus thinks: https://www.youtube.com/watch?v=iYWzMvlj2RQ
donkeydoug about 3 years ago
soooo... would something like this be a viable option for a non-Mac desktop similar to the 'Mac Studio'? def seems targeted at the cloud vendors and large labs... but it'd be great to have a box like that which could run Linux.
marcodiego about 3 years ago
Time to sell Intel shares?
kcb about 3 years ago
Given how larger non-mobile chips are jumping to the LPDDR standard, what is the point of having a separate DDR standard? Is there something about LPDDR5 that makes upgradable DIMMs not possible?
didip about 3 years ago
heh, does Intel have any chance to catch up? They fell so far behind.
valine about 3 years ago
Anyone have a sense for how much these will cost? Is this more akin to the Mac Studio that costs 4k or an A100 GPU that costs upward of 30k? Looking for an order of magnitude.
t0mas88 about 3 years ago
How likely is it that one of AWS / GCP / Azure will deploy these? Nvidia has some relationships there for the A100 chips.
GIFtheory about 3 years ago
Interesting that this has 7x the cores of an M1 Ultra, but only 25% more memory bandwidth. Those will be some thirsty cores!
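[A rough back-of-envelope check of the per-core bandwidth gap GIFtheory is pointing at. The figures below are assumptions based on commonly cited specs, not taken from the comment: roughly 20 CPU cores and ~800 GB/s for the M1 Ultra versus ~144 cores and ~1 TB/s for Grace.]

# Back-of-envelope per-core memory bandwidth, using assumed figures:
# M1 Ultra: ~20 CPU cores at ~800 GB/s; Grace: ~144 cores at ~1000 GB/s.
chips = {
    "M1 Ultra": {"cores": 20, "bandwidth_gb_s": 800},
    "Grace": {"cores": 144, "bandwidth_gb_s": 1000},
}
for name, spec in chips.items():
    per_core = spec["bandwidth_gb_s"] / spec["cores"]
    print(f"{name}: ~{per_core:.1f} GB/s per core")
# Prints roughly 40.0 GB/s per core for the M1 Ultra and 6.9 GB/s per core for Grace.

[Under those assumptions each Grace core sees roughly a sixth of the per-core bandwidth of an M1 Ultra core, which is what "thirsty cores" amounts to here.]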
userbinator about 3 years ago
Who bets that the amount of detailed information they'll officially[1] release about it is "none" or close to that? I still think of Torvalds' classic video whenever I hear about nVidia. The last thing the world needs is more proprietary crap that's probably destined to become un-reusable e-waste in less than a decade.
[1] https://news.ycombinator.com/item?id=30550028
rsynnott about 3 years ago
> NVIDIA Grace Hopper Superchip
Finally, a computer optimised for COBOL.
bullen about 3 years ago
I think we're all missing the forest because all the cores are in the way:
The contention on that memory means that only segregated, non-cooperative work (as in not "joint parallel on the same memory atomic") will scale on this hardware better than on a 4-core vanilla Xeon from 2018 per watt.
So you might as well buy 20 Jetson Nanos and connect them over the network.
Let that sink in... NOTHING is improving at all... there is ZERO point to any hardware that CAN be released for eternity at this point.
Time to learn JavaSE and roll up those sleeves... electricity prices are never coming down (in real terms) no matter how high the interest rate.
As for GPUs, I'm calling it now: nothing will dethrone the 1030 in Gflops/W in general and below 30W in particular; DDR4 or DDR5, doesn't matter.
Memory has been the latency bottleneck since DDR3.
Please respect the comment-on-downvote principle. Otherwise you don't really exist; in a quantum physical way, anyway.
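[For context on the Gflops/W baseline bullen sets, a quick sketch using the GT 1030's commonly listed specs; these numbers are assumptions here, not figures from the comment: roughly 1.1 TFLOPS peak FP32 at about 30 W board power.]

# Rough efficiency baseline for the GT 1030, using assumed specs:
# ~1.1 TFLOPS peak FP32 and ~30 W board power.
peak_gflops = 1100.0
board_power_w = 30.0
print(f"GT 1030: ~{peak_gflops / board_power_w:.0f} Gflops/W peak FP32")
# Prints roughly 37 Gflops/W, the bar the comment claims nothing below 30 W will beat.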
cjensen about 3 years ago
"Grace?"
After 13 microarchitectures given the last names of historical figures, it's really weird to use someone's first name. Interesting that AnandTech and Wikipedia are both calling it Hopper. What on Earth are the marketing bros thinking?