
Meta MTIA v2 – Meta Training and Inference Accelerator

189 points by _yo2u, about 1 year ago

13 comments

jsheard, about 1 year ago
I like the interactive 3D widget showing off the chip. Yep, that sure is a metal rectangle.
modeless, about 1 year ago
Intel Gaudi 3 has more interconnect bandwidth than this has memory bandwidth. By a lot. I guess they can't be fairly compared without knowing the TCO for each. I know in the past Google's TPU per-chip specs lagged Nvidia but the much lower TCO made them a slam dunk for Google's inference workloads. But this seems pretty far behind the state of the art. No FP8 either.
mlsu, about 1 year ago
Certainly an interesting looking chip. It looks like it's for recommendation workloads. Are those workloads very specific, or is there a possibility to run more general inference (image, language, etc.) on this accelerator?

And, they mention a compiler in PyTorch — is that open sourced? I really liked the Google Coral chips; they are perfect little chips for running image recognition and bounding box tasks. But since the compiler is closed source it's impossible to extend them beyond what Google had in mind for them when they came out in 2018, and they are completely tied to TensorFlow, with a very risky software support story going forward (it's a Google product after all).

Is it the same story for this chip?
chessgecko, about 1 year ago
I thought MTIA v2 would use the MX formats (https://arxiv.org/pdf/2302.08007.pdf); guess they were too far along in the process to get it in this time.

Still, this looks like it would make for an amazing prosumer home AI setup. Could probably fit 12 accelerators on a wall outlet with change for a CPU, would have enough memory to serve a 2T model at 4-bit, and reasonable dense performance for small training runs and image stuff. Potentially not costing too much to make either, without having to pay for CoWoS or HBM.

I'd definitely buy one if they ever decided to sell it and could keep the price under like $800/accelerator.
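The wall-outlet claim above is easy to sanity-check. Note the per-accelerator figures below are assumptions taken from Meta's MTIA v2 announcement (~90 W TDP, 128 GB LPDDR5), not from the comment itself, and the outlet budget assumes a standard 15 A / 120 V US circuit:

```python
# Back-of-envelope check: 12 MTIA v2 accelerators on one wall outlet,
# serving a 2T-parameter model at 4 bits per parameter.
NUM_ACCELERATORS = 12
TDP_WATTS = 90           # assumed per-accelerator TDP (Meta's announced figure)
MEM_GB = 128             # assumed per-accelerator LPDDR5 capacity
OUTLET_WATTS = 15 * 120  # 15 A x 120 V US wall outlet = 1800 W

total_power = NUM_ACCELERATORS * TDP_WATTS   # 1080 W
cpu_headroom = OUTLET_WATTS - total_power    # 720 W of "change for a CPU"
total_mem_gb = NUM_ACCELERATORS * MEM_GB     # 1536 GB aggregate memory

# 2T parameters at 4 bits/param -> bytes -> GB
model_gb = 2_000_000_000_000 * 4 / 8 / 1e9   # 1000 GB of weights

print(total_power, cpu_headroom, total_mem_gb, model_gb <= total_mem_gb)
```

Under those assumptions the claim holds: roughly 1.1 kW of accelerators with ~700 W to spare, and ~1.5 TB of memory against ~1 TB of 4-bit weights (leaving room for KV cache and activations).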
teaearlgraycold, about 1 year ago
Still seems pretty primitive. Very cool though.

I can only imagine the lack of fear Jensen experiences when reading this.
prng2021, about 1 year ago
3x performance but >3x TDP. Am I missing something, or is that unimpressive?
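A quick sketch of the arithmetic behind this complaint. The TDP figures are assumptions: ~25 W for MTIA v1 (a number that also appears elsewhere in this thread) and ~90 W for MTIA v2; the 3x performance gain is taken from the comment itself:

```python
# Perf-per-watt comparison under assumed TDPs for MTIA v1 and v2.
V1_TDP = 25.0   # watts, assumed MTIA v1 TDP
V2_TDP = 90.0   # watts, assumed MTIA v2 TDP
PERF_GAIN = 3.0 # "3x performance" per the comment

tdp_ratio = V2_TDP / V1_TDP          # power grew 3.6x
perf_per_watt = PERF_GAIN / tdp_ratio
print(round(tdp_ratio, 2), round(perf_per_watt, 2))  # 3.6 0.83
```

Under those assumptions v2 delivers ~0.83x the perf/watt of v1, i.e. slightly worse efficiency per chip, which is the point the commenter is making.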
jrgd, about 1 year ago
I find it weird that not everyone agrees Meta, Facebook, and social networks in general are doing any good for society and our democracies; yet they manage to spend incredible amounts of money/energy/time developing solutions to problems we aren't exactly sure are worth solving…
duchenne, about 1 year ago
Is it possible to buy it?
ein0p, about 1 year ago
Come on, Zuck, undermine Google Cloud and take NVIDIA down a few pegs by offering this for purchase in good quantities.
sroussey, about 1 year ago
Pretty large increase in performance over v1, particularly in sparse workloads.

Low power: 25W.

Could use higher bandwidth memory if their workloads were more than recommendation engines.
throwaway48476, about 1 year ago
It's interesting that they are not separating training and inference.
xnx, about 1 year ago
My mind still boggles that a BBS+ads company would think it needs to design its own chips.
bevekspldnw, about 1 year ago
Pretty fascinating they mention applications for ad serving but not Metaverse.

I feel like Zuck figured out he's just running an ads network, that the world is a long way away from some VR fever dream, and that he should focus on milking each DAU for as many clicks as possible.