AMD adds support for PyTorch development on select RDNA 3 GPUs with ROCm 5.7

65 points by asparagui over 1 year ago

7 comments
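
As context for the discussion below, here is a minimal sketch of how a ROCm build of PyTorch might be sanity-checked on one of the supported RDNA 3 cards. This is an illustrative assumption, not taken from the article or the comments; it relies on the fact that ROCm builds of PyTorch reuse the `torch.cuda` API and set `torch.version.hip`.

```python
# Minimal sketch: verify that a ROCm build of PyTorch sees the GPU.
# Assumes PyTorch was installed from a ROCm wheel index (e.g. a rocm5.7 build).
import torch

print("PyTorch version:", torch.__version__)
print("HIP/ROCm build:", torch.version.hip)        # None on CUDA/CPU-only builds
print("GPU visible:", torch.cuda.is_available())   # ROCm devices appear via the cuda API

if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))  # e.g. an RDNA 3 card
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x                                         # quick matmul to exercise the GPU
    print("Result lives on:", y.device)
```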

NBJack over 1 year ago
This is fantastic progress. I actually like Nvidia, but anything that levels the playing field and encourages competition here would be a big win.
qris over 1 year ago
Related:

2023-06-02: *Lisa Su saved AMD – Now she wants Nvidia's AI crown* — https://news.ycombinator.com/item?id=36164055

2023-08-09: *Making AMD GPUs competitive for LLM inference* — https://news.ycombinator.com/item?id=37066522

2023-09-26: *ROCm is AMD's priority, executive says* — https://news.ycombinator.com/item?id=37663194

2023-10-06: *AMD may get across the CUDA moat* — https://news.ycombinator.com/item?id=37793635
NedCode over 1 year ago
"AMD is determined to continue to make AI more accessible to developers and researchers that benefit from a local client-based setup for ML" — that rings false to me. AMD has been under-investing in exactly that area for many years now.
pjmlp over 1 year ago
Meanwhile, on the CUDA side, you can pick any Nvidia card, not just a couple of selected ones.
Reubend over 1 year ago
Is ROCm only working on Linux for now? And if so, is it working alright inside of WSL?
blovescoffee over 1 year ago
The article mentions 48 GB workflows. Presumably that means two cards attached? What does that look like in the AMD world? I'm used to what Nvidia offers.
manifoldgeo over 1 year ago
ROCm? I hardly KNOWm!

I'm sorry, I know this is a dumb and off-topic comment more appropriate for Reddit, but I couldn't resist. If you check my comment history, you'll see I don't usually do this. Please forgive me.