
Best GPU for Deep Learning in 2022 (so far)

15 points by janedoegrrr about 3 years ago

2 comments

ZeroCool2u about 3 years ago
I still remember seeing the announcement of the 3090 during the keynote presentation. People that aren't waist deep in this area and haven't struggled to get a GPU that can actually handle large (language, in my case) models didn't realize how great a deal the 3090 was. 24 GB of high speed memory in your desktop GPU for ~$2k is just a remarkably cheap increase to productivity. If your access to cloud resources is constrained, it doesn't need to be the absolute fastest GPU, but you do absolutely need the model to fit in memory, otherwise your training problem typically remains intractable.

Getting one for MSRP is a different story, but I was lucky enough to get one from the EVGA queue just a couple months after release for right around $2k. Interestingly, it also might be my most slowly depreciating piece of computer hardware ever. It's just so overpowered for a consumer, I probably won't need a new GPU for the better part of a decade.
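To put the "fit in memory" point in numbers: training with Adam keeps the weights, the gradients, and two optimizer moment tensors resident at once. Here is a minimal back-of-the-envelope sketch, assuming plain fp32 training and ignoring activations, buffers, and framework overhead (so real usage is strictly higher); the model sizes are illustrative:

```python
# Rough GPU memory estimate for training with Adam in fp32.
# Resident tensors: weights, gradients, and Adam's two moment estimates.
# Activations and framework overhead are ignored, so this is a lower bound.

def training_memory_gb(num_params: float, bytes_per_param: int = 4) -> float:
    """Estimate resident training memory in GiB for an Adam-trained model."""
    tensors = 4  # weights + gradients + first moment + second moment
    return num_params * bytes_per_param * tensors / 1024**3

for name, n in [("GPT-2 class (1.5B)", 1.5e9), ("1B params", 1e9), ("3B params", 3e9)]:
    print(f"{name}: ~{training_memory_gb(n):.1f} GiB before activations")
```

Under these assumptions, a ~1.5B-parameter model already needs roughly 22 GiB just for weights and optimizer state, which is why the 3090's 24 GB is the dividing line between "trainable on a desktop" and "intractable without cloud hardware" for models in that range.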
spupe about 3 years ago
Never thought I would see "3090" and "most cost-effective" in the same sentence, and I own one.