TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Ask HN: What is everyone doing with all of these GPUs?

11 points, by formercoder, 10 months ago
I know accelerator demand is blowing up. However, only a few players are training foundation models. What are the core use cases everyone else has for all of these accelerators? Fine-tuning, smaller transformer models, general growth in deep learning?

3 comments

lafeoooooo, 10 months ago
Different scenarios have varying demands for GPU types. For tasks like model inference or basic operations, a CPU or even on-device solutions (mobile, web) might suffice.

When a GPU is necessary, common choices include T4, 3090, P10, V100, etc., selected based on factors like price, required computing power, and memory capacity.

Model training also has diverse needs depending on the specific task. For basic, general-purpose vision tasks, 1 to 50 cards like the 3090 might be enough. However, cutting-edge areas like visual generation and LLMs often require A100s or A800s, scaling from 1 to even thousands of cards.
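The memory-capacity factor mentioned above can be made concrete with a back-of-the-envelope calculation: weights alone need roughly (parameter count × bytes per parameter), plus some headroom for activations or KV cache. A minimal sketch (the function name, the 1.2× overhead factor, and the example model size are illustrative assumptions, not figures from the thread):

```python
def estimate_vram_gb(num_params: float, bytes_per_param: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM (GB) to serve a model for inference.

    overhead is an assumed fudge factor for activations / KV cache;
    real requirements vary with batch size and sequence length.
    """
    return num_params * bytes_per_param * overhead / 1e9

# A 7B-parameter model in fp16 (2 bytes per parameter):
print(round(estimate_vram_gb(7e9, 2), 1))  # ~16.8 GB
```

By this rough estimate, a 7B fp16 model would not fit on a 16 GB T4 but would fit comfortably on a 24 GB 3090 or a 32 GB V100, which is the kind of trade-off the comment describes.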
talldayo, 10 months ago
Inference. 99% of the customers that aren't buying GPUs to train on are either using them for inference or putting them in a datacenter where inference is the intended use case.
the__alchemist, 10 months ago
I'm playing UE5 games and doing some computational chemistry with CUDA.