科技回声

科技回声

A tech news platform built with Next.js, providing global tech news and discussion.


Ask HN: Why Don't GPUs Scale?

1 point by miraculixx about 2 months ago
The deeper question is will GPUs get CPU-like features for task-parallelism like virtual cores, pipelines, order rewriting etc?

2 comments

PaulHoule about 2 months ago
My understanding is they are plenty pipelined, though the GPU is working on a more predictable workload so the order is more likely to be rewritten by the compiler than by the silicon -- that is, the CPU tries as hard as it can to maximize single-threaded performance for branchy workloads and "wastes" transistors and power on that, while the GPU expects branches and memory accesses to be more predictable and spends the transistors and power it saves to add more cores.
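The branch tradeoff described above can be sketched with a toy SIMT model (the `simt_if` helper below is purely illustrative, not any real GPU API): every lane in a warp steps through both sides of a divergent branch, with inactive lanes masked off, so a divergent branch costs roughly the sum of both paths rather than one of them.

```python
def simt_if(lanes, cond, then_fn, else_fn):
    """Toy model of SIMT branch execution: the whole warp issues each
    branch path that ANY lane takes, masking off inactive lanes.
    Returns (results, issue_steps)."""
    mask = [cond(x) for x in lanes]
    out = list(lanes)
    steps = 0
    if any(mask):            # then-path issued for the whole warp
        steps += 1
        out = [then_fn(x) if m else x for x, m in zip(out, mask)]
    if not all(mask):        # else-path also issued if any lane needs it
        steps += 1
        out = [x if m else else_fn(x) for x, m in zip(out, mask)]
    return out, steps

# A uniform warp pays for one path; a divergent warp pays for both.
uniform, s1 = simt_if([2, 4, 6], lambda x: x % 2 == 0,
                      lambda x: x * 2, lambda x: x + 1)   # s1 == 1
divergent, s2 = simt_if([1, 2, 3, 4], lambda x: x % 2 == 0,
                        lambda x: x * 2, lambda x: x + 1)  # s2 == 2
```

This is why the CPU's heavy branch-prediction machinery pays off on branchy code, while the GPU instead bets that lanes mostly agree and spends the silicon on more lanes.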
6SixTy about 2 months ago
GPUs do scale because they are parallel processors. Software tools like CUDA and ROCm are very specifically designed for parallel compute on GPU.
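The programming model CUDA and ROCm share can be sketched in miniature (the `launch` and `vector_add_kernel` names here are hypothetical stand-ins, not a real API): the kernel is written for a single element, and the launch fans it out across one thread per index, which is exactly why adding more cores scales the workload.

```python
def vector_add_kernel(i, a, b, c):
    # Per-thread kernel body: each "thread" handles exactly one index,
    # with no dependence on any other thread's work.
    c[i] = a[i] + b[i]

def launch(kernel, n, *args):
    # Stand-in for a GPU grid launch: on real hardware these n
    # invocations run in parallel; here we just loop to model it.
    for i in range(n):
        kernel(i, *args)

a = [1, 2, 3]
b = [10, 20, 30]
c = [0] * 3
launch(vector_add_kernel, 3, a, b, c)
```

Because each index is independent, doubling the core count (ideally) halves the time, which is the sense in which GPUs "scale" and branchy serial code does not.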