
Ask HN: Why Don't GPUs Scale?

1 point by miraculixx about 2 months ago
The deeper question is whether GPUs will get CPU-like features for task parallelism, such as virtual cores, pipelines, instruction reordering, etc.

2 comments

PaulHoule about 2 months ago
My understanding is that they are plenty pipelined, but the GPU works on a more predictable workload, so the order is more likely to be rewritten by the compiler than by the silicon. That is, the CPU tries as hard as it can to maximize single-threaded performance on branchy workloads and "wastes" transistors and power doing so; the GPU expects branches and memory accesses to be more predictable and spends the transistors and power it saves on adding more cores.
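
A minimal CUDA sketch of the trade-off described above (the kernel name, data, and launch sizes are illustrative, not from the thread): threads in a warp execute in lockstep, so a data-dependent branch forces the warp to run both paths serially. This is why GPU hardware skips the CPU's branch-prediction machinery and instead bets on branches being mostly uniform across neighboring threads.

    #include <cuda_runtime.h>
    #include <cstdio>

    // One thread per element; the branch on in[i] is data-dependent.
    __global__ void divergent(const float* in, float* out, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        // Threads in the same 32-wide warp that disagree here force the
        // warp to execute both paths back to back, masking inactive lanes.
        if (in[i] > 0.0f)
            out[i] = in[i] * 2.0f;   // path A
        else
            out[i] = -in[i];         // path B
    }

    int main() {
        const int n = 1 << 20;
        float *in = nullptr, *out = nullptr;
        cudaMallocManaged(&in, n * sizeof(float));
        cudaMallocManaged(&out, n * sizeof(float));
        // Alternating signs are the worst case: every warp diverges.
        for (int i = 0; i < n; ++i) in[i] = (i % 2) ? 1.0f : -1.0f;
        divergent<<<(n + 255) / 256, 256>>>(in, out, n);
        cudaDeviceSynchronize();
        printf("out[0]=%.1f out[1]=%.1f\n", out[0], out[1]);
        cudaFree(in);
        cudaFree(out);
        return 0;
    }

With uniform inputs the branch costs almost nothing; with alternating signs every warp pays for both paths, which is the price of leaving branch handling to the compiler and the programmer rather than to speculative silicon.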
6SixTy about 2 months ago
GPUs do scale because they are parallel processors. Software tools like CUDA and ROCm are specifically designed for parallel compute on GPUs.
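
To make that concrete, here is a minimal sketch of the data-parallel model CUDA is built around (a standard SAXPY kernel; names and sizes are illustrative): the program scales by mapping one lightweight thread to each array element, not by making any single thread faster.

    #include <cuda_runtime.h>
    #include <cstdio>

    // SAXPY: y = a*x + y, one thread per element. Each element is
    // independent, so throughput scales with the number of cores.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x = nullptr, *y = nullptr;
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }
        // Launch enough 256-thread blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);
        cudaDeviceSynchronize();
        printf("y[0] = %.1f (expect 5.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

Because no thread depends on another, a GPU with twice the streaming multiprocessors finishes roughly twice as fast, which is the sense in which GPUs "do scale": across data, not across the features of a single core.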