
Tim Sweeney: The end of the GPU roadmap [pdf]

64 points | by simonb | almost 16 years ago | 6 comments

nvoorhies · almost 16 years ago

One thing that didn't seem to be addressed in the slides themselves is the reason that GPUs have fixed-function texturing operations, a fixed pipeline stage configuration, etc.: it saves on memory bandwidth by making the memory accesses more coherent and dramatically magnifying the usefulness of relatively small caches.

Basically this is a claim that there's a massive amount of graphical quality that will be unleashed once we're unchained from the tyranny of the fixed-function GPU behavior that remains, and that this is worth whatever loss in computational power we'll incur when we increase the sizes of caches to cover the less-coherent memory access.

I'm not sure if I buy that. Games look quite nice already, and I'm not sure I could personally tell the difference between what we see in something like Crysis and something like Toy Story, which didn't have the shackles of fixed-function pipelines.

An easier explanation, and long a pet theory of mine, is that Larrabee is essentially a thread-parallel, FP-heavy processor for the scientific market, motivated by GPGPU's encroachment on that market segment, and necessarily labeled a "CPU/GPU" in order not to step on toes in the wrong places at Intel.

It's definitely going to be an interesting next couple of years in graphics, though.
noss · almost 16 years ago

"Load 16-wide vector register from scalars from 16 independent memory addresses, where the addresses are stored in a vector!"

Won't that cause 16 memory reads that are much costlier than whatever one gains from a 16-way SIMD ALU operation?

What I wanted when I tried to write highly optimized computer-vision code was to have multiple "cursors" in memory that I would read from and increment, i.e. the hardware would prefetch data and my code would operate as if the data were a stream.
jacquesm · almost 16 years ago

That's a really great presentation!

One observation, though: what is sold as a 'game' today is more of an interactive movie.

As for Larrabee, I can't wait until it comes out. That chip is going to give NVIDIA serious headaches.

Lots of good stuff in there. I just skimmed it; I'll go back later to read it again.
hypermatt · almost 16 years ago

The Epic Games guys always have the most interesting presentations; they seem more up on current software techniques like STM and functional programming, which I don't hear a lot of other game companies talk about.
akamaka · almost 16 years ago

It's particularly interesting to read Sweeney's views on this subject, considering that he's been talking about the shift back to software rendering since 2000 or so, when GPUs started becoming programmable.

Check out this old interview: http://archive.gamespy.com/legacy/interviews/sweeney.shtm

"2006-7: CPUs become so fast and powerful that 3D hardware will be only marginally beneficial for rendering relative to the limits of the human visual system, therefore 3D chips will likely be deemed a waste of silicon (and more expensive bus plumbing), so the world will transition back to software-driven rendering... If this is the case, then the 3D hardware revolution sparked by 3dfx in 1997 will prove to only be a 10-year hiatus from the natural evolution of CPU-driven rendering."

His timeline was off by a few years, but I think he basically had the right idea all along.
peripitea · almost 16 years ago

The return to CPU-based rendering might do wonders for PC gaming's popularity. It hadn't occurred to me until I read this, but it seems like the decline of PC gaming has mapped fairly closely to the increasing reliance on top-end graphics cards. There is probably a huge slice of people at the margins who have shunned PC gaming (actively or passively) because of the various burdens that requiring a graphics card adds.