I think he's right, but since his company sells engines he's also got a vested interest in the conversion to software rendering coming to pass.<p>It's a hell of a lot easier to build up a passable 3D engine using DirectX + shaders etc. than it will ever be to implement those same algorithms in software from scratch. This will widen the field a lot - the best engine developers will be able to do amazing stuff with the complete control over the rendering pipeline that software 3D provides, so the gap between the best engines and average in-house ones will increase even further.<p>Which is a great thing if you develop and license out one of the top 3D engines on the market.
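To give a feel for what "implementing those same algorithms in software" actually means, here's a minimal sketch of the per-pixel work a shader pipeline normally does for you. The struct names and the Lambert-only lighting are illustrative assumptions, not anyone's actual engine code - and in a real software renderer you'd also own the rasterization loop that produces the inputs:

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Illustrative types - a real engine would have far richer versions.
    struct Vec3 { float x, y, z; };

    static float dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // A software "pixel shader": simple Lambert diffuse, run per pixel on
    // the CPU. On the GPU this loop body is what an HLSL pixel shader
    // expresses; the hardware supplies the loop, interpolation, and writes.
    void shadePixels(std::vector<uint32_t>& framebuffer,
                     const std::vector<Vec3>& normals,  // one normal per pixel
                     const Vec3& lightDir) {
        for (size_t i = 0; i < framebuffer.size(); ++i) {
            float intensity = std::max(0.0f, dot(normals[i], lightDir));
            uint32_t c = static_cast<uint32_t>(intensity * 255.0f);
            framebuffer[i] = (c << 16) | (c << 8) | c;  // grayscale RGB
        }
    }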
A return to software rendering could present an interesting opportunity for Linux. A major barrier to using Linux is the lack of availability of A-list games. If a game no longer requires DirectX, porting it to other operating systems becomes a much easier task. It would not be a magic fix, but it would be one more step toward the possibility of a mass market Linux on the desktop.
<i>...the 3D hardware revolution sparked by 3dfx in 1997 will prove to only be a 10-year hiatus from the natural evolution of CPU-driven rendering.</i><p><a href="http://www.catb.org/jargon/html/W/wheel-of-reincarnation.html" rel="nofollow">http://www.catb.org/jargon/html/W/wheel-of-reincarnation.html</a>
I think it will be interesting to see the variety of rendering paradigms that come from this, but on the other side, the transition looks like a bit of a nightmare - having to produce games with both a DirectX engine and a software one to cope with two different sets of hardware. It would also be nice if game developers put more effort into the gameplay than the graphics, but I guess I'll be waiting a while longer for that...<p>As an aside, has anyone tried getting something like Erlang to run on a GPU? I think it would be useful to be able to harness all the computing power of one (or more) PCs with a common high-level language designed for that purpose.
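On the dual-engine point: one plausible way studios would cope is a backend abstraction, so gameplay code targets a single interface and the game ships with both implementations behind it. A minimal sketch - the interface and class names here are hypothetical, not from any shipping engine:

    #include <memory>

    struct Mesh { /* vertex and index data would live here */ };

    // Hypothetical backend-neutral interface; gameplay code talks only to this.
    class Renderer {
    public:
        virtual ~Renderer() = default;
        virtual void drawMesh(const Mesh& mesh) = 0;
        virtual void present() = 0;
    };

    // One backend wraps the Direct3D device...
    class D3DRenderer : public Renderer {
    public:
        void drawMesh(const Mesh&) override { /* issue Direct3D draw calls */ }
        void present() override { /* swap-chain present */ }
    };

    // ...the other rasterizes on the CPU.
    class SoftwareRenderer : public Renderer {
    public:
        void drawMesh(const Mesh&) override { /* CPU rasterizer path */ }
        void present() override { /* blit framebuffer to the window */ }
    };

    // Chosen once at startup; the rest of the game never knows which it got.
    std::unique_ptr<Renderer> createRenderer(bool hasCapableGpu) {
        if (hasCapableGpu) return std::make_unique<D3DRenderer>();
        return std::make_unique<SoftwareRenderer>();
    }

The cost the comment is worried about is real, though: the abstraction is the easy part, while keeping two rasterizers visually and performance-equivalent is the nightmare.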
This article is a bunch of bunk. Given the current speed advantage of the GPU over the CPU, the more likely scenario is that the GPU will become more generalized, i.e. NVIDIA CUDA.<p>The article is right that we are returning to an earlier paradigm, but it isn't the CPU. It's having to know your hardware's capabilities and how to program it for good performance.
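For a concrete taste of that generalization: CUDA already lets you write ordinary-looking C for the GPU instead of graphics shaders. A minimal (and entirely standard) vector-add sketch, with error handling omitted for brevity:

    #include <cuda_runtime.h>

    // Each GPU thread adds one element - data-parallel C, not a shader.
    __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    void addOnGpu(const float* a, const float* b, float* c, int n) {
        float *dA, *dB, *dC;
        size_t bytes = n * sizeof(float);
        cudaMalloc(&dA, bytes);
        cudaMalloc(&dB, bytes);
        cudaMalloc(&dC, bytes);
        cudaMemcpy(dA, a, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, b, bytes, cudaMemcpyHostToDevice);
        // 256 threads per block; enough blocks to cover n elements.
        vecAdd<<<(n + 255) / 256, 256>>>(dA, dB, dC, n);
        cudaMemcpy(c, dC, bytes, cudaMemcpyDeviceToHost);
        cudaFree(dA); cudaFree(dB); cudaFree(dC);
    }

And note this supports the "earlier paradigm" point: getting good performance out of that kernel still means knowing block sizes, memory coalescing, and the rest of the hardware's quirks.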