Note this is not about WebGL, but about WebGPU, Apple's proposal, which it claims "generally offers better performance" than OpenGL/WebGL.

I'm not an expert in this area, but isn't it a bad idea to introduce another standard when WebGL already has great cross-browser and cross-platform support? It reminds me of what people have been railing against Microsoft for, for years...
Is Apple's WebGPU proposal still being actively developed? I haven't seen any signs of life since the initial reveal nearly a year ago, nor from Mozilla's Obsidian proposal for that matter.

At least from an outsider's perspective, the only active proposal for a WebGL successor seems to be Google's NXT: https://github.com/google/nxt-standalone
I'd rather see something like WebCL back in development for general-purpose GPU computation on the web (most importantly deep learning) than the Xth attempt at superseding WebGL.
Funneling compute through WebGL just doesn't feel right, especially since proper OES_texture_float support isn't even guaranteed there.

https://www.khronos.org/webcl/
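For context, here is a minimal sketch of the capability probing any GPGPU-via-WebGL library has to do before it can even start. Both extension names are real WebGL 1 extensions, but their availability varies by browser and GPU, which is exactly the problem:

```typescript
// Probe for float-texture support before attempting GPGPU through WebGL 1.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");
if (!gl) throw new Error("WebGL unavailable");

// OES_texture_float allows uploading 32-bit float textures (compute inputs).
const floatTextures = gl.getExtension("OES_texture_float");
// WEBGL_color_buffer_float is needed to render *into* a float texture,
// i.e. to get full-precision results back out.
const floatRenderTargets = gl.getExtension("WEBGL_color_buffer_float");

if (!floatTextures || !floatRenderTargets) {
  // Common fallback: pack floats into RGBA8 textures -- slow and lossy.
  console.warn("Float textures unsupported; GPGPU path degraded");
}
```

A dedicated compute API like WebCL would make this kind of fragile feature detection unnecessary.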
I peeked at the source code of the shaders; it looks like Metal for the web.
I don't get how a closed API like Metal could be the future of an open internet. Still, it can start a discussion.
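For those who haven't looked: the demos hand Metal Shading Language source to the JavaScript API as a plain string. A rough sketch of that shape below; the API names (getContext("webgpu"), createLibrary, functionWithName) are recalled from Apple's early Safari Tech Preview prototype and should be treated as illustrative assumptions, not the final standard:

```typescript
// Sketch of Apple's early WebGPU prototype shape: Metal Shading Language
// source compiled at runtime from a string. API names are assumptions
// based on the 2017 prototype demos.
const shaderSource = `
  #include <metal_stdlib>
  using namespace metal;

  fragment float4 fragment_main(float4 pos [[position]]) {
      return float4(1.0, 0.0, 0.0, 1.0); // solid red
  }
`;

const canvas = document.querySelector("canvas") as HTMLCanvasElement;
const gpu = (canvas as any).getContext("webgpu"); // prototype-only context
const library = gpu.createLibrary(shaderSource);  // compiles MSL on the fly
const fragmentFn = library.functionWithName("fragment_main");
```

That shader string is MSL, not GLSL, which is what makes the demo feel like "Metal for the web" rather than a vendor-neutral design.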
This is amazing - I've never run demos like this in a browser (i.e. WebGL) that didn't immediately spin up all the fans with 100%+ CPU usage. I ran the Shadertoy demo on a 4K monitor in full-screen mode, and CPU (as well as energy usage) on my MacBook Pro was minimal, if noticeable at all. (Edited: which means it's using only the GPU... that's the whole point here.)
Would love to see some kind of status update on WebGPU from any of the participants. Is this the latest?

https://lists.w3.org/Archives/Public/public-gpu/2017Sep/0015.html