Interactive raytracing is the future both on mobile and desktop. My project <a href="http://clara.io" rel="nofollow">http://clara.io</a>, an online 3D modeler + renderer, recently gained the ability to embed interactive ray-traced 3D scenes using V-Ray (one of the best and most accurate renderers in the world).<p>It works best on Chrome, Firefox and Safari:<p><a href="https://plus.google.com/u/0/+BenHouston3D/posts/DYq2RKJENC5" rel="nofollow">https://plus.google.com/u/0/+BenHouston3D/posts/DYq2RKJENC5</a><p>Here is another example:<p><a href="https://twitter.com/exocortexcom/status/443538733661704192" rel="nofollow">https://twitter.com/exocortexcom/status/443538733661704192</a><p>I can only see raytracing becoming more popular.
John Carmack is a fan: "I am very happy with the advent of the PVR Wizard ray tracing tech. RTRT HW from people with a clue!"
<a href="https://twitter.com/ID_AA_Carmack/status/446021820290842624" rel="nofollow">https://twitter.com/ID_AA_Carmack/status/446021820290842624</a>
It's best to remember that ray tracing isn't the be-all and end-all of graphics, since it's far harder for artists to control the results than with the usual hacked-up techniques. One consequence shows in their example comparison: I think it's entirely subjective to say the ray-traced output is better. Neither is particularly great.<p>The big win is around indirect lighting, but the development of that in standard rasterisers over the last decade has exceeded even my wildly optimistic expectations.<p>New options = good, but there's no such thing as a graphics silver bullet.
What's the difference between ray tracing and path tracing? And could we have a path tracing chip?<p>This demo was pretty impressive. I think they said they used 4 Titans at 720p.<p><a href="https://www.youtube.com/watch?v=aKqxonOrl4Q" rel="nofollow">https://www.youtube.com/watch?v=aKqxonOrl4Q</a>
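Roughly: classic (Whitted) ray tracing fires a fixed, deterministic set of reflection/refraction/shadow rays at each hit, while path tracing randomly samples one bounce direction per hit and averages many noisy samples, which converges to full global illumination. Here is a toy "furnace test" sketch in Python that makes the difference concrete; the setup and all names are my own illustration, not code from either the PowerVR hardware or the Brigade demo:

```python
import random

# Toy "furnace test": every surface in a closed room emits E and
# reflects a fraction A of incoming light, so the true radiance is
# the geometric series E / (1 - A). This is my own illustration,
# not code from either renderer.

E, A = 0.5, 0.5  # emission and albedo of every surface

def whitted(depth=0):
    # Classic (Whitted) ray tracing: a fixed, deterministic recursion
    # that cuts off after a few bounces. Fast and noise-free, but it
    # truncates the tail of indirect light.
    return E if depth == 3 else E + A * whitted(depth + 1)

def path_sample():
    # One path-tracing sample: a single random bounce chain.
    # Russian roulette (continue with probability A) keeps the
    # estimator unbiased while guaranteeing termination.
    radiance = 0.0
    while True:
        radiance += E
        if random.random() >= A:
            return radiance

def render(samples=200_000):
    # Path tracing averages many noisy samples; error falls off as
    # 1/sqrt(samples), which is why real-time path tracing is hard.
    return sum(path_sample() for _ in range(samples)) / samples

print(whitted())           # prints 0.9375: misses the deeper bounces
random.seed(42)
print(round(render(), 2))  # Monte Carlo estimate of E / (1 - A) = 1.0
```

A dedicated path-tracing chip would mostly need the same primitive as this ray-tracing hardware (fast ray/scene intersection); the difference is in how many incoherent rays per pixel you can afford.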
"... the potential that these technologies have to revolutionize the user experience from mobile and console gaming to virtual and augmented reality."<p>I've been interested in ray-tracing since the early '90s, and I'm glad it's finally coming to real-time, but this isn't going to "revolutionize" shit. It's going to make 3D games and VR slightly prettier than they were. It's not going to enable new styles of gameplay or new modes of interaction. We will never again see anything like the enormous forward leaps in realtime graphics that happened during the '90s.<p>I'm also a bit put off by their comparison showing that PowerVR has better reflections, shadows, and transparency than a raster engine with reflections and some shadows turned off and a very poor choice of glass-transparency filter.<p>On another look, I don't even know what they're going for with the shadows. The rasterized image has "NO SHADOWS" printed right between the shadows of a building and a telephone wire, and their hybrid render has the light from the diner windows casting shadows across outside pavement in <i>broad daylight</i>. Bwuh?
So what are the API and libraries like? Is there any sort of built-in OGL fallback, or do devs need to write two completely different renderers? Are there standards relating to this; are any other vendors going to implement the same API? Is this new hardware compatible with the older Caustic2 cards?
It never fails to amaze me how powerful raytracing is. Last year I took an "Image Synthesis" class and did a quick presentation about a post I saw here on HN about a raytracer in 1337 bytes [0]. It is amazing how such a small program can generate an image with depth of field, shadows and texture.<p>0: <a href="http://fabiensanglard.net/rayTracing_back_of_business_card/index.php" rel="nofollow">http://fabiensanglard.net/rayTracing_back_of_business_card/i...</a>
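In that spirit, here is a minimal sketch of the same idea; this is my own toy Python, not Fabien Sanglard's business-card program, but it shows how little code you need for a sphere, a floor, a point light, and hard shadows:

```python
import math

# Toy raytracer: one sphere over a floor plane, one point light,
# hard shadows, Lambert shading, ASCII output. My own sketch, not
# the business-card raytracer from the link.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))
def norm(v):
    l = math.sqrt(dot(v, v))
    return [x / l for x in v]

SPHERE_C, SPHERE_R = (0.0, 1.0, 3.0), 1.0
LIGHT = (2.0, 4.0, 0.0)

def hit_sphere(o, d):
    # nearest positive root of |o + t*d - c|^2 = r^2 (d must be unit length)
    oc = sub(o, SPHERE_C)
    b = dot(oc, d)
    disc = b * b - (dot(oc, oc) - SPHERE_R ** 2)
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def shade(o, d):
    ts = hit_sphere(o, d)
    tf = -o[1] / d[1] if d[1] < 0 else None  # floor plane y = 0
    if ts is not None and (tf is None or ts < tf):
        p = [o[i] + ts * d[i] for i in range(3)]
        n = norm(sub(p, SPHERE_C))
    elif tf is not None:
        p = [o[i] + tf * d[i] for i in range(3)]
        n = [0.0, 1.0, 0.0]
        # shadow ray toward the light (ignoring hits beyond the
        # light, which is fine for this scene)
        if hit_sphere(p, norm(sub(LIGHT, p))) is not None:
            return ' '
    else:
        return ' '  # sky
    lam = max(0.0, dot(n, norm(sub(LIGHT, p))))  # Lambert term
    return ' .:-=+*#%@'[min(9, int(lam * 10))]

def render(w=60, h=24, eye=(0.0, 1.0, -4.0)):
    return '\n'.join(
        ''.join(shade(eye, norm([2 * i / w - 1, 1 - 2 * j / h, 1.5]))
                for i in range(w))
        for j in range(h))

print(render())
```

Everything the article's hardware accelerates is in here in miniature: the expensive part is the intersection tests, which grow with scene complexity.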
<i>high-performance ray tracing, graphics and compute in a power envelope suitable for mobile and embedded use cases.</i><p>If true, color me impressed!
This brings back some memories. Used to do some experiments using POV Ray back in the day.<p>I remember how slow the process was, it could take several hours/days to generate an image full of reflections, but in the end the results were usually stunning...<p>Link: <a href="http://www.povray.org/" rel="nofollow">http://www.povray.org/</a>
I think the examples used are not very impressive. And I think path tracing (which is a kind of ray tracing) is more interesting for the future.<p>A nice blog about real time path tracing is: <a href="http://raytracey.blogspot.nl/" rel="nofollow">http://raytracey.blogspot.nl/</a>
I think the sample frames demonstrate that hybrid ray tracing is less realistic than pure path tracing. I hope that someone figures out how to make stuff like the Brigade 3 demos based just on feeding geometry and textures to hardware.
I'd wager that the current rasterising pipeline is more flexible than one based on raytracing: capable of generating a range of styles, not just photorealistic ones. And therefore having an extra dedicated chip for raytracing seems uneconomical.<p>The comparison examples given in the article were slightly ridiculous. How does a non-reflective car represent 'traditional' rendering? Look at any great AAA game and you'll see reflections, refractions, radiosity, etc., that are all pretty amazing. I don't think that general demand will be there for alternative rendering hardware for quite a while.
Two thoughts: I would love to see a video of a dynamic environment (i.e. actors/objects moving in a scene). Also, how long before cryptocurrency miners use these to gain a step-function advantage over current GPUs?
I thought Wolfenstein 3D was an example of ray tracing, but reading this article and Wikipedia it seems to be all about lighting effects. What do I misunderstand?
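Likely the mix-up: Wolfenstein 3D used ray <i>casting</i>, where one ray per screen column is marched through a flat 2D grid just to find the nearest wall, whereas ray tracing fires per-pixel rays and simulates how light bounces (the reflections and shadows the article is about). A toy sketch of the column-marching idea, in my own Python rather than id Software's code:

```python
import math

# Toy Wolfenstein-style ray caster: one ray per screen COLUMN walks a
# 2D grid until it hits a wall; the distance sets the wall-slice
# height. No light transport is simulated, which is the difference
# from ray tracing. My own sketch, not id's code.

MAP = ["#####",
       "#...#",
       "#.#.#",
       "#...#",
       "#####"]

def cast(px, py, angle, step=0.01, max_dist=10.0):
    # fixed-step march for clarity (real engines use DDA instead)
    dx, dy = math.cos(angle), math.sin(angle)
    d = 0.0
    while d < max_dist:
        d += step
        if MAP[int(py + dy * d)][int(px + dx * d)] == '#':
            return d
    return max_dist

def columns(px=1.5, py=1.5, fov=1.2, width=20):
    # one ray per column; nearer walls give taller slices
    return [min(9, int(9 / cast(px, py, -fov / 2 + fov * c / (width - 1))))
            for c in range(width)]

print(columns())
```

Because the map is a 2D grid, this costs one cheap ray per column instead of one expensive ray (plus bounces) per pixel, which is how it ran on a 286.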
This is incredible! A mobile GPU does 300 million rays per second without using any shading GFLOPS? This is the GPU that is going to be in the next iPhone, right? Brigade 3 does 750 million rays per second on an Nvidia GTX 580 at full power. I just wonder what this thing could do when scaled up.
The biggest problem here is content...
Producing content is the most expensive aspect of a game.<p>Improvements in tooling and reuse are the only way we can properly exploit better rendering.
Is this not from the same guys? <a href="https://www.youtube.com/watch?v=rfgz90Y93c0" rel="nofollow">https://www.youtube.com/watch?v=rfgz90Y93c0</a> (Google tech talk)
"Life-like reflections"[1]?<p>Allow me to disagree: the reflection is too green. Probably not the hardware's fault, or is it?<p>[1] <a href="http://blog.imgtec.com/wp-content/uploads/2014/03/5_-PowerVR-Ray-Tracing-hybrid-rendering-1.jpg" rel="nofollow">http://blog.imgtec.com/wp-content/uploads/2014/03/5_-PowerVR...</a>