OTOY is absolutely at the forefront of digital imagery. They're like MPEG in a Kodak world, that's how different their approach is.<p>How?<p>1) Light fields are 4-dimensional images, and light field video is a 5-dimensional stream. This is a basic requirement for hand- or head-tracked images, as in headsets or AR devices (see the sketch below).<p>2) We're just getting to the point where real-time ray tracing is truly economical, and OTOY is all-in on it. Until now, rendering has been a "bag of tricks" approach, where you try to paint sophisticated imagery onto polygons. Many of those tricks fall apart when you attempt the predictive modeling required for 6DOF streaming: you can see the reflections painted onto the countertop. Ray tracing actually simulates light.<p>3) They've fully embraced the cloud. They're offering everything they do as cloud services, which means it can work on every device, at minimal cost, with no need for customers or users to be on the latest hardware.<p>4) Open formats. They're not trying to build a portal the way Oculus or Valve is; they're inventing the content pipeline and getting it integrated everywhere they can. I am skeptical that any of the closed content stores will win. We saw how big the web became, and a better bet is that the metaverse will look more like the web than the App Store — that's the bet OTOY is making.<p>Urbach has been working relentlessly on this vision behind the scenes. Not a lot has been coming out of the company, but I've been watching him lay the groundwork for the whole next generation of content distribution for 5 years now, and he's killing it. Release after release of core building blocks.
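To make point (1) concrete: a common way to think about a light field is the two-plane parameterization, where each ray is indexed by where it crosses a camera plane (u, v) and an image plane (s, t); video adds a time axis on top, which is where the 4D/5D counts come from. Below is a minimal sketch of that indexing in Python — the array shapes and the nearest-neighbor sampling are illustrative assumptions, not how OTOY actually stores or renders anything.

```python
import numpy as np

# Hypothetical capture: a 9x9 grid of viewpoints, 64x64 pixels each,
# 2 frames of video, RGB. Shapes are illustrative only.
n_frames, n_v, n_u, height, width = 2, 9, 9, 64, 64
light_field_video = np.zeros((n_frames, n_v, n_u, height, width, 3), dtype=np.uint8)

def sample_ray(frame, u, v, s, t):
    """Radiance along one ray: snap to the nearest captured viewpoint (u, v)
    and the nearest pixel (s, t) on its image plane. Real renderers
    interpolate across all four dimensions instead of snapping."""
    return light_field_video[frame, round(v), round(u), round(t), round(s)]

# Head or hand tracking means re-sampling the same data from a new (u, v)
# as the viewer moves -- which is why a flat 2D video stream is not enough
# for 6DOF viewing.
center_view = light_field_video[0, 4, 4]  # full image from the grid's center viewpoint
```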
OTOY is contributing the ORBX container and render graph system to MPEG-I Part 2 at MPEG 120 under a 'tier 1' license (equivalent to the MIT license). Paid licenses and patents IMO should not be in the baseline minimal schema/scene graph system, or we will never get to a truly open metaverse. Whenever this issue came up at MPEG 119 last month, I made as strong a case as I could that plenty of value and IP can still be implemented as services or modules on top of such an open framework.<p>Here is one of the two ORBX docs from MPEG 119; the other (which has the full container schema) I'll post shortly.<p><a href="https://home.otoy.com/wp-content/uploads/2017/08/m41018-An-introduction-to-ORBXJU-6-ATH.pdf" rel="nofollow">https://home.otoy.com/wp-content/uploads/2017/08/m41018-An-i...</a>
I find everything I read about OTOY's stuff (seemingly purposefully) confusing.<p>They seem to have a render farm, a light field format and renderer, and a streaming video format, which they always mix together in their demos.<p>"Look at these amazing renders. It runs on a phone!"*<p>*It's actually just streaming video to the phone.<p>I'm pretty excited about this stuff, but I keep finding myself frustrated trying to crack through OTOY's marketing to get my hands on something I can try myself.<p>Can someone please break my incredulity?
At 13:45 he talks about:<p>1: streaming a Unity/Unreal game into a surface texture<p>2: packaging an entire Unity project into an ORBX file<p>So... am I understanding this right: that ORBX can contain not just a light field, but all the assets and logic for a game, compiled to LuaJIT, which the ORBX player (or orbx.js) will then play? And Unity can target this for output?
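For what it's worth, the general idea being asked about — one portable container carrying both media assets and interpreted logic that a generic player executes — can be sketched in a few lines. This is a purely hypothetical toy, not the ORBX schema (see the MPEG documents linked upthread for that), and it uses Python rather than LuaJIT just to keep the sketch self-contained.

```python
# Hypothetical sketch only: a single package bundling assets plus interpreted
# game logic, executed by a generic "player". NOT the real ORBX layout.
import json
import zipfile

def build_package(path):
    """Write a toy package: a manifest, one stand-in asset, and a logic script."""
    with zipfile.ZipFile(path, "w") as pkg:
        pkg.writestr("manifest.json", json.dumps({
            "assets": ["scene.bin"],      # meshes, light fields, textures...
            "entry_point": "main.py",     # logic the player will run
        }))
        pkg.writestr("scene.bin", b"\x00" * 16)  # stand-in for real scene data
        pkg.writestr("main.py", "def on_frame(t):\n    return {'camera_yaw': t * 0.1}\n")

def run_package(path):
    """A toy 'player': read the manifest, load the logic, call it per frame."""
    with zipfile.ZipFile(path) as pkg:
        manifest = json.loads(pkg.read("manifest.json"))
        scope = {}
        exec(pkg.read(manifest["entry_point"]), scope)  # a real player would sandbox this
        return [scope["on_frame"](t) for t in range(3)]

build_package("demo_pkg.zip")
print(run_package("demo_pkg.zip"))
```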
What about real video recording to generate real VIDEO light fields?<p>Also: how much processing is required to get video light fields from video recording (with an array of cameras, I suppose)? I mean: does it scale?
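On the scaling question, a back-of-envelope estimate of the raw data rate coming off a multi-camera rig gives a sense of why this is heavy. All of the numbers below are assumptions chosen for illustration, not any particular product's specs.

```python
# Rough, illustrative arithmetic for the "does it scale?" question: raw data
# rate of a hypothetical camera array used for light field video capture.
cameras       = 24           # e.g., a ring or hemisphere of machine-vision cameras
width, height = 2048, 1536   # pixels per camera
fps           = 60
bits_per_px   = 24           # 8-bit RGB, before any compression

raw_bits_per_second = cameras * width * height * fps * bits_per_px
print(f"Raw capture rate: {raw_bits_per_second / 8e9:.1f} GB/s")  # ~13.6 GB/s

# Beyond ingest, each frame set typically needs calibration, rectification,
# and depth / view interpolation before it becomes a usable light field,
# which is why this kind of processing tends to happen offline or on a
# render farm rather than in real time on the capture device.
```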