Part 1 of an ongoing 4-part write-up. Alongside the series he has released an alpha preview of his open-source Oculus tracking and VR framework; see <a href="http://github.com/Doc-Ok/OpticalTracking" rel="nofollow">http://github.com/Doc-Ok/OpticalTracking</a>.<p>tl;dr: The Oculus positional tracking system uses 40 IR LEDs, each flashing a 10-bit cyclic code of bright and dim pulses. The tracking camera is synced to the headset and can identify each LED from its code, which elegantly simplifies recovering the headset's 3D pose (the 'Perspective-n-Point' problem in computer vision). This gives very good absolute position and orientation tracking; however, it degrades when only coplanar LEDs are visible. That can be overcome by fusing in the IMU data from the Oculus, which will be the contents of part four when it is written up.<p>Oliver Kreylos does a great job of explaining computer vision problems. I really enjoyed reading these posts, and others on his blog. He's very well qualified to write about these topics, too (see his bio here: <a href="http://doc-ok.org/?page_id=6" rel="nofollow">http://doc-ok.org/?page_id=6</a> - he has been doing research in this area for years).
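For anyone curious what the 'simplified' PnP step looks like once the blink codes have identified each LED, here's a minimal Python sketch using OpenCV's solvePnP. To be clear, this is not Kreylos's implementation (his is C++, in the repo above), and the LED model coordinates, pixel detections, and camera intrinsics below are all placeholder values:

    import numpy as np
    import cv2

    # 3D positions of some IR LEDs in the headset's model frame
    # (placeholder values; the real DK2 has 40 LEDs on its shell).
    model_points = np.array([
        [-0.10,  0.03, 0.00],
        [ 0.10,  0.03, 0.00],
        [ 0.00, -0.05, 0.04],
        [-0.06,  0.00, 0.06],
        [ 0.06,  0.00, 0.06],
        [ 0.00,  0.06, 0.02],
    ], dtype=np.float64)

    # 2D pixel positions of the same LEDs in the camera image,
    # matched to the model points by their decoded 10-bit blink
    # IDs (placeholder detections).
    image_points = np.array([
        [310.0, 240.0],
        [410.0, 242.0],
        [360.0, 300.0],
        [330.0, 200.0],
        [390.0, 201.0],
        [358.0, 180.0],
    ], dtype=np.float64)

    # Intrinsics of the tracking camera (assumed pre-calibrated).
    camera_matrix = np.array([
        [700.0,   0.0, 320.0],
        [  0.0, 700.0, 240.0],
        [  0.0,   0.0,   1.0],
    ])
    dist_coeffs = np.zeros(5)  # assume distortion already removed

    # Because the blink codes hand us the 3D<->2D correspondences
    # directly, a single solvePnP call recovers the headset pose;
    # no RANSAC search over unknown associations is needed.
    ok, rvec, tvec = cv2.solvePnP(model_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if ok:
        rot, _ = cv2.Rodrigues(rvec)  # 3x3 rotation matrix
        print("headset position (camera frame):", tvec.ravel())

It also makes the coplanar failure mode easy to see: if every visible LED lies in one plane (e.g. only the front face of the headset is in view), the PnP solution becomes poorly constrained in depth and tilt, which is why the IMU fusion in part four matters.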
Interesting. I think Valve's solution is probably the most elegant one I've seen so far: <a href="http://www.hizook.com/blog/2015/05/17/valves-lighthouse-tracking-system-may-be-big-news-robotics" rel="nofollow">http://www.hizook.com/blog/2015/05/17/valves-lighthouse-trac...</a>