
Launch HN: Lifecast (YC W22) – 3D video for VR

87 points by fbriggs about 3 years ago
Hi HN, I'm Forrest of Lifecast (https://www.lifecastvr.com), with my co-founder Mateusz. We make software to create 3D video for VR, robotics simulation, and virtual production. We convert any VR180 video or photo into our 6DOF VR video format, or into meshes compatible with Unreal Engine. Our 3D reconstruction is based on computer vision for dual fisheye lenses and deep learning.

VR video can be categorized as 3DOF (three degrees of freedom) or 6DOF (six degrees of freedom). 3DOF responds only to rotation, while 6DOF responds to both rotation and translation, meaning you get to move your head. VR games are 6DOF, but most VR videos are 3DOF. 3DOF can cause motion sickness and eye strain due to incorrect 3D rendering. 6DOF VR video fixes these problems for a more comfortable and immersive experience, but it is harder to make because it requires a 3D model of each frame of video.

There are some prototypes for 6DOF VR video systems in big tech, but they typically involve arrays of many cameras, so they are expensive, not very portable, and generate an impractical amount of data. Because of these challenges, 6DOF hasn't been widely adopted by VR video creators.

In 2015 I was working on ads at Facebook, but I was more excited about VR. I built 3D cameras out of Legos and GoPros, showed some of this at a hackathon, and eventually they let me do that as my day job. I was the first engineer on Facebook's 3D VR camera team, which made Surround 360 (an open-source hardware/software 3D VR camera) and Manifold (a ball of 20+ cameras for 6DOF). After Facebook, I was a tech lead on Lyft's self-driving car project and Google X's everyday robot project.

I started Lifecast because I wasn't satisfied with the progress on 6DOF VR video since I left Facebook. I learned new ideas from robotics which can improve VR video. The Oculus Quest 2 has just enough power to do something interesting with 6DOF. There have also been advances in computer vision and deep learning in the last few years that make it possible to do 6DOF better.

Our software makes it simple to create 6DOF VR video using any VR180 camera. It's a GUI for Mac or Windows which takes VR180 video or photos as input and produces Lifecast's 6DOF VR video format (more info: https://fbriggs.medium.com/6dof-vr-video-from-vr180-cameras-2e17805ef3bc). VR180 video can be created with any VR180 camera; the Canon R5 is one of the best on the market right now. We make a video player for WebVR which runs on desktop, mobile, or VR. Playing the videos on the Quest 2 doesn't require installing any software, just visiting a web page in the Oculus Browser.

In addition to our 6DOF format, the software can also output point clouds (.pcd) or triangle meshes (.obj) compatible with Unreal Engine. We are seeing interest in using this for virtual production (2D filmmaking in a game engine) and for creating environments for robotics simulation.

This recent video review/tutorial does a nice job of explaining our tech: https://www.youtube.com/watch?v=_4a-RnTLu-I (video by Hugh Hou, not us). For something more interactive, the thumbnails on https://lifecastvr.com are links to demos that run in the browser or in VR.

6DOF VR video is one piece of a larger puzzle. We envision a future where people wear AR glasses with 3D cameras and use them to record and live-stream their experience. 3DOF is not sufficient for this because it causes motion sickness if the camera moves. We have prototypes which fix motion sickness in 3D POV VR video from wearable cameras. Watching the videos in VR feels like reliving a memory. Here's a demo: https://lifecastvr.com/trickshot.html

You can download a free trial from https://lifecastvr.com after entering your email address; you do not need to create a full account. The free trial is not limited in any way other than putting a watermark on the output.

We'd love to hear your thoughts and experiences with VR video, virtual production, and robotics!
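
To make the 3DOF/6DOF distinction above concrete, here is a minimal WebXR sketch in TypeScript. This is not Lifecast's player code, and it assumes WebXR type definitions (e.g. @types/webxr) are available; it only shows where head translation enters the picture: the viewer pose exposes both orientation and position, a 6DOF player applies both to the camera each frame, while a 3DOF player uses orientation only.

    // Minimal sketch: read the viewer pose in a WebXR page, such as the demos
    // playable in the Oculus Browser. Not Lifecast's player code.
    async function startViewer(): Promise<void> {
      if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
        console.log('Immersive VR unavailable; fall back to flat playback.');
        return;
      }
      const session = await navigator.xr.requestSession('immersive-vr');
      const refSpace = await session.requestReferenceSpace('local');

      const onFrame = (_time: number, frame: XRFrame): void => {
        const pose = frame.getViewerPose(refSpace);
        if (pose) {
          const { position, orientation } = pose.transform;
          // A 3DOF player ignores `position`; a 6DOF player feeds both values into
          // the camera, so head translation reveals parallax in the reconstructed scene.
          console.debug(position.x, position.y, position.z, orientation.w);
        }
        session.requestAnimationFrame(onFrame);
      };
      session.requestAnimationFrame(onFrame);
    }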

11 comments

anish_m about 3 years ago
This is an awesome project! I looked into starting a "real world travel" app for Oculus with recorded videos, but not having an easy way to record 6DOF videos is a big problem for a true VR experience with video. If you can pull this off, you have the potential to actually make VR more mainstream outside the gameverse. Good luck and congrats!
somethingsome about 3 years ago
Hi! Seems very nice! How do you compare with MPEG's software RVS [0], which is used as the decoder in the next video standard?

Real-time, realistic 6DoF with only a few images/videos; it is currently being ported to the Oculus Quest 2, and the next step includes MPIs ;)

Demo: https://d2dxqgbltsja2l.cloudfront.net/vimmerse-resource/choco_fountain.mp4

[0] https://ieeexplore.ieee.org/abstract/document/9590541
0x20cowboy about 3 years ago
This is very cool. I love this kind of stuff. I built a web player to view 360 streaming videos in VR: https://meshvue.com/, but it hasn't caught on.

If you think it might help, I'd be keen to chat.

(It works best on desktop, but it works on the Quest too. Because the texture sizes the Quest supports are quite small, its resolution is currently poor.)
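
As a side note on the resolution ceiling mentioned above, the limit is easy to inspect from the page itself. This is a standalone sketch, not meshvue's code; the 4096 fallback and the 2:1 equirectangular assumption are illustrative.

    // Query the largest texture the headset's browser will accept; a 360 video
    // frame can't be uploaded as a single texture any larger than this.
    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl2') ?? canvas.getContext('webgl');
    const maxTex: number = gl ? gl.getParameter(gl.MAX_TEXTURE_SIZE) : 4096;

    // For a 2:1 equirectangular layout, the widest frame that still fits in one
    // texture is maxTex wide by maxTex / 2 tall.
    const width = Math.min(maxTex, 8192);
    console.log(`MAX_TEXTURE_SIZE=${maxTex}; request a ${width}x${width / 2} stream`);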
acgourley about 3 years ago
We're working on something similar: several scene layers packed and transmitted over H.265 streams and unpacked in a 3D client for 6DoF playback. We capture from something as simple as a GoPro, and then our CV compares perspectives of the scene across time to reconstruct it in 3D for the encoding/transmission steps.

We're targeting the exercise market (where we got our start), but it could go beyond it in time.

Short demo: https://www.youtube.com/watch?v=DST9jz9Rrcc

Happy to chat, email in profile.
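
For readers unfamiliar with the packing idea in the comment above, the sketch below shows the general flavor of shipping geometry inside an ordinary video stream: a single color-over-depth frame layout decoded with three.js, where the bottom half of each frame displaces a dense plane. The layout, depth encoding, scale factor, and file name are illustrative assumptions (the comment describes multiple scene layers); none of these details come from the thread.

    import * as THREE from 'three';

    // Illustrative only: top half of each video frame = color, bottom half =
    // inverse depth, sampled in the vertex shader to displace a dense plane.
    const video = document.createElement('video');
    video.src = 'packed_color_depth.mp4'; // hypothetical asset
    video.muted = true;
    video.loop = true;
    void video.play();

    const material = new THREE.ShaderMaterial({
      uniforms: { map: { value: new THREE.VideoTexture(video) } },
      vertexShader: `
        uniform sampler2D map;
        varying vec2 vColorUv;
        void main() {
          vColorUv = vec2(uv.x, 0.5 + 0.5 * uv.y);           // color lives in the top half
          vec2 depthUv = vec2(uv.x, 0.5 * uv.y);              // inverse depth in the bottom half
          float invDepth = texture2D(map, depthUv).r;         // 0..1, assumed encoding
          float depth = 1.0 / max(invDepth, 0.05);            // assumed metric scale
          vec3 displaced = position - vec3(0.0, 0.0, depth);  // push vertices away from the camera
          gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
        }`,
      fragmentShader: `
        uniform sampler2D map;
        varying vec2 vColorUv;
        void main() { gl_FragColor = texture2D(map, vColorUv); }`,
    });

    // Dense tessellation so per-vertex displacement can follow the depth map;
    // add `layer` to a THREE.Scene and render as usual.
    const layer = new THREE.Mesh(new THREE.PlaneGeometry(2, 2, 255, 255), material);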
Findeton about 3 years ago
OK, so I've tried doing some version of this that is a bit more advanced [0], but I gave up because I'm not an ML expert. Have you thought about creating/projecting video versions of light fields, like Google's DeepView [1]? I'd love for DeepView Video-style tech to be commoditized.

[0] https://roblesnotes.com/blog/lightfields-deepview/

[1] https://augmentedperception.github.io/deepviewvideo/
glinkot about 3 years ago
This is a really worthwhile goal. I grabbed my Quest 2 and jumped onto the site. I wondered whether the artifacts around areas in motion (e.g. the parachutist) come from compression or from the 6DOF conversion process. Some 'tiling' effects on the grass areas had me wondering the same, but I imagined those were because the algorithm generates a certain mesh size even when the surface is fairly flat.

What options are there for filling in 'unknown' regions? In the fire-spinner video, you can move your head from side to side, but you get darkish blobs shadowing out from behind the performer and the trees. That got me thinking about stereo separation with multiple VR180 cameras, as you mentioned, and how many 360-degree cameras (and/or ToF sensors, etc.) we'd need to approach real-time photogrammetry and a scene where the VR user could walk a meaningful distance within the filmed environment. How plausible would that be, given all your experience in the area?

I'd love to see some downloadable full-quality ones to see them in all their glory! Best wishes with it.
Wowfunhappy about 3 years ago
I own an Index and a powerful PC, and I'd like to experience a 6DOF video. (I'm not interested in making my own.) How can I do that? I haven't used WebVR; what do I need to do to get it working?
charcircuit about 3 years ago
Is there any chance that you will release a Linux version, considering Firefox and Chrome don't support WebXR?
huevosabio about 3 years ago
Can this be done in real time? I.e., can I stream a 180/360 video and have it be 6DoF on the fly?
spupe about 3 years ago
Hey this is cool guys, congrats. Lots of potential for new tools to produce better VR content.
pault about 3 years ago
I’ll address the elephant in the room; I think that at this time, the use case for this with the widest potential for adoption is VR porn. I would very much like to see how well this adapts to, uh, organic surfaces. I’m not joking, so please don’t downvote me. :)