
Show HN: 3D-Parallax, labelfree 3D experience from a 2D image using parallax

52 points by crou68 over 4 years ago

10 comments

xaedes over 4 years ago
"We offer an experience of 3D on a single 2D image using the parallax effect, i.e, the user is able move his real-time tracked face to visualize the depth effect."

Since there are no examples I can't be sure if this is what I think it is, but IF it is:

I want this on a huge monitor for any 3D game instead of clunky headgear VR or tiny smartphone AR.

Months ago I also tested this with a small 3D visualization and very crude head tracking. The effect is damn awesome! Being able to move around in real space with the rendering adapting to it makes it so immersive, even for my very crude tests. In my opinion the resulting 3D effect is MUCH better than viewing in stereo with one picture per eye.

Here is an example from someone else, from 2007: https://www.youtube.com/watch?v=Jd3-eiid-Uw

And from 2012: https://www.youtube.com/watch?v=h9kPI7_vhAU

Obviously this works for only one person viewing it, but does that really matter? There are a LOT of use cases where only a single person uses a single monitor for viewing, especially in these times. In fact, it is the standard.
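For readers trying to picture the technique described above, here is a minimal sketch, in TypeScript, of head-coupled parallax for a layered 2D scene. This is not the submitted project's code; the `Layer` shape, `applyParallax`, and the shift limit are illustrative assumptions, and the head position is assumed to come from some webcam-based face tracker.

```typescript
// Minimal sketch of head-coupled parallax over DOM layers.
// All names here are illustrative; they are not from the submitted project.

interface Layer {
  element: HTMLElement; // positioned DOM element representing one depth layer
  depth: number;        // 0 = far background, 1 = closest to the viewer
}

// headX / headY: tracked head position normalised to [-1, 1],
// where (0, 0) means the head is centred in front of the screen.
function applyParallax(
  layers: Layer[],
  headX: number,
  headY: number,
  maxShiftPx = 40
): void {
  for (const layer of layers) {
    // Nearer layers shift more, against the head's motion, which is what
    // makes the flat layers read as a volume when the viewer moves.
    const dx = -headX * layer.depth * maxShiftPx;
    const dy = -headY * layer.depth * maxShiftPx;
    layer.element.style.transform = `translate(${dx}px, ${dy}px)`;
  }
}
```

Calling `applyParallax` on every animation frame with the latest tracked head position is enough to reproduce the effect the linked videos demonstrate, at least for a layered scene.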
yunusabd over 4 years ago
Not exactly the same, but I made something a while ago that takes a 2D image and tries to infer a depth map to create a 3D effect: https://awesomealbum.com/depth

It's based on [1] and runs entirely in the browser, although it takes a moment to create the depth map. It's more of a toy project at this point. But I was surprised to see that Google is now doing the same thing in Google Photos [2].

[1] https://github.com/FilippoAleotti/mobilePydnet

[2] https://www.theverge.com/2020/12/15/22176313/google-photos-2d-3d-photos-cinematic-memories-activities-things
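To make the depth-map idea concrete, here is a rough sketch (not yunusabd's code and not the submitted project's) of how a precomputed depth map can drive the effect: each pixel is shifted horizontally in proportion to its depth value and the current view offset. The function name and the 0 to 255 depth convention are assumptions.

```typescript
// Sketch: shift each pixel by its depth times the current parallax offset.
function renderParallaxFrame(
  image: ImageData,  // source RGBA pixels
  depth: ImageData,  // grayscale depth map, same dimensions (0 = far, 255 = near)
  offsetX: number    // current parallax offset in pixels, e.g. from mouse or face tracking
): ImageData {
  const { width, height } = image;
  const out = new ImageData(width, height);

  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // Near pixels (high depth value) move further than far ones.
      const shift = Math.round((depth.data[i] / 255) * offsetX);
      const sx = Math.min(width - 1, Math.max(0, x - shift));
      const j = (y * width + sx) * 4;
      out.data[i] = image.data[j];
      out.data[i + 1] = image.data[j + 1];
      out.data[i + 2] = image.data[j + 2];
      out.data[i + 3] = 255;
    }
  }
  return out;
}
```

A real implementation also has to deal with the gaps revealed at depth discontinuities; this sketch simply resamples the source at the shifted coordinate.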
slingnow over 4 years ago
It cracks me up that in 2021 people are still posting fundamentally visual tools without so much as a single screenshot to help readers understand what they do.
fish44 over 4 years ago
Here is a different library with a demo: https://munsocket.github.io/parallax-effect/examples/deepview.html
phoe-krk over 4 years ago
Do you have any examples?
Moosdijk over 4 years ago
How do you calculate the angle between the person's eyes and the screen in order to render the parallax effect?
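One common way to do this, not necessarily how 3D-Parallax does it, is to take the face's offset from the screen centre (from a webcam face tracker), assume or estimate the viewing distance, and use the arctangent. A small sketch follows; the function name and the default distance are illustrative assumptions.

```typescript
// Sketch only: convert a tracked face offset into viewing angles.
// faceXcm / faceYcm would come from a face tracker (e.g. derived from the
// face bounding box); the distance can be guessed or inferred from face size.
function viewingAngles(
  faceXcm: number,   // horizontal offset of the face from the screen centre, in cm
  faceYcm: number,   // vertical offset from the screen centre, in cm
  distanceCm = 60    // assumed distance between the face and the screen
): { yawRad: number; pitchRad: number } {
  return {
    yawRad: Math.atan2(faceXcm, distanceCm),
    pitchRad: Math.atan2(faceYcm, distanceCm),
  };
}
```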
dannyw over 4 years ago
Please give us some examples.
chrisseaton over 4 years ago
What does XP mean in this context?
criddell over 4 years ago
The iPhone and iPad have motion tracking tied to the acceleration of the device. Is this a similar effect except it's based on head movement rather than device movement?
nojvek over 4 years ago
Would be great if README had screenshots. I’m not entirely sure what this does.