TechEcho — A tech news platform built with Next.js, providing global tech news and discussions.

Show HN: Real-Time Gaussian Splatting

144 points, by markisus, 5 days ago
LiveSplat is a system for turning RGBD camera streams into Gaussian splat scenes in real time. It works by passing all the RGBD frames into a feed-forward neural net that outputs the current scene as Gaussian splats, which are then rendered in real time. I've put together a demo video at the link above.
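The first step of any RGBD-to-splat pipeline can be sketched with a toy example (this is not LiveSplat's actual code; the intrinsics, array shapes, and function name here are made up for illustration): each depth pixel is back-projected through a pinhole camera model to get a 3D splat center, carrying its RGB value as the splat color.

```python
import numpy as np

def unproject_rgbd(rgb, depth, fx, fy, cx, cy):
    """Unproject an RGBD frame into colored 3D points (candidate splat centers).

    rgb:   (H, W, 3) uint8 color image
    depth: (H, W) float32 depth in meters
    fx, fy, cx, cy: pinhole camera intrinsics
    Returns (N, 3) points and (N, 3) colors for pixels with valid depth.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0                      # drop pixels with no depth reading
    z = depth[valid]
    x = (u[valid] - cx) * z / fx           # pinhole back-projection
    y = (v[valid] - cy) * z / fy
    points = np.stack([x, y, z], axis=-1)
    colors = rgb[valid].astype(np.float32) / 255.0
    return points, colors

# Toy 2x2 gray frame with every pixel at 1 m depth
rgb = np.full((2, 2, 3), 128, dtype=np.uint8)
depth = np.ones((2, 2), dtype=np.float32)
pts, cols = unproject_rgbd(rgb, depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
print(pts.shape)  # (4, 3)
```

A feed-forward net like the one described would then predict per-splat parameters (scale, opacity, orientation) on top of such unprojected centers, rather than optimizing them per scene.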

18 comments

spyder, 5 days ago
Correct me if I'm wrong, but looking at the video this just looks like a 3D point cloud using equal-sized "gaussians" (soft spheres) for each pixel; that's why it still looks pixelated, especially at the edges. Even at low resolution, real Gaussian splatting artifacts look different, with spikes and soft blobs in the lower-resolution parts. So this isn't really doing what true Gaussian splatting does: combining different-sized, view-dependent elliptic Gaussian splats to reconstruct the scene. It also doesn't seem to reproduce the radiance field the way real Gaussian splatting does.
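For readers unfamiliar with the distinction being drawn here: in the original 3D Gaussian Splatting formulation, each splat carries an anisotropic covariance factored as Σ = R S Sᵀ Rᵀ (rotation R from a quaternion, diagonal scale S), so splats can stretch into thin ellipsoids, unlike equal-sized soft spheres. A minimal numpy sketch of that parametrization (my own toy code, not from either project):

```python
import numpy as np

def quat_to_rot(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q / np.linalg.norm(q)
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def splat_covariance(scale, quat):
    """3DGS-style anisotropic covariance: Sigma = R S S^T R^T."""
    R = quat_to_rot(np.asarray(quat, dtype=float))
    S = np.diag(scale)
    return R @ S @ S.T @ R.T

# Equal-sized soft sphere vs. a flattened, oriented ellipsoid
iso = splat_covariance([0.01, 0.01, 0.01], [1, 0, 0, 0])
aniso = splat_covariance([0.05, 0.01, 0.001], [1, 0, 0, 0])
```

The isotropic case collapses to a scaled identity matrix; only the anisotropic case can hug surfaces and edges, which is what produces the characteristic spiky/blobby artifacts at low resolution.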
echelon, 5 days ago
OP, this is incredible. I worry that people might see a "glitchy 3D video" and not understand the significance of this.

This is getting unreal. These systems are becoming fast and high fidelity. Once we get better editing capabilities and can shape the Gaussian fields, this will become the prevailing means of creating and distributing media.

Turning any source into something 4D volumetric that you can easily mold like clay, relight, reshape. A fully interactable and playable 4D canvas.

Imagine if the work being done with diffusion models could read and write from Gaussian fields instead of just pixels. It could look like anything: real life, Ghibli, Pixar, whatever.

I can't imagine where this tech will be in five years.
yuchi, 5 days ago
The output looks strikingly similar to what sci-fi movies envisioned as 3D reconstruction of scenes. It is absolutely awesome. Now, if we could *project* them in 3D… :)
whywhywhywhy, 5 days ago
Would be good to see how it's different from just the depth channel applied to the Z of the RGB pixels, because it looks very similar to that.
sendfoods, 5 days ago
Please excuse my naive question - isn't Gaussian splatting usually used to create 3D imagery from 2D? How does providing 3D input data make sense in this context?
hi_hi, 5 days ago
While undoubtedly technically impressive, this left me a little confused. Let me explain.

What I think I'm seeing is like one of those social media posts where someone has physically printed out a tweet, taken a photo of themselves holding the printout, and then posted another social media post of the photo.

Is the video showing me a different camera perspective than what was originally captured, or is this taking a video feed, doing technical magic to convert it to Gaussian splats, and then converting it back into a (lower quality) video of the same view?

Again, congratulations, this is amazing from a technical perspective; I'm just trying to understand some of the potential applications it might have.
armchairhacker, 5 days ago
Gaussian splatting looks pretty and realistic in a way unlike any other 3D render, except UE5 and some hyper-realistic non-realtime renders.

I wonder if one could go the opposite route and use Gaussian splatting or (more likely) some other method to generate 3D/4D scenes from cartoons. Cartoons are famously hard to emulate in 3D, even entirely manually; as with traditional realistic renders (polygons, shaders, lighting, post-processing) vs. Gaussian splats, maybe we need a fundamentally different approach.
mandeepj, 5 days ago
Another implementation of splatting: https://github.com/NVlabs/InstantSplat
smusamashah, 5 days ago
The demo video does not show constructing 3D from the input. Is it possible to do something like that with this? Take a continuous feed of a static scene and keep improving the 3D view?

This is what I thought from the title, but the demo video is just a continuously changing stream of points/splats alongside the video.
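The progressive-refinement idea asked about here can be sketched in a few lines (a hypothetical approach of my own, not something the project claims to do): fuse a stream of noisy depth frames of a static scene with an exponential moving average, so each new frame refines rather than replaces the estimate.

```python
import numpy as np

def ema_depth_accumulator(frames, alpha=0.2):
    """Fuse noisy depth frames of a static scene with an exponential
    moving average; the estimate improves as more frames arrive."""
    fused = None
    for depth in frames:
        if fused is None:
            fused = depth.astype(np.float64).copy()
        else:
            fused = (1 - alpha) * fused + alpha * depth
    return fused

rng = np.random.default_rng(0)
truth = np.full((4, 4), 2.0)                              # static scene at 2 m
frames = [truth + rng.normal(0, 0.05, truth.shape) for _ in range(50)]
fused = ema_depth_accumulator(frames)
# Mean absolute error of the fused estimate is well below the
# per-frame noise level of 0.05 m.
```

Real systems (e.g. TSDF fusion) do something far more sophisticated, with camera tracking and surface integration, but the accumulation principle is the same.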
drewbeck, 5 days ago
IMO this is a key component of a successful VR future for live events. Many cameras at a venue; viewers strap on a headset at home and get to sit/stand anywhere in the room and see the show.

Also, I love the example video. Folks could make some killer music videos with this tech.
asadm, 5 days ago
This is amazing! Video calls of the future (this + Vision Pro) would be lovely.
metalrain, 5 days ago
How did you train this? I'm thinking there isn't reference output for live video-frame-to-splats, so supervised learning doesn't work.

Is there some temporal accumulation?
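One common answer to the "no reference splats" problem (speculation on my part, not a claim about how LiveSplat actually trains) is self-supervision: render the predicted splats back into the camera view and penalize the photometric difference against the captured frame, so the only supervision signal is the input video itself. A minimal sketch of such a loss:

```python
import numpy as np

def photometric_l1(rendered, captured, valid_mask=None):
    """Self-supervised photometric loss: compare the re-rendered frame
    against the captured one; no ground-truth splats are needed.
    An optional mask excludes pixels with no valid depth/coverage."""
    diff = np.abs(rendered.astype(np.float64) - captured.astype(np.float64))
    if valid_mask is not None:
        diff = diff[valid_mask]
    return diff.mean()

captured = np.full((2, 2, 3), 0.5)   # toy captured frame
rendered = np.full((2, 2, 3), 0.6)   # toy re-rendered prediction
loss = photometric_l1(rendered, captured)  # ≈ 0.1
```

In practice the renderer must be differentiable for gradients to flow back into the network, and losses usually combine L1 with a structural term such as SSIM.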
badmonster, 5 days ago
What is the expected frame rate and latency when running on a typical setup with one RealSense camera and an RTX 3060?
sreekotay, 5 days ago
This is realtime capture/display? Presumably (at this stage) for local viewing? Is that right?
donclark, 4 days ago
Holodeck coming soon?
corysama, 5 days ago
So, I see livesplat_realsense.py imports livesplat. Where's livesplat?
kookamamie, 5 days ago
[flagged]
patrick4urcloud, 5 days ago
nice