
Show HN: Gaussian Splat renderer in VR with Unity

21 points by chrisnolet over 1 year ago
I was playing with this for a few weeks over the holiday break. This is one of the GS3D sample scenes running on PCVR at about 65 FPS. I'm sorting on the CPU at the moment, so there are some hitches, but it works! I may publish this as a Unity asset. (I'd love to get it working on Vision Pro, but we'll see.)
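For context on the "sorting on the CPU" remark: Gaussian splats are semi-transparent, so they must be drawn back-to-front each frame for alpha blending to composite correctly. Below is a minimal sketch of such a per-frame depth sort; it is an illustration in Python/NumPy with invented names and data layout, not the author's actual Unity code.

```python
import numpy as np

def sort_splats_back_to_front(positions, cam_pos, cam_forward):
    """Return splat indices ordered far-to-near along the view direction.

    positions:   (N, 3) splat centres in world space
    cam_pos:     (3,) camera position
    cam_forward: (3,) unit vector pointing where the camera looks
    """
    # Signed depth of each splat along the viewing axis.
    depths = (positions - cam_pos) @ cam_forward
    # Largest depth first, so nearer splats blend over farther ones.
    return np.argsort(-depths)

# Usage (once per frame, before issuing the draw list):
#   order = sort_splats_back_to_front(splat_centres, camera_pos, camera_fwd)
# Doing this on the CPU every frame is the kind of work that can cause the
# hitches mentioned in the post; GPU radix sorts are a common alternative.
```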

6 comments

mike_hearn over 1 year ago
Chris' post doesn't really give much background info, so here's what's going on here and why it's awesome.

Real-time 3D rendering has historically been based on rasterisation of polygons. This has brought us a long way and has a lot of advantages, but making photorealistic scenes takes a lot of work from the artist. You can scan real objects with photogrammetry and then convert them to high-poly meshes, but photogrammetry rigs are pro-level tools, and the assets won't render at real-time speeds. Unreal 5 introduced Nanite, which is a very advanced LoD algorithm and helps a lot, but again, we seem to be hitting the limits of what can be done with polygon-based rendering.

3D Gaussian Splatting is a new AI-based technique that lets you render, in real time, photorealistic 3D scenes captured with only a few photos taken using normal cameras. It replaces polygon-based rendering with radiance fields.

https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/

3DGS uses several advanced techniques:

1. A 3D point cloud is estimated using "structure from motion" techniques.

2. The points are turned into "3D Gaussians", which are sort of floating blobs of light where each one has a position, opacity, a covariance matrix, and a view-dependent colour defined using "spherical harmonics" (no, me neither). They're ellipsoids, so they can be thought of as spheres that are stretched and rotated.

3. Rendering projects the 3D Gaussians onto the 2D screen (into "splats"), sorts them so transparency works, and then rasterises them on the fly using custom shaders.

The neural network isn't actually used at rendering time, so GPUs can render the scene nice and fast.

In terms of what it can do, the technique might be similar to Unreal's Nanite. Both are designed for static scenes. Whilst 3D Gaussians can be moved around on the fly, so the scene can be changed in principle, none of the existing animation packages, game engines, or artwork tools know what to do without polygons. But this sort of thing could be used to rapidly create VR worlds based on only videos taken from different angles, which seems useful.
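To make step 3 concrete, here is a rough sketch (Python/NumPy, with variable names of my own rather than anything from the paper's reference code) of how a single 3D Gaussian's covariance is flattened into a 2D screen-space splat: the covariance is rotated into camera space and then squashed by the Jacobian of the perspective projection, as described in the 3DGS paper. The real renderer does more (clamping, a low-pass filter, tile binning), so treat this purely as an illustration of the math.

```python
import numpy as np

def project_gaussian(mean_world, cov_world, view, fx, fy):
    """Project one 3D Gaussian to a 2D screen-space splat.

    mean_world: (3,) Gaussian centre in world space
    cov_world:  (3, 3) covariance of the ellipsoid (scale + rotation)
    view:       (4, 4) world-to-camera matrix
    fx, fy:     focal lengths in pixels
    Assumes the camera looks down +z, so z > 0 in front of the camera.
    """
    # Centre of the Gaussian in camera space.
    t = view[:3, :3] @ mean_world + view[:3, 3]
    x, y, z = t

    # Jacobian of the perspective projection at the centre
    # (local affine approximation, as in EWA splatting).
    J = np.array([
        [fx / z, 0.0,    -fx * x / z**2],
        [0.0,    fy / z, -fy * y / z**2],
    ])

    # Rotate the covariance into camera space, then flatten it to 2D.
    W = view[:3, :3]
    cov_cam = W @ cov_world @ W.T
    cov_2d = J @ cov_cam @ J.T            # (2, 2) screen-space covariance

    # Screen-space centre (pixel offsets from the principal point).
    mean_2d = np.array([fx * x / z, fy * y / z])
    return mean_2d, cov_2d
```

Each projected splat is then rasterised as a 2D Gaussian footprint and alpha-blended in the sorted order from step 3.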
yodon over 1 year ago
Very cool work! Are there papers or repos that do fast splat generation of digitally-originated assets?

I'm wondering if there is a way to embed digitally-originated assets in the scene and render them using the same splat-drawing pipeline you're using to render your photographically-originated assets?
junon over 1 year ago
That's pretty cool. Can you incorporate other traditional 3D assets and have them look decent when rendered alongside this?
ideashower over 1 year ago
Very cool! Any chance we could get a small tutorial? I'd love to replicate this effort and play around with it!
MoeGooms over 1 year ago
Nice work :)
Konwar over 1 year ago
Unity