I loved the article that this links to about the 'Jump Flood Algorithm'!:<p><a href="https://bgolus.medium.com/the-quest-for-very-wide-outlines-ba82ed442cd9" rel="nofollow">https://bgolus.medium.com/the-quest-for-very-wide-outlines-b...</a><p>So fascinating! Thanks for indirectly leading me to this! I love thinking about all the various approaches available at the pixel/texel/etc level!<p>It's also another case where a very clever way of generating a type of SDF (Signed Distance Field) is doing a lot of the heavy lifting. Such a killer result here as well! Any-width-outline-you-like in linear time?! Amazing compared to the cost of the brute-force ones at huge widths!<p>I wholeheartedly endorse SDFs, whether they are 'vector' ones (function-based, like Inigo Quilez's amazing work) or 'raster' ones like in the article (texel- or voxel-based). Houdini supports raster SDFs very well I think, and has a solid, mature set of SDF tools worth checking out (there's a free version if you don't have a licence)!<p>And of course there are all the many other places SDFs are used! So useful! Definitely worth raising awareness of, I reckon!
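If anyone wants to see the core trick without reading the whole article, here is a rough, illustrative Python sketch of the jump flood idea (my own simplification: it updates a single grid in place rather than ping-ponging two buffers like a GPU implementation would, so treat it as an approximation):

```python
import math

def jump_flood(width, height, seeds):
    """seeds: list of (x, y) pixel coordinates. Returns dist[y][x], the
    distance from each pixel to its (approximately) nearest seed."""
    if not seeds:
        raise ValueError("need at least one seed")
    # nearest[y][x] = coordinates of the closest seed found so far (or None)
    nearest = [[None] * width for _ in range(height)]
    for sx, sy in seeds:
        nearest[sy][sx] = (sx, sy)

    # halving step sizes: N/2, N/4, ..., 1 -- this is the "jump flooding"
    step = max(width, height) // 2
    while step >= 1:
        for y in range(height):
            for x in range(width):
                best = nearest[y][x]
                best_d = math.inf if best is None else (x - best[0]) ** 2 + (y - best[1]) ** 2
                # examine the 8 jump neighbours (plus self) at the current step size
                for dy in (-step, 0, step):
                    for dx in (-step, 0, step):
                        nx, ny = x + dx, y + dy
                        if 0 <= nx < width and 0 <= ny < height:
                            cand = nearest[ny][nx]
                            if cand is not None:
                                d = (x - cand[0]) ** 2 + (y - cand[1]) ** 2
                                if d < best_d:
                                    best, best_d = cand, d
                nearest[y][x] = best
        step //= 2

    return [[math.sqrt((x - nearest[y][x][0]) ** 2 + (y - nearest[y][x][1]) ** 2)
             for x in range(width)] for y in range(height)]
```

With a single seed the result is exact; with many seeds JFA is a (very good) approximation of the true distance field. An outline pass then just shades every pixel whose distance to the silhouette is below the desired width.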
One day, I'd love to dive into stylised 3D graphics as an R&D project. There's been decent progress recently, but I think there's a lot of low-hanging fruit left to pick.<p>Some open questions:<p>- How do you reduce the detail of a toon-rendered 3D model as the camera zooms out? How do you seamlessly transition between its more-stylised and less-stylised appearance?<p>- Hand-drawn 2D animations often have watercolour backgrounds. Can we convincingly render 3D scenery as a watercolour painting? How can we smoothly animate things like brush-strokes and paper texture in screen space?<p>- How should a stylised 3D game portray smoke, flames, trees, grass, mud, rainfall, fur, water...?<p>- Hand-drawn 2D animations (and some recent 3D animations) can be physically incorrect: the artist may subtly reshape the "model" to make it look better from the current camera angle. In a game with a freely-moving camera, could we automate that?<p>- When dealing with a stylised 3D renderer, what would the ideal "mesh editor" and "scenery editor" programs look like? Do those assets need to have a physically-correct 3D surface and 3D armature, or could they be defined in a more vague, abstract way?<p>- Would it be possible to render retro pixel art from a simple 3D model? If so, could we use this to make a procedurally-generated 2D game?<p>- Could we use stylisation to make a 3D game world feel more physically correct? For example, when two meshes accidentally intersect, could we make that intersection less obvious to the viewer?<p>There are probably enough questions there to fill ten careers, but I suppose that's a good thing!
I started my career working on VR apps and soon pivoted to webdev due to the better market.<p>Articles like this one make me miss the field - working with 3D graphics, collisions, shaders, etc. had a magical feeling that is hard to find in other areas. You're practically building worlds and recreating physics (plus, math comes up far more practically and frequently than in any other programming field).
For my game Astral Divide, I ended up making a technique that is not listed.<p>It's similar to the one described as "Blurred Buffer", except that instead of doing a blur pass, I'm exploiting the borders created by the antialiasing (via multisampling, I think, or maybe texture filtering).<p>I draw the object in plain opaque white on a transparent black background, and in the fragment shader I keep only what is neither fully opaque nor fully transparent in the alpha channel (according to some hardcoded threshold). It gives decent enough results, it's cheap performance-wise, and it's very simple to implement.
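To make that concrete, here's a little CPU-side Python sketch of the same idea (my own illustrative code, not the game's shader): "antialiasing" a supersampled coverage mask produces in-between alpha values exactly along the borders, and the "fragment shader" step just keeps those in-between pixels:

```python
def antialias_2x(mask):
    """Downsample a 2x-supersampled binary coverage mask by box filtering,
    mimicking what an MSAA resolve does to the alpha channel."""
    h, w = len(mask) // 2, len(mask[0]) // 2
    return [[(mask[2*y][2*x] + mask[2*y][2*x+1] +
              mask[2*y+1][2*x] + mask[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def outline_from_alpha(alpha, lo=0.01, hi=0.99):
    """The 'fragment shader' step: keep only pixels that are neither fully
    transparent nor fully opaque (per some hardcoded thresholds)."""
    return [[1 if lo < a < hi else 0 for a in row] for row in alpha]
```

Interior pixels resolve to alpha 1.0 and background to 0.0, so only the antialiased rim survives the threshold test, which is the outline.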
These are great notes!<p>When looking into the edge detection approach recently, I came across this great method from the developer of Mars First Logistics:<p><a href="https://www.reddit.com/r/Unity3D/comments/taq2ou/improving_edge_detection_in_my_game_mars_first" rel="nofollow">https://www.reddit.com/r/Unity3D/comments/taq2ou/improving_e...</a>
That is excellent, and the result can be very pleasing, as in this render from the article: <a href="https://ameye.dev/notes/rendering-outlines/edge-detection/contribution-result.png-543w.webp" rel="nofollow">https://ameye.dev/notes/rendering-outlines/edge-detection/co...</a><p>It looks like a frame from the Dutch comic book Franka!
What a phenomenal article and reading UX.<p>Explaining a difficult concept in terms anyone can understand. Great diagrams and examples. And top marks on readability UX for spacing and typography.<p>OP, what inspired you to create your current theme? Have you ever considered creating an engineer-focused publishing platform?
Technical art is definitely my first love in software. I'm excited for Godot to add an easier compute shader pipeline for post-processing effects - their current compositor plugin setup is a bit boilerplate-intensive.<p>This repo is a great example of post-processing in Godot: <a href="https://github.com/sphynx-owner/JFA_driven_motion_blur_demo">https://github.com/sphynx-owner/JFA_driven_motion_blur_demo</a>
I remember the first time I saw this effect was on Wacky Races on the Dreamcast.<p>I remember at the time there was a lot of PR around this being the first game to introduce that effect and how the developers basically invented it.<p>I can’t comment on whether that was actually true or just PR BS, but it was definitely the first time <i>I</i> experienced it as a gamer.
The edge detection part reminds me a lot of the game Rollerdrome<p><a href="https://store.steampowered.com/app/1294420/Rollerdrome/" rel="nofollow">https://store.steampowered.com/app/1294420/Rollerdrome/</a><p>I wonder if they used something like that
I’d render the model to a buffer using a single colour (no shading, lighting or texturing), then render the buffer with edge detection. This gives an outline with one additional render pass.<p>Surprised this isn’t obvious.
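In case it helps anyone, a toy CPU version of that second pass in Python (my own sketch, assuming the first pass produced a binary silhouette mask): a pixel belongs to the outline if it is inside the silhouette but has a 4-neighbour outside it.

```python
def edge_detect(mask):
    """mask[y][x] is 1 where the flat-colour render covered the pixel.
    Returns an outline buffer of the same size."""
    h, w = len(mask), len(mask[0])

    def inside(x, y):
        # pixels outside the buffer count as background
        return 0 <= x < w and 0 <= y < h and mask[y][x]

    edges = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # inside the silhouette, but at least one 4-neighbour is outside
            if mask[y][x] and not all(inside(x + dx, y + dy)
                                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                edges[y][x] = 1
    return edges
```

A real implementation would do this in a fragment shader with a Sobel or Roberts cross kernel over the buffer, but the principle is the same.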
Funny, I was just playing around with some of these techniques yesterday for my game <a href="https://roguestargun.com" rel="nofollow">https://roguestargun.com</a>, now that I got it running reasonably comfortably at 72 fps on the Oculus Quest 2.<p>Mostly due to laziness, as a cel-shaded look requires less retexturing for my game than creating proper PBR materials.<p>However, the inverted hull method + cel-shaded look I initially used really does have quite a performance hit.
I quite like the techniques used to make cel-style graphics with the usual 3D pipeline, as seen in quite a few Nintendo games like<p><a href="https://en.wikipedia.org/wiki/Pok%C3%A9mon_Sun_and_Moon" rel="nofollow">https://en.wikipedia.org/wiki/Pok%C3%A9mon_Sun_and_Moon</a><p>and also used to make other illustration-like styles such as<p><a href="https://en.wikipedia.org/wiki/Valkyria_Chronicles" rel="nofollow">https://en.wikipedia.org/wiki/Valkyria_Chronicles</a>
A simple technique not listed here for drawing contour edges:<p>1) Create an array storing all unique edges of the faces (each edge being composed of a vertex pair V0, V1), as well as the two normals of the faces joined by that edge (N0 and N1).<p>2) For each edge, after transformation into view space: draw the edge if sign(dot(V0, N0)) != sign(dot(V0, N1)).
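The two steps above, sketched in plain Python (my own illustrative data layout): in view space the camera sits at the origin, so dot(V0, N) tells you which way each adjacent face points relative to the viewer, and a sign mismatch between the two faces marks a contour edge.

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def contour_edges(edges):
    """edges: list of (V0, V1, N0, N1) already transformed into view space,
    where (V0, V1) is the edge's vertex pair and N0/N1 are the normals of
    the two faces joined by that edge. Returns the edges to draw."""
    drawn = []
    for v0, v1, n0, n1 in edges:
        # one adjacent face front-facing, the other back-facing => contour
        if (dot(v0, n0) < 0) != (dot(v0, n1) < 0):
            drawn.append((v0, v1))
    return drawn
```

For example, an edge whose two face normals point towards and away from the camera respectively gets drawn, while an edge between two camera-facing faces does not.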
Side note, whatever happened with the great Unity pricing debacle? Did developers end up moving en masse to Unreal and Godot? Or were Unity’s walkbacks and contrition sufficient to keep it a going concern?
Great breakdown of outline rendering techniques! The detailed explanations and code snippets are super helpful for understanding Unity's shader possibilities. Thanks for sharing!