Aside: Nvidia named this cuLitho, which, as I learned from my Spanish-speaking mother-in-law, basically looks like they named it 'butt'. So if you see a bunch of rear-end-related memes about this software, you know why!
It's just sad how absent AMD is from such innovative uses of GPUs. It wouldn't bother me so much if their GPUs were at least much more affordable than Nvidia's, but they're not. At best they're playing catch-up months or years later with slight price discounts (the Steam hardware survey still reflects this market discrepancy).

And I'm not talking about gimmicks like RTX; I'm talking about all these cool ML and DL use cases like background noise cancellation, video upscaling, real-time eye-contact correction for cameras (essentially a deepfake), and now this. And that's ignoring all the mind-blowing research papers Nvidia puts out that haven't been featured in consumer apps yet.

This is Nvidia's biggest moat, AMD isn't even in the race here, and for some reason Lisa Su doesn't seem to give enough of a shit to compete.

I hate Nvidia for their price gouging and anti-consumer practices, but at least they haven't gotten complacent: they keep innovating on all fronts and pushing the envelope. Massive respect for the tech leadership at Nvidia.
How much does that mean in practice, considering:

- the computation output depends only on local features (my guess)
- most transistors look the same, so these results can be cached heavily (see the sketch below)
- the same holds for the interconnect layers
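What I have in mind is roughly the following memoization, sketched in Python. Everything here is hypothetical (my names, not cuLitho's API), and `correct_tile` just stands in for whatever expensive per-region computation actually happens:

```python
# Minimal sketch of the caching idea above. If the computation really
# depends only on local features, identical tiles (e.g. repeated
# standard cells) only ever need to be computed once.
import numpy as np

_cache = {}

def correct_tile(tile):
    # Placeholder for the expensive local lithography computation.
    return tile

def correct_layout(layout, tile=16):
    """Process a 2D layout tile by tile, memoizing on tile contents."""
    out = np.empty_like(layout)
    h, w = layout.shape
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            block = layout[y:y + tile, x:x + tile]
            key = block.tobytes()  # identical geometry -> cache hit
            if key not in _cache:
                _cache[key] = correct_tile(block)
            out[y:y + tile, x:x + tile] = _cache[key]
    return out
```

One caveat even under the locality assumption: optical influence extends past a tile's edges, so real tiling would need overlapping halos, which shrinks (though doesn't eliminate) the cache win.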
The chipmaking computation is inverse lithography, and the Nvidia system speeding it up is cuLitho on DGX H100.

I like how inverse lithography and neural-network backpropagation were both introduced in the 1980s, and now we're finally seeing them both come to life, so to speak, on our sufficiently advanced GPUs.
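The connection is more than cosmetic: inverse lithography is gradient-based optimization through a differentiable forward model of the optics. Here's a toy sketch of that idea in Python, with a Gaussian blur standing in for the real diffraction model; this is purely illustrative, not cuLitho's actual method:

```python
# Toy inverse lithography: find a mask whose blurred ("printed") image
# matches a target pattern, by gradient descent on the squared error.
import numpy as np
from scipy.ndimage import gaussian_filter

def print_image(mask, sigma=2.0):
    """Toy forward model: the optics blur the mask (a Gaussian
    stand-in for the real diffraction-limited point spread function)."""
    return gaussian_filter(mask, sigma)

def invert(target, steps=200, lr=0.5, sigma=2.0):
    """Gradient descent on ||blur(mask) - target||^2. For a symmetric
    kernel the gradient is blur(residual) -- exactly the adjoint
    ("backprop") step, so the backward pass costs the same as the
    forward one."""
    mask = target.copy()  # start from the target itself
    for _ in range(steps):
        residual = print_image(mask, sigma) - target
        mask -= lr * 2.0 * gaussian_filter(residual, sigma)
    return np.clip(mask, 0.0, 1.0)

# Tiny demo: a square feature. The recovered mask grows corrections
# around the corners so the blurred image prints closer to the target.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
mask = invert(target)
```

Same structure as training a network: forward model, loss, adjoint pass, update, which is why both techniques end up wanting the same GPUs.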