I work on this team! (Specifically: applied deep learning research, chip design.)

It's a shame to see so many people dismissing this work as marketing. I see lots of clever people working hard on really novel and interesting stuff, and I really do think ML has real potential to customize a design much more "deeply" than traditional automation tools can.
Did Nvidia just promise us the singularity? :)

It's hard to read a talk like this, delivered from a pulpit, and not notice the missing shout-outs to genuinely innovative open-source projects like OpenROAD, which have been shipping amazingly well-routed-by-AI chips for a while now. There are papers galore you can cite, and many open-source designs[1].

It's not as if Nvidia is promising that anyone else will benefit from this work. This seems to be very high-level coverage of what their R&D department is looking at, and may or may not be using. The article makes it hard to find out what is available and what has been published or otherwise discussed in depth (which, I think, is the best we can hope for from Nvidia, short of real participation). There's only one paper linked, on NVCell[2], described as:

> The first is a system we have called NVCell, which uses a combination of simulated annealing and reinforcement learning to basically design our standard cell library.

(A toy sketch of the simulated-annealing half of that sentence follows the links below.)

This feels like so much else going on in computing: WSL coming to Windows, the recent Unity vs. Unreal topic[3]. It's hard to imagine refusing to participate with others, hard to imagine not being part of the open-source community working shoulder to shoulder to push for better. Nvidia patently doesn't get it, patently isn't participating, patently isn't there. It's cool that we can hear what they are up to, but it's also extremely Nvidia of them to be doing it all on their own. Anyhow, looking forward to more AI-based chip power-system design starting to emerge; that sounds like a good idea, NV.

[1] https://theopenroadproject.org/

[2]
<a href="https://research.nvidia.com/publication/2021-12_nvcell-standard-cell-layout-advanced-technology-nodes-reinforcement-learning" rel="nofollow">https://research.nvidia.com/publication/2021-12_nvcell-stand...</a><p>[3] <a href="https://news.ycombinator.com/item?id=31064552" rel="nofollow">https://news.ycombinator.com/item?id=31064552</a> (412 points, 3 days ago, 311 comments)
I don't understand the backlash here. The gist seemed to be that traditional tools, which are exact, take a long time to process complex designs. Deep learning offers a statistical approach that can give a 'coarse' prediction, and they're using this to reduce development time. That makes sense to me, especially in the earlier verification phases of the hardware design lifecycle.

To me this sounds like a good use case for AI and neural nets. It doesn't appear to be aiming to replace the traditional tools, just to augment them.
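To make that concrete, here is a toy, hypothetical sketch of the workflow described above: a cheap, approximate scorer (standing in for a learned model) triages a large set of candidate designs so that the slow, exact tool only runs on a short list. Every feature name, weight, and stub below is invented.

    import random

    def exact_check(candidate):
        """Stand-in for a slow, exact signoff run (hours per candidate in real life)."""
        return candidate["wirelength"] + random.gauss(0, 5)   # pretend "true" cost

    def coarse_score(candidate):
        """Stand-in for a learned model: cheap, approximate, possibly wrong."""
        return 0.9 * candidate["wirelength"] + 0.2 * candidate["congestion_est"]

    random.seed(0)
    candidates = [
        {"id": i,
         "wirelength": random.uniform(100, 1000),
         "congestion_est": random.uniform(0, 100)}
        for i in range(1000)
    ]

    # Rank everything with the cheap predictor, then pay for the exact tool
    # only on the most promising handful.
    shortlist = sorted(candidates, key=coarse_score)[:10]
    best = min(shortlist, key=exact_check)
    print("exact runs:", len(shortlist), "of", len(candidates), "-> picked id", best["id"])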
The last time I checked, autorouters were still not capable of doing all the routing on a multi-layer PCB properly, and manual work was still required to produce a decent design.
What is extremely telling is what is missing: Design Rule Checking (DRC) and Layout vs. Schematic (LVS).

These require:

1) Longer bit-length arithmetic

32-bit float simply isn't enough. 64-bit float is close, but limited. You really want 128-bit integers, and nVidia isn't delivering that. (See the quick illustration at the end of this comment.)

2) Real algorithmic improvements

We're still stuck with computational geometry algorithms that don't parallelize. It would be awfully useful if nVidia would actually research some new algorithms instead of just waving the ML/AI marketing wand around.

But then, this is the company that built itself on benchmarketing, so...
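To put rough numbers on point 1 (bit lengths), here is a small, self-contained illustration with made-up coordinates in database units. It shows only two things: a single 2x2 determinant at realistic coordinate magnitudes already exceeds float64's 53-bit mantissa, and the exact cross-multiplied numerator of a segment-intersection coordinate (the kind of arithmetic polygon booleans in DRC/LVS rely on) doesn't fit in a 64-bit integer at all.

    # Made-up coordinates in database units (say a 0.25 nm grid on a ~26 mm die,
    # so roughly 27-28 bit signed integers per coordinate).
    x1, y1 = 103_000_000, 2_000_001
    x2, y2 = -99_999_999, 101_000_003
    x3, y3 = 57_123_456, -88_000_019
    x4, y4 = -4_000_007, 91_999_991

    # Orientation tests and intersection formulas are built on 2x2 determinants.
    det12 = x1 * y2 - y1 * x2        # 54 bits for these inputs (Python ints are exact)
    det34 = x3 * y4 - y3 * x4

    # float64 has a 53-bit mantissa, so it already rounds a single determinant:
    print(float(det12) == det12)     # False -- exactness is lost, fatal for near-zero predicates

    # The exact numerator of a segment-intersection x coordinate is degree 3
    # in the coordinates (the denominator is omitted here):
    x_num = det12 * (x3 - x4) - (x1 - x2) * det34
    print(x_num.bit_length())        # ~79 bits: past int64, comfortably inside int128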
Please keep up the processing-power progress!

The economics of the software industry (or at least of the products I work on) depend on the assumption that the cost of computing (including storage) diminishes exponentially over time! <3
I really hope they can apply some of these AI approaches to the driver situation on Linux as well. I will never buy an Nvidia product after the nightmares they've put me through.