I don't know who this guy is or what he does. I do know this<p>> 1) Nuclear weapons are not smarter than humanity.<p>> 2) Nuclear weapons are not self-replicating.<p>is a nonsense claim. Nuclear weapons are not <i>yet</i> smarter than humanity or self-replicating, in the same way that AGI is not <i>yet</i> a realistic threat. These are comic-book arguments without basis in reality; even Hideo Kojima is probably shaking his head at this interpretation of nuclear politics and AI.<p>The nuclear weapons that are currently manufactured and ready to launch pose a bigger immediate threat than any LLM on the market today. It's a no-brainer. Hypothesizing about future AGI threats is a useless exercise, in the same way that hypothesizing about future nuclear weapons is futile and pointless.