This means regulating the most advanced chip fabs (TSMC, Samsung, and Intel) and how they manufacture ever more powerful AI training chips.<p>I can't think of another way to prevent a rogue lab from training ever more powerful AI.<p>Only AI labs that pass stringent audits and comply with regulations should be allowed to buy the most powerful AI chips.<p>If we determine that AI is an existential threat to humanity, like nuclear weapons are, shouldn't we regulate AI training chips the way we regulate the materials used to make nuclear weapons?<p>Countries have signed treaties on nuclear weapons. Why not sign treaties on making and buying AI chips?
How do we make sure no country builds underground chip facilities just for training AGI models? Tbh, the genie is out of the bottle. I never thought the path to AGI would be so "simple": basically you take a 2017 paper (the transformer architecture), give it tons of data, and see AGI emerge. It's fascinating.
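To put that in perspective, the core mechanism from that 2017 paper (scaled dot-product attention from "Attention Is All You Need") really is only a few lines. The sketch below is a toy numpy version with made-up shapes, just to show how small the idea is, not how any lab actually trains models:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q, K, V: (seq_len, d_k) matrices of queries, keys, and values
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                 # similarity of each query to each key
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
        return weights @ V                              # attention-weighted sum of values

    # Toy shapes for illustration only.
    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(8, 64)) for _ in range(3))
    out = scaled_dot_product_attention(Q, K, V)         # shape (8, 64)

The hard part, of course, is the scale of the data and compute, which is exactly what chip-level regulation would try to gatekeep.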
Not just chip fabs; also data centers. According to Eliezer Yudkowsky, if push comes to shove, we can also order airstrikes against the latter:<p><i>If intelligence says that a country outside the agreement is building a GPU cluster, be less scared of a shooting conflict between nations than of the moratorium being violated; be willing to destroy a rogue datacenter by airstrike.</i><p><a href="https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/" rel="nofollow">https://time.com/6266923/ai-eliezer-yudkowsky-open-letter-not-enough/</a>
> Countries have treaties signed on nuclear weapons.<p>Yet we also have nuclear technology for energy, medicine, and more. It's a big jump from weapons to a foundation technology.<p>How do you predetermine if a multi-use chip is going to be used for AI or something else? These GPU like chips have multiple purposes