I'm disgusted by the mindset that companies should be able to do whatever they want with technology as impactful and revolutionary as AI.

AI sucks up the collective blood, sweat, and tears of human work without permission or compensation and then re-monetizes it. It's a model even more asymmetrical than Google Search, which at least gives some traffic back to creators (if they're lucky).

AI will make decisions over human lives if it drives your car or makes medical diagnoses. That needs regulation.

AI enables convincing deepfakes, attacking the very essence of information and communication. That needs regulation, accountability, or at the very least a discussion.

As AI grows in capability, it will have an enormous impact on the workforce, both white collar and blue collar. It may lead to serious social unrest and political breakdown. "Let's see what happens" is wildly irresponsible.

You cannot point to foreign competition as a basis for a no-rules approach. You should start with rules for impactful/dangerous technology and then hold parties to account, both domestic and foreign.

And if it's true that we're in a race to AGI, realize that this means the invention of infinite labor. Bigger than the industrial revolution and the information age combined.

Don't you think we should think that scenario through a little, rather than winging it?

The inauguration had the tech CEOs lined up directly behind Trump, clearly signaling who runs the country: it's tech and it's media. How can you possibly trust an even more powerful technology ending up in ever richer and more autocratic hands?

But I suppose the reality is that Altman should just donate $100 million to Trump and tell him he's the greatest man ever. Poof, regulation is gone.