I miss the days when it was unfashionable to call things "AI".

It forced people to think more clearly, and it made the models used sound to normal people like boring applications of numerical optimization, which is pretty accurate. It's not something a normal person should feel any sense of excitement about, any more than they should care about a new type of satellite dish or whatever -- you have to be a pretty odd duck to find it intrinsically interesting.
It's very simple: do NOT legislate. Stop regulating every single detail. Instead, focus on making sure that the fundamental principles (fundamental rights, trade...) are respected. Those principles will probably be the same in 30 years as they were 50 years ago, or at least very similar.

There are very basic functions that the legislative power is still missing. Even so, they want to regulate highly advanced topics down to the most specific details.
Great, another PR piece by the mountebank clown car at "OpenAI" trying to tell us the singularity is upon us, but please send us billions of investment dollars so we can make it so and protect you from evil.
Perhaps the real question is not whether we will survive the AI apocalypse, but whether we will survive the existential tsunami that will probably pour forth if we fail to build God. Devoid of a God that could be dug out of the past or created in the future, humanity is, in a way never before known, condemned to be free, and everything is rendered impossible.