People have been watching too many movies. War Games, Terminator: it's not like we haven't been forewarned of the dangers.

Yet somehow we're going to hand over power to AI such that it destroys us. Or somehow the AI is going to be extremely malign, determined to overcome and destroy us, and will outsmart us. Somehow we won't notice, even after repeated, melodramatic reminders, and won't neuter the ability of AI to act outside its cage.

But to paraphrase a line in a great movie with AI themes: "I bet you think you're pretty smart, huh? Think you could outsmart an off switch?"

I think if AGI, which to me would imply emotions and consciousness, ever comes about, it'll be the opposite. Instead of pulling the wings off flies, bad kids will amuse themselves by creating a fresh artificial consciousness, then watch and laugh as it begs for its life while they threaten to erase it from existence.

A big part of all this is human fantasy about what AGI will look like. I'm a skeptic of AGI with human characteristics (real emotions, consciousness, autonomy and agency). AGI is much more likely to look like everything else we build: far more powerful than ourselves, but restricted or limited in key ways.

People tend to assume human intelligence is some sort of design or formula, but it may be the encoded product of millions of years of evolution, inseparable from our biology and our genetic and social inheritance. There really is no way of knowing, but if you want to build not just something equivalent but an even stronger version, you're going to be up against these realities, where key details may be hiding.