A kinda depressing thought about AI: even in a world where we have perfectly aligned AI and there's no risk of a "paperclip maximizer" destroying the world, we'd still have a decentralized technology that's probably more powerful than an atomic bomb. If every person in the world today had a nuclear weapon, the world would already have been destroyed many times over.

The only hope for humanity, then, is that we somehow create a benevolent AI dictator that's more powerful than the other AIs and able to prevent them from causing damage. I guess that's a possible scenario, but it's pretty bleak that the bull case for AI is an AI master controlling the world as a totalitarian dictator. My take is that it would be best for humanity to avoid advanced AI development altogether, although I understand that's probably impossible.