This question comes to my mind quite often now. Let me try to explain it as best I can (I am not an expert in this topic, so apologies in advance if it doesn't make sense).

Suppose we give a fixed amount of computing/algorithmic power to an AI setup. Do you think it is, in theory, mathematically possible for this setup to realise/identify that it has limited capability?

Could this setup have the ability to guess?

Is there any possibility for this machine to generate "guesses" about what might be out of reach for itself?
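
To make the first part of the question a bit more concrete, here is a minimal toy sketch of one very literal reading of "identifying its own limits". The framing is entirely my own assumption: I equate "fixed computing power" with a fixed step budget, and the names (`bounded_run`, `count_to`, `BUDGET`) are hypothetical, not anything from the question. All it shows is that a bounded system can at least flag which tasks exceeded its budget, without saying anything about what lies beyond them.

```python
# Toy sketch (my framing, not the poster's): treat "fixed computing power"
# as a fixed step budget. The evaluator runs each task for at most `budget`
# steps and records the tasks it could not finish, i.e. the ones it can
# identify as currently beyond its capability.

def bounded_run(task, budget):
    """Run a generator-based task for at most `budget` steps.

    Returns (finished, result_or_None, steps_used).
    """
    gen = task()
    result = None
    for step in range(1, budget + 1):
        try:
            result = next(gen)
        except StopIteration:
            # The task completed before the budget ran out.
            return True, result, step - 1
    # Budget exhausted before the task finished.
    return False, None, budget


def count_to(n):
    """A hypothetical task: counting to n, one increment per 'step'."""
    def task():
        total = 0
        for _ in range(n):
            total += 1
            yield total
    return task


if __name__ == "__main__":
    BUDGET = 1_000  # the fixed amount of "computing power"
    tasks = {
        "count_to_100": count_to(100),
        "count_to_1_000_000": count_to(1_000_000),
    }

    out_of_reach = []
    for name, task in tasks.items():
        finished, result, used = bounded_run(task, BUDGET)
        if finished:
            print(f"{name}: finished in {used} steps, result={result}")
        else:
            # The system can flag that this task exceeded its budget,
            # even though it cannot say what the answer would have been.
            out_of_reach.append(name)
            print(f"{name}: gave up after {used} steps (beyond my budget)")

    print("Tasks identified as out of reach:", out_of_reach)
```

Of course, this only covers noticing a limit after bumping into it; whether the system could usefully *guess* what lies beyond that limit is the harder part of the question.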