I am by no means an expert on AI, but I find it really interesting to think about the possibility of actually getting to strong/general AI using existing techniques (some form of deep learning / neural networks).

What is the strongest argument for why the current techniques, with added computational power and continuous improvements, will not lead to strong AI?

People often claim things like "that is just optimization, not really intelligence" or something similar, but it seems very hard to prove that what we perceive as human intelligence is more than just similar calculations.