I'm on board with scepticism about what is a gigantic engineering challenge, and it attracts a lot of hype that ignores the fundamental and serious issues that make it a hard problem. But certain paragraphs don't pass the sniff test, e.g.:

> So the number of continuous parameters describing the state of such a useful quantum computer at any given moment must be at least 2^1,000, which is to say about 10^300. That's a very big number indeed. How big? It is much, much greater than the number of subatomic particles in the observable universe.

> To repeat: A useful quantum computer needs to process a set of continuous parameters that is larger than the number of subatomic particles in the observable universe.

> At this point in a description of a possible future technology, a hardheaded engineer loses interest.

Comparing a state space to the number of particles in the universe is a fancy card trick to impress students: 2^1,000 is just the Hilbert-space dimension of 1,000 qubits, i.e. a kilobit of quantum state. Pretending that a kilobit of state makes "hardheaded engineers" lose interest is laughable. People will keep trying because it's the holy grail for certain kinds of computational power, a prize that will still be there even as Moore's law wanes. For fields like bioinformatics and the other natural sciences, it's probably our only way to get reasonably sized simulations, regardless of how much classical processing power the world accumulates.

It's the computing equivalent of fusion: we know it can be done, and that's enough to keep trying. Whether we get there this decade or next century is somewhat irrelevant while the impetus remains.
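To make the scaling concrete, here's a minimal Python sketch (my own illustration, not from the article) of why 2^1,000 is a statement about the cost of classically simulating 1,000 qubits, not about the size of the machine: exact statevector simulation needs one complex amplitude per basis state, 2^n of them, while the hardware itself holds only n qubits and yields only n classical bits per measurement.

    import math

    # Assumption (for illustration): exact statevector simulation with one
    # 16-byte complex128 amplitude per basis state, so 16 * 2^n bytes total.
    def statevector_bytes(n_qubits: int) -> int:
        """Classical memory needed to store the full state of n qubits."""
        return 16 * 2 ** n_qubits

    for n in (10, 30, 50, 1000):
        b = statevector_bytes(n)
        print(f"{n:>4} qubits: ~10^{math.floor(math.log10(b))} bytes to simulate "
              f"exactly, but only {n} classical bits per readout")
    print("commonly quoted particle count of the observable universe: ~10^80")

Run it and the 1,000-qubit line comes out around 10^302 bytes, dwarfing the ~10^80 particle count. That blow-up is exactly the case for building the quantum hardware rather than simulating it, which is why the "hardheaded engineer loses interest" line gets the implication backwards.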