Based on the comments in this thread... Guys, Microsoft fuckery doesn't invalidate an entire field.

I think certain VCs are a little too optimistic about quantum computing timelines, but that doesn't mean the field isn't steadily progressing. I saw a comment bringing up the 2001 prime-factorization demo with some claim that people haven't been working on pure quantum computing since then?

It's really hard. It's still firmly academic, with the peculiar factor that much of it is industry backed. Google's quantum effort was a UCSB research lab turned into a Google branch while still being powered by grad students. You can begin to see how there's going to be some culture clash and unfortunate pressure to make claims and take research paths atypical of academia (not excusing any fraud; edit: also, to be very clear, not accusing Google quantum of anything). It's a hard problem in a funky environment.

1) It's a really hard problem. Anything truly quantum is hard to deal with, especially if you require long coherence times. Consider the entire field of condensed matter (plus some AMO). Many of the experiments that measure special quantum properties or confirm theories do so destructively - and I'm not talking only about the quantum measurement problem, I'm talking about the probes themselves physically altering the system such that you only get one or maybe a few good measurements before the sample is useless. In quantum computing, things need to be cold and isolated, yet still read/write accessible over many, many cycles in order to be useful.

2) Given the difficulty, there have been many proposals for how to meet the "practical quantum computer" requirement. These range from giving up on a true general-purpose quantum computer (quantum annealers) to NV centers, neutral-atom/ion lattices, SQUID/Josephson-junction based, photonic, hybrid systems with mechanical resonators, and yeah, topological/anyon shit.

3) It's hard to predict what will actually work, so every approach is a gamble and different groups take different gambles. Some take bigger gambles than others. I'd say topological quantum computing was a pretty damn big gamble given how new the theory was.

4) Then you need to gradually build up the actual system plus infrastructure, validating each subsystem, then subsystem interactions, and finally full systems. Think state preparation, readout, manipulation, isolation, gate design... Each piece of this can be multiple PhDs' and postdocs' worth of work across physics, ECE/CSE, ME, and CS. This is deep expertise and specialization.

5) Then, if one approach seems to work, however poorly*, you need to improve it and scale it. Scaling is not guaranteed. This will mean many more PhDs' worth of work trying to improve subsystems.

6) Again, this is really hard. Truly, purely quantum systems are very difficult to work with. Classical computing is built on transistors, which operate just fine at room temperature** (plenty of noise, no need for cold isolation) with macroscopic, classical observables and manipulations like current and voltage. Yes, transistors work because of quantum effects, and more recent transistors use quantum effects (tunneling) more directly, but the "atomic" units of memory are still effectively macroscopic. The systems as a whole are very well described classically, with only practical engineering concerns around putting things too close together, impurities, and heat dissipation.
Not to say that any of that is easy at all, but there's no question of principle like "will this even work?"

* With a bunch of people on HN shitting on how poorly, plus a bunch of other people saying it's a full-blown quantum computer, plus probably higher-ups trying to make you say it is a real quantum computer or something about quantum supremacy.

** Even in this classical regime, think how much effort went into redundancy and encoding/decoding schemes to deal with the very rare bit flips (a toy sketch of the classical trick is at the end of this comment). Now think of what's needed to build a functioning quantum computer at a similar scale.

No, I don't work in quantum computing, don't invest in it, and have no stake in it.
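To make that footnote about classical redundancy concrete, here's a toy sketch in Python (my own illustration, nothing from any real memory controller, and the flip probability is made up): a 3-bit repetition code with majority-vote decoding, which corrects any single flip per group. The quantum analogue is much harder partly because you can't copy an unknown quantum state (no-cloning), so quantum codes have to extract error syndromes without reading out the encoded data itself.

    import random

    # Toy classical error correction: a 3-bit repetition code.
    # Each logical bit is stored as three physical copies;
    # decoding is a majority vote over each group of three.

    P_FLIP = 0.01  # arbitrary per-bit flip probability, just for the demo

    def encode(bits):
        # Repeat every logical bit three times.
        return [b for b in bits for _ in range(3)]

    def noisy_channel(bits, p=P_FLIP):
        # Flip each physical bit independently with probability p.
        return [b ^ 1 if random.random() < p else b for b in bits]

    def decode(bits):
        # Majority vote over each group of three physical bits.
        out = []
        for i in range(0, len(bits), 3):
            group = bits[i:i + 3]
            out.append(1 if sum(group) >= 2 else 0)
        return out

    if __name__ == "__main__":
        random.seed(0)
        message = [random.randint(0, 1) for _ in range(10_000)]
        received = decode(noisy_channel(encode(message)))
        errors = sum(a != b for a, b in zip(message, received))
        print(f"logical errors after decoding: {errors} / {len(message)}")

A single flip per group is corrected for free here; only two or more flips in the same group of three get through, which is why the logical error rate ends up far below the raw 1% physical rate.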