<i>"Quantum computers operate at speeds unattainable by even today’s most powerful supercomputers, operations that are so fast, they can process millions of calculations in a fraction of the months, even years, traditional computers take"</i><p>God <i>damn</i> I hate bullshit lines like this used about QCs. The whole article gives the usual misleading impression of QCs being generically faster than normal computers.<p>They're not.<p>They can answer <i>some</i> problems much, much faster than traditional non-QC computers because they can run classes of algorithm that rely on quantum effects.<p>Don't get me wrong - that's a pretty darn useful subset of problems... the future of QCs is full of rosy cool stuff... but this isn't just like upping the clock cycles of a CPU.<p>It doesn't make everything faster. Completely different classes of constraint are being tweaked.<p>QCs aren't going to make everybody's laptop or smartphone rilly rilly fast.
D-Wave's advancements are very interesting to me for a few reasons. The first is that initially many people suspected D-Wave was a scam, because the most successful research efforts in quantum computing used just a few qubits, and D-Wave claimed a massive improvement (something like 128 or 256). Scott Aaronson (<a href="http://www.scottaaronson.com" rel="nofollow">http://www.scottaaronson.com</a>) was perhaps the most vocal critic. Over the years, the criticism has softened, and D-Wave has managed to get a paper or two into Nature. I think the truth of what they've achieved is somewhat less than what their marketing machine would like to suggest, but it's nevertheless very impressive (and D-Wave is certainly a place I'd like to work at if I could).<p>To clarify, D-Wave has not developed a general-purpose quantum computer, and in fact the term "general purpose" is kind of ill-defined for quantum computing anyway. Right now, there are a lot of different quantum effects that are used in different ways to accomplish specific tasks. I believe D-Wave's device uses quantum annealing to solve certain optimization problems, but someone check me if I'm wrong.<p>The little I do know about quantum computing relates to my area of study: simulation. The computation required to exactly solve the Schrödinger equation scales as 2^N with the number of particles (or whatever basis the equation is set in). Even the largest supercomputers are incapable of handling more than a few atoms [which, incidentally, is actually what I'm attempting to accomplish right now for a project that I should be working on instead of posting on here...]. Anyway, with quantum computers, the scaling would be O(N) instead of O(2^N), so you could perform incredibly accurate simulations that reach chemical accuracy.
Chemical accuracy is kind of the holy grail of simulation, because what it means is that you can predict actual, macroscopic chemical properties of a variety of substances without doing any real-world experiments whatsoever. I believe it has been accomplished for things like pure hydrogen and quite a few bosonic systems (bosons are easier to simulate since they don't suffer from the fermion sign problem - <a href="http://en.wikipedia.org/wiki/Numerical_sign_problem" rel="nofollow">http://en.wikipedia.org/wiki/Numerical_sign_problem</a>).<p>Anyway, I probably sound like I know more than I really do, but hopefully this gives you an idea of the kinds of applications a real, working quantum computer could achieve.
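To make the 2^N point above concrete, here's a back-of-envelope sketch in Python (the 16-bytes-per-amplitude figure assumes complex128 storage; the numbers are purely illustrative and nothing here is specific to D-Wave's hardware):

```python
# Exact classical simulation stores one complex amplitude per basis state,
# and a system of N two-level particles has 2**N basis states.

def state_vector_bytes(n_particles: int) -> int:
    """Memory for the full wavefunction, at 16 bytes per complex128 amplitude."""
    return (2 ** n_particles) * 16

for n in (10, 30, 50):
    gib = state_vector_bytes(n) / 2 ** 30
    print(f"{n:>2} particles -> {gib:,.1f} GiB")
# 30 particles already need 16 GiB; 50 need ~16.8 million GiB (about 16 PiB).
```

Thirty particles saturate a desktop's RAM, and each additional particle doubles the requirement - which is exactly why the hoped-for O(N) quantum scaling would be such a big deal.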
This part of the article makes me facepalm so hard, as someone who knows a little about quantum computing and a lot more about computer vision:<p>> Quantum computers operate at speeds unattainable by even today’s most powerful supercomputers, operations that are so fast, they can process millions of calculations in a fraction of the months, even years, traditional computers take.<p>Quantum computers can carry out some algorithms that normal computers can't, and those can be much faster, but they're not usable for general computing, so this statement makes no sense.<p>> They can even be taught and can recognize objects in images, a task standard computers struggle with.<p>Er... what? I wouldn't say that standard computers can do vision easily, but it's a problem of finding the right algorithms, not computing power.
$10 million is <i>cheap</i>. True, it only does discrete optimization problems, but I think you could make your money back in a few years renting it out at $5k/hour and consulting on translating problems to that domain.<p>Will some clients be wildly overpaying for something they could do equally well on regular computers? Sure, and they'll love every $ of it because of the bragging rights. Is this an efficient use of the hardware? Certainly not, it'll probably be exploiting <1% of the system's potential. Doesn't matter. People will frequently pay more for novelty than actual utility. If their vanity subsidizes the tiny subset of research computation that would have serious economic benefits, I call that a win-win.
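For what it's worth, the break-even arithmetic on that rental idea (the $10M and $5k/hour figures are from the comment above; the billable-hours assumption is my own):

```python
# Hours of rental needed to recoup the purchase price.
price_usd = 10_000_000
rate_usd_per_hour = 5_000
hours = price_usd / rate_usd_per_hour
print(hours)  # 2000.0

# At an assumed ~20 billable hours/week, that's 100 weeks -- roughly two years.
weeks = hours / 20
print(weeks)  # 100.0
```

So even at modest utilization, "a few years" checks out, before counting the consulting revenue.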
Lockheed has been working with D-Wave for a while. The fact that they decided to actually buy one of their computers probably means they liked what they were seeing.
I wonder if Google will buy this version, too. I think they've been working with D-wave for a few years now.<p><a href="http://phys.org/news180107947.html" rel="nofollow">http://phys.org/news180107947.html</a>
So I am in my final year of high school now, and I'm not studying physics next year. The only quantum physics we covered was emission spectra and light interference. Quantum physics sounds _really_ interesting though, so I'd love it if anyone could point me in the right direction on where and how to start learning this stuff. Thanks :D
Did anyone try their developer portal? I'm excited to hear that they'll put their quantum computer online after the beta. <a href="http://www.dwavesys.com/en/dev-portal.html" rel="nofollow">http://www.dwavesys.com/en/dev-portal.html</a>
"500,000 times faster than its predecessor"<p>How much power do quantum computers use and when will they be affordable enough to put in a smartphone/tablet?<p>And why isn't any other company doing this?