<a href="http://www.dwavesys.com/press-releases/d-wave-systems-breaks-1000-qubit-quantum-computing-barrier" rel="nofollow">http://www.dwavesys.com/press-releases/d-wave-systems-breaks...</a><p>This announcement, claims:<p><i>Every additional qubit doubles the search space of the processor. At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two.</i><p>Since we still aren't able to factor any large numbers with it, those 2^1000 bits don't really work like they say they do. I'm guessing there are many caveats behind their description.<p>I would appreciate any explanation from an expert.
These results seem a bit misleading - they're comparing their multi-million dollar system to a classical optimizer running single-threaded on a three-year-old processor (Intel Xeon E5-2670), and then saying 'look how much faster we are than a classical optimizer!'<p>How does this compare to a giant cluster of new Xeons with faster interconnects?
I think the most significant gain made here is not speed but power consumption. On <a href="http://www.dwavesys.com/d-wave-two-system" rel="nofollow">http://www.dwavesys.com/d-wave-two-system</a> they compare a supercomputer using almost 2 GW while the 2X uses 27 kW when you factor in "the fridge". I don't know if that's an apples-to-apples comparison - I'm hoping the supercomputer is set up to solve comparable problems. If I am understanding this properly they have a great product on their hands and a great many units to sell.
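Taking those two figures at face value (and they may well not be apples-to-apples), the ratio works out to roughly four orders of magnitude:

```python
# Figures quoted above; the 2 GW number is from D-Wave's own comparison page.
supercomputer_watts = 2e9   # "almost 2 GW"
dwave_2x_watts = 27e3       # 27 kW, including the fridge
print(round(supercomputer_watts / dwave_2x_watts))  # → 74074
```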
Didn't we just read about the NSA moving to quantum-resistant algorithms a few days ago?<p><a href="https://news.ycombinator.com/item?id=10064226" rel="nofollow">https://news.ycombinator.com/item?id=10064226</a> - NSA announces plans for transitioning to quantum resistant algorithms<p>Also, I found this from a few hours ago:
<a href="http://arstechnica.com/security/2015/08/nsa-preps-quantum-resistant-algorithms-to-head-off-crypto-apocolypse/" rel="nofollow">http://arstechnica.com/security/2015/08/nsa-preps-quantum-re...</a>
What language is used to program this processor? I wouldn't think you'd take the same approach programming a quantum computer as you would a conventional computer. Is there a quantum computer simulator? What does the development workflow look like? How do you debug? Boy, I have a lot of questions!
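From what I've read, you don't program a D-Wave-style annealer imperatively at all: you encode your problem as an Ising/QUBO energy minimization and the hardware searches for a low-energy assignment of the binary variables. A toy classical brute-force sketch of the same formulation (the helper names and the tiny Q matrix are mine, just to illustrate the shape of the problem you hand over):

```python
from itertools import product

def qubo_energy(Q, x):
    """Energy of binary assignment x under a QUBO given as {(i, j): weight}."""
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

def brute_force_qubo(Q, n):
    """Exhaustively find a lowest-energy assignment (only feasible for tiny n;
    the annealer's job is to do this for ~1000 variables)."""
    return min(product((0, 1), repeat=n), key=lambda x: qubo_energy(Q, x))

# Toy problem: reward setting each variable, penalize setting both,
# i.e. a 2-variable "choose exactly one" constraint.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}
print(brute_force_qubo(Q, 2))  # → (0, 1), energy -1
```

So the "programming language" question mostly becomes a modeling question: how to map your real problem onto such an energy function, subject to the chip's connectivity constraints.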
There was a paper in 2008 claiming that adiabatic quantum computers could factor with a quantum speedup. <a href="http://arxiv.org/abs/0808.1935" rel="nofollow">http://arxiv.org/abs/0808.1935</a>