Is this actually a good explanation? The introduction immediately made me pause:<p>> In classical computers, error-resistant memory is achieved by duplicating bits to detect and correct errors. A method called majority voting is often used, where multiple copies of a bit are compared, and the majority value is taken as the correct bit<p>No, in classical computers memory is protected with error-correcting codes, not by duplicating bits and majority voting. Duplicating bits would be a very wasteful strategy when you can add significantly fewer bits and achieve the same result, which is what error-correction techniques like ECC do. Maybe they confused it with logic circuits, where there isn't a more efficient strategy?
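To make the overhead difference concrete, here is a minimal sketch (not from the article; the function names and bit layout are just illustrative) comparing the two schemes on a single-bit fault: triple duplication with majority voting versus a Hamming(7,4) code, the simplest textbook ECC. Both fix the flip, but duplication spends 8 extra bits per 4 data bits while Hamming spends 3, and the gap widens for larger blocks (real ECC memory protects 64 data bits with roughly 8 check bits).

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4              # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4              # parity over positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4              # parity over positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Locate and repair a single flipped bit via the parity-check syndrome."""
    c = c[:]
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3    # 1-based position of the error; 0 = clean
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]    # recover the 4 data bits

def majority_vote_correct(copies):
    """Triple-redundancy decoding: take the bitwise majority of three copies."""
    return [1 if sum(bits) >= 2 else 0 for bits in zip(*copies)]

if __name__ == "__main__":
    data = [1, 0, 1, 1]

    codeword = hamming74_encode(data)     # 7 bits total: 3 extra bits
    codeword[4] ^= 1                      # simulate one bit flip in memory
    assert hamming74_correct(codeword) == data

    copies = [data[:], data[:], data[:]]  # 12 bits total: 8 extra bits
    copies[1][2] ^= 1                     # same single-bit fault
    assert majority_vote_correct(copies) == data

    print("Both schemes corrected the flip; Hamming used 3 extra bits vs 8.")
```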
Note: the paper they are referring to was published on August 27, 2024.<p><a href="https://arxiv.org/pdf/2408.13687" rel="nofollow">https://arxiv.org/pdf/2408.13687</a>
While I'm still eager to see where Quantum Computing leads, I've got a new threshold for "breakthrough": Until a quantum computer can factor products of primes larger than a few bits, I'll consider it a work in progress at best.
I'm someone not really aware of the consequences of each quantum of progress in quantum computing. But I know I'm exposed to QC risk: at some point I'll need to rotate every security key I've ever generated and replace every crypto algorithm that every piece of software uses.<p>How much closer does this work bring us to the Quantum Crypto Apocalypse? How much time do I have left before I need to start budgeting it into my quarterly engineering plan?