Computing started out in human minds. Then we used writing tools to work things out on paper, and then we invented machines to help us compute: from the abacus to analog machines to digital ones. Word sizes evolved from a single bit to four, eight, sixteen, thirty-two, and nowadays sixty-four. What an amazing journey it has been. I remember when 32-bit was all the rage. Nowadays 32-bit machines are no longer viable for dealing with data larger than 4 GiB.<p>So today we're at 64. 2^64 is not small for most of our computing needs. How long do you think it's going to stay viable for the masses? I'm hoping to spark interesting discussions, anecdotes, stories and whatnot from the HN crowd, perhaps throughout the weekend :)
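<p>To put rough numbers on that 4 GiB limit, here's a quick back-of-the-envelope in C (a sketch only, assuming a C99 compiler; it just computes 2^32 and 2^64):

    /* Back-of-the-envelope only; assumes a C99 compiler. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint64_t bytes_32 = (uint64_t)1 << 32;                 /* 2^32 bytes = 4 GiB  */
        long double bytes_64 = (long double)UINT64_MAX + 1.0L; /* 2^64 bytes = 16 EiB */

        printf("32-bit address space: %llu bytes\n", (unsigned long long)bytes_32);
        printf("64-bit address space: %.0Lf bytes\n", bytes_64);
        printf("that's %.0Lf times more headroom\n", bytes_64 / bytes_32);
        return 0;
    }

That's roughly 4 billion times the 32-bit address space, which is why I wonder how long it will take us to outgrow it.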
In a way they are already obsolete. If you had asked the top quantum physicists five years ago when they thought we'd have viable quantum computers, some of them predicted it would take about 50 years. Well, it's happening now. There will definitely be quantum computers for sale by the early 2020s, and I'm not talking about D-Wave here.
RISC-V has some 128-bit word-length support, or at least has encoding space reserved for it. Still, I doubt there will ever be a large enough demand/market to justify developing it. Even applications like crypto would probably use ASICs and FPGAs rather than burden a general-purpose CPU with all those extra wires, and 128-bit arithmetic can already be done in software on 64-bit CPUs (see the sketch below).<p>Never. Is never soon enough?
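<p>A minimal sketch of that software route, assuming GCC or Clang on a 64-bit target (the non-standard __int128 extension; the compiler lowers the 128-bit multiply to a few 64-bit instructions):

    /* Sketch only; assumes GCC or Clang, which provide __int128 on 64-bit targets. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Full 64x64 -> 128-bit multiply, done entirely on a 64-bit CPU. */
        unsigned __int128 x = (unsigned __int128)UINT64_MAX * UINT64_MAX;

        /* printf has no conversion for __int128, so print the two 64-bit halves. */
        printf("high: %llu  low: %llu\n",
               (unsigned long long)(x >> 64),
               (unsigned long long)(uint64_t)x);
        return 0;
    }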