Mining started out with CPUs, then moved to GPUs, then FPGAs, and now ASICs. What's the next technology to make mining an order of magnitude or two more efficient? D-Wave computers, perhaps? Those cost quite a bit, though.
I'm curious where exactly this guy got the funding. ASICs are not cheap. Even at a µm process node, it would still probably come out to around $1M.