Slightly off topic, but it's widely accepted among physicists that the act of computation expends energy [1]. Thus, there are real limits to how far the energy cost of a given computation can be reduced, regardless of how cleverly we build the computer, or what we build it out of (silicon, DNA, fiber optics, whatever) [2].

[1] http://en.wikipedia.org/wiki/Landauer%27s_principle

[2] If we're willing to use algorithms that don't destroy information, or to operate at arbitrarily low temperatures, then as I understand it there's no theoretical limit to how small we can make the energy costs, but those restrictions seem highly impractical.
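For a sense of scale, Landauer's bound works out to k_B * T * ln(2) of heat per bit erased, which is a minuscule number at room temperature. A rough back-of-the-envelope sketch (assuming T = 300 K and the standard Boltzmann constant; the 10^15 erasures/s figure is just an illustrative rate I picked):

    import math

    # Landauer's principle: erasing one bit of information dissipates at
    # least k_B * T * ln(2) of energy as heat (see [1]).
    K_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # assumed room temperature, K

    e_per_bit = K_B * T * math.log(2)  # ~2.87e-21 J per erased bit
    print(f"Landauer limit at {T} K: {e_per_bit:.3e} J per bit erased")

    # Illustrative comparison: erasing 10^15 bits per second at this limit
    # would dissipate only ~3 microwatts -- real chips sit many orders of
    # magnitude above this floor.
    print(f"10^15 bit erasures/s: {e_per_bit * 1e15:.3e} W")

So the limit is real but extremely far below where today's hardware operates.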
As someone who has done extensive work in image processing using custom hardware, I am not really sure what he is talking about. Is this intended to suggest that software is cheaper than hardware? Or that it has performance advantages over specialized hardware? Not sure.

It's tough to beat smartly-designed specialized hardware in image processing. Some of the things I've done would require ten general-purpose computers running in parallel to accomplish what I did in a single $100 chip. So, yes: lower cost, higher data rate, reduced thermal load, reduced physical size, lower power requirements, etc.

Maybe I don't get where he is going with this?