This immediately reminded me of when, back in 2000, Intel predicted 10GHz processors by 2011:

http://www.geek.com/articles/chips/intel-predicts-10ghz-chips-by-2011-20000726/

Ever-lower-energy computing is obviously happening, but there seem to be lower bounds on how little power a useful device needs. Wireless networking will always require a reasonable amount of power to achieve any reasonable distance and reliability.
> Accurately measuring geospatial location via GPS, making a phone call, or playing a game is meaningful.

Starting with a definition like that, it's not hard to see how the author concludes that near-zero-energy computing isn't possible. But a more modest definition -- for example, compute-enabled "smart" versions of already-existing products -- may make that vision possible. Thinking of sensors as a form of meaningful computation expands the range even more.

For example, I've seen switches that harvest enough energy from the act of pressing them to communicate their change in status to the controlled device. Now consider coupling that energy to some logic: maybe a single light switch dynamically determines which of several lights you want to switch on or off in the space. (A rough back-of-the-envelope check of this example is sketched below.)

The real barrier to "ubiquitous meaningful low-energy computation" is that the marginal extra energy for adding computation to an existing device must be small compared to the energy that device already draws. The classic example is automobiles, which have been acquiring more and more sensors and internal control logic over the years. As logic components reach lower energy consumption, why shouldn't those possibilities jump to other devices?

The following related analysis was posted to HN a while ago:
<a href="http://www.antipope.org/charlie/blog-static/2012/08/how-low-power-can-you-go.html" rel="nofollow">http://www.antipope.org/charlie/blog-static/2012/08/how-low-...</a>
> Looks great, but ignores the fact that transistors don't scale like they used to. Remember, the point of near-threshold voltage and the research into replacing silicon is intended to move the bar forward bit by bit, not to re-enable the classic Dennard scaling of the 1980s and 1990s. That era is gone, and nothing short of a miracle material that fulfills all the roles of silicon will ever bring it back.

Incremental changes in architecture do not have to equate to incremental changes in capability. Replacing silicon with a different substrate can introduce new time complexities for old problems. For example, mapping neural models onto memristors could scale better than mapping the same models onto traditional silicon. Mapping quantum physics models onto qubits will scale better than mapping the same models onto silicon. Mapping protein folding onto protein-based computers could... and so on.
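To make the qubit case concrete with a hedged sketch (the 16-bytes-per-amplitude constant is just an assumed storage cost for illustration): a classical state-vector representation of n qubits needs about 2^n amplitudes, while the quantum device needs n physical qubits.

    # Illustrative scaling only: a classical state-vector simulation of n qubits
    # stores 2**n complex amplitudes; a quantum device uses n physical qubits.
    BYTES_PER_AMPLITUDE = 16  # assumed complex128 storage per amplitude

    def classical_bytes(n_qubits):
        return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

    for n in (10, 30, 50):
        print("%d qubits: %.3e GiB classically, %d qubits natively"
              % (n, classical_bytes(n) / 2**30, n))

Same problem, radically different resource curve when the substrate matches the model.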
What happened to the magical PixelQi screens (http://pixelqi.com/) that we have been promised for the last three years? Supposedly fully reflective, full-color, fast-refreshing LCD screens with very low power consumption. It seems all that's available is a screen you have to hack into a very limited number of netbook models yourself. Low-power screens would make a far greater impact than any advance in low-power processing. I can't understand why Apple or somebody else isn't all over these.

Even if the color reproduction is terrible, imagine the benefits of being able to write code in the park. We'd be seeing a lot of very tan developers.
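For a rough sense of why the screen dominates here (all numbers below are assumed for illustration, not measured for any particular machine), compare battery life with a backlit LCD versus a reflective screen under a light editing workload:

    # Assumed figures for illustration only.
    battery_wh       = 40.0   # netbook-class battery
    backlit_lcd_w    = 2.5    # backlit LCD at comfortable brightness
    reflective_w     = 0.2    # reflective screen in daylight (no backlight)
    rest_of_system_w = 4.0    # CPU, RAM, storage, radios under a light editing load

    for name, screen_w in (("backlit LCD", backlit_lcd_w), ("reflective screen", reflective_w)):
        hours = battery_wh / (screen_w + rest_of_system_w)
        print("%s: ~%.1f h" % (name, hours))

Under these assumptions the screen is the single knob that moves the hours the most for this kind of use.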
I think the author may have misunderstood the claim. Intel was probably referring literally to low-power computing (what happens inside Intel's chips), not to low-power anything-you-can-do-with-a-computer (display, communicating with others, etc.). In other words, the processor.

Nearly all the power and heat problems in processors have to do with impedance mismatches between materials in the circuit. It's been about 5 years since I went to a conference called Beyond Moore's Law, but I remember a brilliant talk on a 5-6 order of magnitude decrease in power that is possible through impedance matching. (I couldn't find a link online, sorry!)

I suspect (rather arrogantly, since I have not seen Intel's article directly) that this is what Intel was talking about.
I've wondered if mobile devices will reach the point of such low power consumption that they can be powered by all the radio wave energy already enveloping us.
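A quick order-of-magnitude sanity check (all figures are assumptions; ambient RF density varies enormously by location) suggests the gap is still very large:

    # Assumed order-of-magnitude figures; ambient RF density varies wildly.
    harvestable_w_per_cm2 = 1e-7   # usable ambient RF, ~0.1 uW/cm^2
    antenna_area_cm2      = 50.0   # phone-sized collection area
    device_budget_w       = 10e-3  # a very frugal mobile device, ~10 mW

    harvested_w = harvestable_w_per_cm2 * antenna_area_cm2
    print("harvested ~%.1f uW vs needed ~%.0f mW" % (harvested_w * 1e6, device_budget_w * 1e3))
    print("shortfall: ~%.0fx" % (device_budget_w / harvested_w))

So it probably only works for devices whose duty cycle and feature set are pared down to almost nothing, not for anything phone-like.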
Reminds me of this article by SF writer Charlie Stross: http://www.antipope.org/charlie/blog-static/2012/08/how-low-power-can-you-go.html
I remember seeing videos of some lectures by Hal Abelson on a project he was working on (in Scheme, of course) about a network of processing units sharing information and distributing computations.
I remember reading a lecture by Feynman where he discusses reversible computing and suggests that you can get processing down to insanely low power usage using those methods.

I couldn't find a link to the lecture in question, but here is some recent research that gives an overview: http://ercim-news.ercim.eu/en79/special/micropower-towards-low-power-microprocessors-with-reversible-computing
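For context, the floor that reversible computing tries to duck under is Landauer's bound of k*T*ln(2) per bit erased; here is a small sketch comparing it with an assumed (order-of-magnitude, not process-specific) conventional switching energy:

    import math

    K_B = 1.380649e-23        # Boltzmann constant, J/K
    T = 300.0                 # room temperature, K
    landauer_j = K_B * T * math.log(2)

    typical_switch_j = 1e-15  # assumed ~1 fJ per conventional logic operation

    print("Landauer limit at 300 K: %.2e J per bit erased" % landauer_j)
    print("assumed conventional switch: %.0e J (~%.0fx above the limit)"
          % (typical_switch_j, typical_switch_j / landauer_j))

Conventional logic sits several orders of magnitude above that floor, and reversible logic can in principle go below it because nothing is erased.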