<i>Accurately measuring geospatial location via GPS, making a phone call, or playing a game is meaningful.</i><p>Starting with a definition like that, it's not hard to see how the author concludes that near-zero-energy computing isn't possible. But a more modest definition (for example, compute-enabled "smart" versions of already-existing products) may make that vision achievable. Treating sensors as a form of meaningful computation expands the range even further.<p>For example, I've seen switches that harvest enough energy from the act of pressing them to communicate their change in status to the controlled device. Now consider coupling that energy to some logic: perhaps a single light switch dynamically determines which of several lights in the space you want to switch on or off.<p>The real barrier to "ubiquitous meaningful low-energy computation" is that the marginal energy of adding computation to an existing device must be small compared to the energy that device already draws. The classic example is the automobile, which has been acquiring more and more sensors and internal control logic over the years. As logic components reach ever-lower energy consumption, why shouldn't those possibilities jump to other devices?<p>The following related analysis was posted to HN a while ago:
<a href="http://www.antipope.org/charlie/blog-static/2012/08/how-low-power-can-you-go.html" rel="nofollow">http://www.antipope.org/charlie/blog-static/2012/08/how-low-...</a>
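<p>The marginal-energy argument can be made concrete with a back-of-envelope sketch. All the wattage figures below are my own rough assumptions for illustration, not measurements:<p><pre><code>
```python
# Back-of-envelope: what fraction of a device's existing power draw would
# adding a small computer cost? All figures are illustrative assumptions.

CAR_DRAW_W = 1_000.0   # rough electrical load of a car's systems (assumed)
BULB_DRAW_W = 9.0      # a typical LED bulb (assumed)
MCU_DRAW_W = 0.001     # low-power microcontroller, ~1 mW active (assumed)

def marginal_fraction(device_w: float, added_w: float) -> float:
    """Fraction of the device's existing draw that the added logic costs."""
    return added_w / device_w

print(f"car:  {marginal_fraction(CAR_DRAW_W, MCU_DRAW_W):.1e}")
print(f"bulb: {marginal_fraction(BULB_DRAW_W, MCU_DRAW_W):.1e}")
```
</code></pre><p>Under these assumptions the added logic is on the order of a millionth of a car's draw and a ten-thousandth of a bulb's, which is why computation spread to cars first and why ever-cheaper logic should let it spread further down.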