I've been wanting to know how electronics' power efficiency compares to biological neurons, and this paper gives a clue. The most efficient hardware it mentions is the GV100 "Tensor Core", at 400 GFLOPS/W for FP16.<p>If a typical neuron requires 10^6 ATP per activation [1], if ATP stores about 30.5 kJ/mol [2], and if a typical neuron has about 100 synaptic connections (a neuron has one axon, but many synapses), each contributing one FLOP per activation, then I _think_ a human neuron is about <i>5,000 times</i> as efficient as a GV100 [3], at roughly 2,000,000 GFLOPS per watt.<p>[1] <a href="https://www.extremetech.com/extreme/185984-the-human-brains-remarkably-low-power-consumption-and-how-computers-might-mimic-its-efficiency" rel="nofollow">https://www.extremetech.com/extreme/185984-the-human-brains-...</a><p>[2] <a href="https://en.wikipedia.org/wiki/Adenosine_triphosphate" rel="nofollow">https://en.wikipedia.org/wiki/Adenosine_triphosphate</a><p>[3]<p><pre><code> Neuron:
1E6 ATP = 1 activation = 100 FLOP
30.5 kJ = 1 mole ATP = 6E23 ATP
1 activation = 1E6 * 30.5E3 J / 6E23 = 5.1E-14 J
100 FLOP / 5.1E-14 J = 2E15 FLOP/J = 2E15 FLOPS/W
2E6 GFLOPS = 1 W

GV100 "Tensor Core":
4E2 GFLOPS = 1 W</code></pre>
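<p>The arithmetic above can be checked with a few lines of Python; the constants are just the assumed figures from the comment (1E6 ATP per activation, 30.5 kJ/mol, ~100 synaptic FLOPs per activation, 400 GFLOPS/W for the GV100), not measured values:

```python
# Back-of-envelope check of the neuron-vs-GV100 efficiency estimate.
# All inputs are the assumptions stated in the comment, not measurements.

AVOGADRO = 6e23               # ATP molecules per mole (rounded, as above)
ATP_PER_ACTIVATION = 1e6      # assumed ATP cost of one activation [1]
J_PER_MOL_ATP = 30.5e3        # ~30.5 kJ/mol stored in ATP [2]
FLOP_PER_ACTIVATION = 100     # assumed: ~100 synapses, one FLOP each
GV100_GFLOPS_PER_W = 400      # FP16 Tensor Core figure from the paper

joules_per_activation = ATP_PER_ACTIVATION * J_PER_MOL_ATP / AVOGADRO
# FLOP per joule is the same thing as FLOPS per watt
gflops_per_watt = FLOP_PER_ACTIVATION / joules_per_activation / 1e9

print(f"neuron: {gflops_per_watt:.1e} GFLOPS/W")          # ~2.0e6
print(f"vs GV100: {gflops_per_watt / GV100_GFLOPS_PER_W:.0f}x")  # ~5000x
```

The Wh conversion in the original isn't needed, since FLOPS per watt and FLOP per joule are the same unit.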
There is also a paper from the same authors with about the same content:<p>Efficient Processing of Deep Neural Networks: A Tutorial and Survey
Vivienne Sze, Yu-Hsin Chen, Tien-Ju Yang, Joel Emer<p><a href="https://arxiv.org/abs/1703.09039" rel="nofollow">https://arxiv.org/abs/1703.09039</a>