๐Ÿ‘คblopeur๐Ÿ•‘8y๐Ÿ”ผ172๐Ÿ—จ๏ธ17

(Replying to PARENT post)

I've been wanting to know how the power efficiency of electronics compares to that of biological neurons, and this paper gives a clue. The most efficient hardware it mentions is the GV100 "Tensor Core", at 400 GFLOPS/W for FP16.

If a typical neuron requires 10^6 ATP per activation [1], if it takes about 30.5 kJ/mol to charge ATP [2], and if a typical neuron drives about 100 synaptic connections, each contributing one FLOP per activation, then I _think_ a human neuron is roughly 5,000 times as efficient as a GV100 [3], at about 2,000,000 GFLOPS per watt (the arithmetic is spelled out in the sketch after [3]).

[1] https://www.extremetech.com/extreme/185984-the-human-brains-...

[2] https://en.wikipedia.org/wiki/Adenosine_triphosphate

[3]

  Neuron:
  1E6 ATP = 1 activation = 100 FLOP
  30.5 kJ = 1 mole ATP = 6E23 ATP
  1 kJ = 0.28 Wh
  ...
  2E6 GFLOPS = 1 W

  GV100 "Tensor Core":
  4E2 GFLOPS = 1 W
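
Here's a minimal Python sketch of the same back-of-the-envelope arithmetic, assuming 10^6 ATP per activation, 30.5 kJ/mol to charge ATP, 100 FLOP per activation, and the 400 GFLOPS/W FP16 figure for the GV100; the constants and variable names are mine, not from the paper:

  AVOGADRO = 6.022e23            # molecules per mole
  JOULES_PER_MOLE_ATP = 30.5e3   # ~30.5 kJ/mol to charge ATP [2]
  ATP_PER_ACTIVATION = 1e6       # assumed ATP cost of one activation [1]
  FLOP_PER_ACTIVATION = 100      # assumed: ~100 connections, 1 FLOP each

  joules_per_atp = JOULES_PER_MOLE_ATP / AVOGADRO                 # ~5.1e-20 J
  joules_per_activation = joules_per_atp * ATP_PER_ACTIVATION     # ~5.1e-14 J
  joules_per_flop = joules_per_activation / FLOP_PER_ACTIVATION   # ~5.1e-16 J

  # 1 W = 1 J/s, so FLOP per joule is numerically the same as FLOPS per watt.
  neuron_gflops_per_watt = 1 / joules_per_flop / 1e9   # ~2e6 GFLOPS/W
  gv100_gflops_per_watt = 400                          # FP16 figure quoted above

  print(f"neuron: {neuron_gflops_per_watt:.1e} GFLOPS/W")
  print(f"vs GV100: {neuron_gflops_per_watt / gv100_gflops_per_watt:.0f}x")

With these inputs it prints roughly 2e6 GFLOPS/W for the neuron, about 5,000x the GV100 figure.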
๐Ÿ‘คneolefty๐Ÿ•‘8y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

There is also a paper from the same authors with roughly the same content:

Efficient Processing of Deep Neural Networks: A Tutorial and Survey, by Vivienne Sze, Yu-Hsin Chen, Tien-Ju Yang, and Joel Emer

https://arxiv.org/abs/1703.09039

๐Ÿ‘คnshm๐Ÿ•‘8y๐Ÿ”ผ0๐Ÿ—จ๏ธ0

(Replying to PARENT post)

Link seems broken!
๐Ÿ‘คShishram๐Ÿ•‘8y๐Ÿ”ผ0๐Ÿ—จ๏ธ0