Oh hey! This looks *very* interesting!

> Transient phenomena associated with forward biased silicon p+-n-n+ structures at 4.2K show remarkable similarities with biological neurons. The devices play a role similar to the two-terminal switching elements in Hodgkin-Huxley equivalent circuit diagrams. The devices provide simpler and more realistic neuron emulation than transistors or op-amps. They have such low power and current requirements that they could be used in massive neural networks. Some observed properties of simple circuits containing the devices include action potentials, refractory periods, threshold behavior, excitation, inhibition, summation over synaptic inputs, synaptic weights, temporal integration, memory, network connectivity modification based on experience, pacemaker activity, firing thresholds, coupling to sensors with graded signal outputs and the dependence of firing rate on input current. Transfer functions for simple artificial neurons with spiketrain inputs and spiketrain outputs have been measured and correlated with input coupling.
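That list of properties maps almost one-to-one onto a leaky integrate-and-fire model. Here's a minimal sketch of the behavior they describe (threshold, refractory period, firing rate rising with input current); the parameters are arbitrary round numbers of mine, not values from the paper, and this models the behavior, not the cryogenic device physics:

```python
# Leaky integrate-and-fire sketch -- illustrative only, parameters are made up.
def lif_firing_rate(i_in, v_thresh=1.0, tau=10e-3, r=1.0,
                    t_refrac=2e-3, dt=0.1e-3, t_sim=1.0):
    """Return firing rate (Hz) for a constant input current i_in."""
    v, spikes, refrac = 0.0, 0, 0.0
    for _ in range(int(t_sim / dt)):
        if refrac > 0:                   # refractory period: ignore input
            refrac -= dt
            continue
        v += dt / tau * (-v + r * i_in)  # leaky temporal integration
        if v >= v_thresh:                # threshold crossing -> action potential
            spikes += 1
            v = 0.0                      # reset
            refrac = t_refrac
    return spikes / t_sim

# Sub-threshold input never fires; above threshold, rate grows with current:
for i in [0.5, 1.1, 2.0, 4.0]:
    print(f"I = {i:.1f}: {lif_firing_rate(i):.0f} Hz")
```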
Another interesting quote:

> We estimate that a system with 10^11 active 10μm x 10μm elements (comparable to the number of neurons in the brain) all firing with an average pulse rate of 1KHz (corresponding to a high neuronal firing rate) would consume about 50 watts. The quiescent power drain for this system would be 0.1 milliwatts.

Note that they are referring to 10μm process technology. Modern state-of-the-art technology would probably get the power consumption of such a brain-scale system down to under a single watt.
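As a sanity check on the quoted estimate (my arithmetic, using only the figures in the quote): 50 W over 10^11 elements firing at 1 kHz works out to about 0.5 pJ per spike, and the quiescent figure to about 1 fW per element:

```python
# Back-of-envelope check of the quoted estimate; constants are the quoted figures.
n_elements = 1e11    # active elements (~ number of neurons in the brain)
rate_hz    = 1e3     # average firing rate per element (1 kHz)
p_active   = 50.0    # quoted total active power, watts
p_quiesc   = 0.1e-3  # quoted quiescent power, watts

print(f"energy per spike:      {p_active / (n_elements * rate_hz):.1e} J")  # ~5e-13 J
print(f"quiescent per element: {p_quiesc / n_elements:.1e} W")              # ~1e-15 W
```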
Wow, super interesting. I'm assuming this would be a fixed network though? Could you adapt this hardware to change the weights and have the network learn (rather than passively interpret input data)?