
Brain utilization

Matt Jones jonesmat at ohsu.edu
Thu Feb 25 17:00:56 EST 1999


In article <36D3BEFD.33AA at uswest.net> David B. Held, dheld at uswest.net
writes:
>  I roughly estimated that if each neuron had on average, say 100
>connections, and each connection stored, say a 32-bit floating point
>value (the synaptic "weight" of the connection), and there were say, 10
>billion functional neurons in the brain, then you could say that the
>brain roughly has a capacity of 4 TeraBytes (TB) of data.  And if the
>average neuron could fire, say, 300 times per second, and you considered
>one firing event to be 99 summations plus a comparison operation,
>resulting in 100 floating-point operations, then you could say that the
>brain has a "peak operating capacity" of around 300 TeraFLOPS (TFLOPS).
>That's some pretty serious power!  I think that's several hundred of 
>the fastest supercomputers in existence.
>

David, 

Like everyone else who's responded, I think this 10% business is a very
poorly thought-out piece of folklore. But I like the spirit of your
calculation. Nobody really knows what the right numbers are, but yours
are reasonable at first glance. The ones I would adjust: increase the
number of connections by at least an order of magnitude, and reduce the
bit-depth of each connection to 1 or 2 bits (because the release of a
packet of transmitter at an individual release site follows roughly
binomial statistics, although the mean size of the unitary responses can
vary about two-fold if you consider postsynaptic mechanisms of
plasticity). The issue of summation and comparison is a little sticky,
because neuronal firing is probably a highly nonlinear function of
membrane potential, but plenty of computer models of brain function use
a roughly linear dependence (in fact, my feeling is that most models
force this assumption, because otherwise the computations become
intractable). Also, the average neuron may be _capable_ of firing at
300 Hz for short periods, but in recordings from awake animals the
average rate observed empirically is much slower, more like 5 Hz. Which
brings me to another point:
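Those back-of-the-envelope numbers are easy to sanity-check in a few
lines of Python (every figure below is a rough guess from the discussion
above, not a measurement):

```python
# Sanity-check of the rough estimates above; all numbers are guesses
# from the discussion, not measurements.

neurons = 10e9  # functional neurons in the brain

# David's numbers: 100 connections/neuron, 32 bits (4 bytes) per weight
capacity_david = neurons * 100 * (32 / 8)
print(f"David's storage estimate: {capacity_david / 1e12:.1f} TB")      # 4.0 TB

# Adjusted: ~10x more connections, but only ~2 bits per connection
capacity_adjusted = neurons * 1000 * (2 / 8)
print(f"Adjusted storage estimate: {capacity_adjusted / 1e12:.1f} TB")  # 2.5 TB

# Throughput: 100 floating-point ops per firing event
peak = neurons * 300 * 100  # 300 Hz burst rate -> "peak" capacity
mean = neurons * 5 * 100    # ~5 Hz observed in awake animals
print(f"Peak: {peak / 1e12:.0f} TFLOPS, typical: {mean / 1e12:.0f} TFLOPS")
```

Interestingly, the two adjustments roughly cancel for storage (2.5 TB
vs. 4 TB), while the observed firing rate cuts the throughput estimate
by a factor of 60.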

Neurons _can_ fire really fast sometimes, but apparently most of them
don't do this very often. That means that whatever information their
firing pattern contains is sparsely coded (i.e., the number of actual
spikes is a small fraction of the number of _possible_ spikes). This is
actually a very efficient way of storing and transmitting information.
For example, suppose the maximum sustainable rate was 100 Hz. Then, in a
100 ms window of time, the neuron could fire ten spikes, max (this is a
reasonable window of time to consider, because animals often have to make
decisions and initiate an action in roughly this time frame, give or
take). What kind of patterns could we expect to see in this window, and
how much information can be coded? Well, we have ten independent time
bins, in each of which exactly zero or one spikes can occur (we can't
have two spikes in a bin because of the refractory period). That means
that in any window, we can have anywhere from zero to ten spikes,
distributed in whatever order. Under a "rate coding" scheme, where the
information encoded is a function of the average number of spikes only,
we can encode only eleven possible patterns (the distinct spike counts,
zero through ten). A lot of people think this is how neurons work. But
under a "time coding" scheme, where the timing of each spike encodes
information, we can encode N!/(K!(N-K)!) different patterns, where N is
the number of bins (i.e., 10 bins in 100 ms at a maximum possible rate
of 100 Hz), K is the number of spikes fired (i.e., the mean rate in Hz
times the size of the window in seconds) and "!" means factorial (e.g.,
5! = 5 * 4 * 3 * 2 * 1).  Here's what you get for mean rates between 10
and 100 Hz:

rate, Hz     spikes/100 ms     # of possible codes
      10                 1                      10
      20                 2                      45
      30                 3                     120
      40                 4                     210
      50                 5                     252
      60                 6                     210
      70                 7                     120
      80                 8                      45
      90                 9                      10
     100                10                       1
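The table above can be checked in a couple of lines (math.comb computes
the binomial coefficient N!/(K!(N-K)!)):

```python
from math import comb

N = 10  # time bins in a 100 ms window at a 100 Hz maximum rate

print(f"{'rate, Hz':>8}  {'spikes':>6}  {'codes':>6}")
for K in range(1, N + 1):
    # C(N, K) distinct spike-timing patterns for K spikes in N bins
    print(f"{K * 10:>8}  {K:>6}  {comb(N, K):>6}")
```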


The number of possible codes is a bell-shaped (i.e., binomial) function
of how many spikes the cell fires. That is, if you really want to be able
to encode a lot of information, only fire at half your maximal rate.
Definitely DON'T fire at your maximal rate, because that's a big waste of
spikes (and thus of energy), and actually reduces your ability to encode
different patterns.
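Putting that in information-theoretic terms (my framing, not part of the
original argument): the log base 2 of the number of patterns gives bits
per 100 ms window, and it peaks at half the maximal rate:

```python
from math import comb, log2

N = 10  # bins per 100 ms window

# Bits encodable per window as a function of spike count K
bits = {K: log2(comb(N, K)) for K in range(1, N + 1)}
best = max(bits, key=bits.get)
print(f"best spike count: {best}, ~{bits[best]:.1f} bits/window")
# Half the maximal rate (5 spikes) carries ~8 bits; the maximal rate
# (10 spikes) carries 0 bits -- there is only one possible pattern.
```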

On the other hand, if energy is tight, maybe fire at even less than half
your maximal rate. If you have a bunch of neurons, you can divide up the
job of encoding a chunk of information between them, so that no one
neuron is using up a lot of energy in firing spikes, but altogether you
can still get the point across.
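A toy illustration of that sharing, assuming independent neurons and the
same 10-bin window as above: five neurons firing one spike each can
encode far more patterns than one neuron spending the same five spikes.

```python
from math import comb

N = 10  # bins per 100 ms window

# One neuron fires all 5 spikes itself:
solo = comb(N, 5)         # 252 patterns
# Five neurons fire 1 spike each (same total spike budget):
shared = comb(N, 1) ** 5  # 10**5 = 100,000 patterns
print(solo, shared)
```

(This assumes the population's spike patterns can be read out
independently, which is the generous case.)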


The lesson to be learned from this is: 

If you know what's good for you, you'll never use more than half your
brain.


Matt Jones



More information about the Neur-sci mailing list
