We only use 5% of our brain, etc.

Laurence Fiddick fiddick at lifesci.ucsb.edu
Sat Jan 22 04:49:19 EST 1994


In <2hn3gf$dlj at news.u.washington.edu> dfitts at u.washington.edu (Douglas Fitts) writes:

>>This may seem like a lot of memory, but let's look at it.  Suppose we
>>wanted to keep a record of everything we see.  To be concrete, suppose
>>we needed to store a 100K JPEG image every second.  (This is only a
>>tiny fraction of the information that comes in from the retina.)  It's
>>pretty straightforward to work out how long it would take to exhaust a
>>terabyte of storage: the time is on the order of one year, even if
>>storage only goes on during waking hours.
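
(A quick back-of-the-envelope check of that figure, just for reference -- the
100 KB per second and the waking-hours restriction are the assumptions quoted
above, with 16 waking hours per day taken as a round number:)

    # Rough check of the "terabyte in about a year" estimate.
    storage_bytes  = 1e12           # one terabyte
    frame_bytes    = 100e3          # assumed 100 KB JPEG stored each second
    waking_seconds = 16 * 3600      # assumed 16 waking hours per day

    seconds_to_fill = storage_bytes / frame_bytes        # 1e7 seconds
    days_to_fill    = seconds_to_fill / waking_seconds   # roughly 170 days
    print(round(days_to_fill))      # on the order of a year, as claimed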

>Seems to be straining a point to compare the human memory capacity with a
>lossless compression algorithm.  Why not store that image in a
>quasireliable 5K and forget about the next 30 sec or so of redundant data? 
>Maybe your memory's better than mine, wouldn't be hard, but I tend to 
>lose lots of detail.  In fact, I'd include the auditory and other senses 

This reminds me of something I had been wondering about. I would imagine that
the majority of information that could be stored in memory is practically
useless beyond some short time interval, or at least that its utility decays
as time passes. Still, some information will probably be of enduring utility
and hence should be preserved. For such information, how much redundancy of
stored information, or how many degenerate neurons, might be required to store
it accurately and protect it against accidental loss? More generally, how
often might parallel channels be used to increase the speed and accuracy of
single computations? I.e., are the vast numbers of neurons buying us
additional speed and accuracy in computation rather than quantity of
computation?
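
(As a toy illustration of the redundancy question -- a sketch in Python, not a
claim about how neurons actually store anything -- replicating a single bit
across several unreliable "units" and recalling it by majority vote drives the
recall error rate far below that of any single unit:)

    # Toy sketch: store one bit in n unreliable copies, recall by majority vote.
    # The copy count and per-copy failure rate are made-up illustration values.
    import random

    def store_and_recall(bit, copies=9, flip_prob=0.2):
        noisy = [bit ^ (random.random() < flip_prob) for _ in range(copies)]
        return int(sum(noisy) > copies / 2)   # majority vote over the copies

    trials = 10000
    errors = sum(store_and_recall(1) != 1 for _ in range(trials))
    print(errors / trials)   # around 0.02, versus 0.2 for a single copy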



