Structure of the Optical Nerves ??

Hawley K. Rising III rising at a.crl.com
Sun Sep 10 13:04:08 EST 1995


In message <42pq5s$agg at mserv1.dl.ac.uk> - Eugene Leitl 
<ui22204 at sunmail.lrz-muenchen.de> writes:
>On 7 Sep 1995, Jan Vorbrueggen wrote:
>
>> In article <42ldqb$n7b at nntp.crl.com> rising at a.crl.com (Hawley K. Rising III) writes:
>> 
>>    >The problem is: if the retina's gone, you have to mimic
>>    >its function with a circuit/algorithm (I totally ignore interfacing
>>    >difficulties here). The information compression factor is 126:1, which
>>    >takes a lot of processing horsepower (well beyond any current/near-future
>>    >supercomputer according to Moravec, and he's too optimistic, imo).
>> 
>>    Why is this information compression, and why is it beyond a supercomputer?
>>    I've seen no information which indicates that it goes 126:1 *retaining* all
>>    the information, only that it's 126:1.
>
>It _is_ information compression, since the bit rate goes down.
>I have not implied that the compression is lossless. Lossy compression
>does not mean that the necessary amount of computation is any smaller.

Actually, it can just be information loss.  There is even a good reason to
have such loss: averaging over several inputs would improve robustness and
accuracy.
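
A rough sketch of what I mean (Python, with a made-up block of 126 noisy
readings of the same underlying value; the numbers are illustrative, not
retinal data): averaging discards the individual samples, but the one value
you keep is a more robust estimate of the signal.

    import random

    def average_block(samples):
        # One number summarizes the whole block: the individual samples
        # are gone, but independent noise shrinks by roughly sqrt(n).
        return sum(samples) / len(samples)

    # 126 made-up noisy readings of an underlying value of 1.0
    readings = [1.0 + random.gauss(0, 0.1) for _ in range(126)]
    print(average_block(readings))   # close to 1.0; the noise mostly cancels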

>
>I thought this was obvious. 
>

No, it's not obvious.  I can take an image down 126:1 with almost no
computation, and with no real compression of information either: just keep
one of every 126 inputs (there is a sketch of this below).  Compression
implies that the information is somehow retrievable.  I'm sorry to belabor
the point, but if you *know* the retina is doing a compression, and that
it's a significant piece of information processing, I'd very much like to
be referred to sources (I'm not joking or throwing down a gauntlet; if
126:1 image compression is a computation performed by the retinal ganglia,
I want to read about it).
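
To make the "almost no computation" point concrete, here is the trivial
reduction I have in mind, sketched in Python on an invented input list
(this is not meant as a model of what the ganglion cells actually do):

    def decimate(inputs, factor=126):
        # Keep one sample out of every `factor`; the other 125 are simply lost.
        return inputs[::factor]

    signal = list(range(1260))           # ten made-up blocks of 126 samples
    reduced = decimate(signal)
    print(len(signal), len(reduced))     # 1260 in, 10 out: a 126:1 rate drop

If that were all the retina did, it would be a rate reduction, not a
significant piece of information processing.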

Hawley Rising
rising at a.crl.com




