Structure of the Optical Nerves ??

Hawley K. Rising III rising at
Sun Sep 10 13:04:08 EST 1995

In message <42pq5s$agg at> - Eugene Leitl 
<ui22204 at> writes:
>On 7 Sep 1995, Jan Vorbrueggen wrote:
>> In article <42ldqb$n7b at> rising at (Hawley K. Rising III) writes:
>>    >The problem is: if the retina's gone, you have to mimic
>>    >its function with a circuit/algorithm (I totally ignore interfacing
>>    >difficulties here). The information compression factor is 126:1, a lot
>>    >of processing horsepower (well beyond any current/near-future
>>    >supercomputer, according to Moravec. And he's too optimistic, imo).
>>    Why is this information compression, and why is it beyond a supercomputer?
>>    I've seen no information which indicates that it goes 126:1 *retaining* all
>>    the information, only that it's 126:1.
>It _is_ information compression since the bit rate goes down. 
>I have not implied the compression to be lossless. Lossy compression
>does not indicate the necessary amount of computation to be smaller.

Actually, it can just be information loss.  There is even a good reason to
have such loss: averaging over several inputs would improve robustness and
reduce noise.
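The noise benefit of averaging can be sketched numerically (my illustration, not from the thread; the fan-in of 126 and the noise level are made-up stand-ins for receptor inputs):

```python
import random
import statistics

random.seed(0)
true_value = 1.0
n_inputs = 126  # hypothetical fan-in, echoing the 126:1 figure in the thread

# Noise of a single reading vs. the average of n_inputs noisy readings:
single = [true_value + random.gauss(0, 0.5) for _ in range(1000)]
averaged = [statistics.mean(true_value + random.gauss(0, 0.5)
                            for _ in range(n_inputs))
            for _ in range(1000)]

print(statistics.stdev(single))    # spread of one noisy input
print(statistics.stdev(averaged))  # roughly 1/sqrt(126) as large
```

Averaging N independent noisy inputs shrinks the noise standard deviation by about sqrt(N), which is why discarding per-input detail in exchange for a pooled signal can be worthwhile.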

>I thought this was obvious. 

No, it's not obvious.  I can take an image down 126:1 with almost no
computation, and with no real compression of information either: just keep one
of every 126 inputs.  "Compression" implies that the information is retrievable.
I'm sorry to belabor the point, but if you *know* it's doing compression,
and that it's a significant piece of information processing, I'd like very
much to be referred to sources (I'm not joking or throwing down a gauntlet;
if 126:1 image compression is a computation of the retinal ganglia, I want to
read about it).
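To make the point concrete, here is a toy sketch (my example, not from the thread) of the "almost no computation" route: a 126:1 rate reduction by plain decimation, which discards information rather than compressing it:

```python
# Toy "compression" by decimation: keep one of every 126 inputs.
inputs = list(range(126 * 100))       # 12600 samples standing in for receptor outputs
decimated = inputs[::126]             # 126:1 rate reduction, trivial computation

print(len(inputs) // len(decimated))  # 126 -- the bit rate drops 126:1
# The 125 skipped samples between each kept one are unrecoverable:
# this is information loss, not compression in the retrievable sense.
```

A bit-rate drop alone therefore says nothing about computational cost; the interesting claim would be that the retina computes a representation from which the image is (approximately) recoverable.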

Hawley Rising
rising at

More information about the Neur-sci mailing list