Consciousness ~=~ self-referentiality (was Re: Consciousness, New Thinking About)

Jim Chinnis jchinnis at
Thu Jun 6 13:35:09 EST 2002

jonesmat at (Matt Jones) wrote in part:

>- A perfectly efficient information coding scheme will produce signals
>that appear entirely random.
>Technically, he was really talking about the Fourier spectrum of the
>signal being Gaussian white noise, but that means the same thing as
>what I just said above.
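Matt's claim can be illustrated by compressing ordinary text: an efficient code strips out statistical structure, so the compressed bytes look far closer to uniform randomness than the original. A minimal sketch in Python (my own illustration, not from Matt's post), measuring empirical byte-level entropy before and after zlib compression:

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte (8.0 = maximally random)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Ordinary English-like text carries lots of statistical structure.
text = " ".join(f"sample line {i} of ordinary English text"
                for i in range(500)).encode()
packed = zlib.compress(text, 9)

print(f"plain:      {byte_entropy(text):.2f} bits/byte")
print(f"compressed: {byte_entropy(packed):.2f} bits/byte")
```

The compressed stream's entropy sits much nearer the 8-bit ceiling: to a naive observer it "appears entirely random," exactly as an ideal code should.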

In both Shannon's case and yours, the alphabet matters. But Shannon's
information measure has the same form as thermodynamic entropy. In fact, trying
to relate consciousness to Shannon information is hard because thinking is not
actually driven much by Shannon information. Meaning is somewhere else. A speck
of dust that lands on my computer screen lands at a precise point, and that
meaningless event carries a lot of information (many bits). Indeed, that's why
display adapters are so big--the Shannon information processing load. Yet
things carrying very little information may have great impacts on our thoughts,
such as whether a medical test came back negative or positive (one bit if the
two outcomes are equally likely).
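The parenthetical follows directly from the definition of self-information, I(x) = -log2 p(x): an outcome with probability 1/2 carries exactly one bit, while rarer outcomes carry more. A quick check (the probabilities here are illustrative assumptions, not from the post):

```python
import math

def self_information(p: float) -> float:
    """Bits conveyed by observing an event that has probability p."""
    return -math.log2(p)

print(self_information(0.5))   # equally likely test result -> 1.0 bit
print(self_information(0.01))  # a rare positive -> ~6.64 bits
```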

>So, there may still be something to the idea that consciousness is
>related to algorithmic complexity, and therefore unpredictability. No
>wonder it remains intractable to definition.
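Part of why algorithmic (Kolmogorov) complexity resists definition in practice is that it is uncomputable; a standard computable stand-in, which I sketch here as my own illustration rather than anything from Matt's post, is the length of a compressed encoding, which upper-bounds the true complexity:

```python
import os
import zlib

def complexity_proxy(data: bytes) -> int:
    """Computable upper bound on algorithmic complexity:
    the length of a zlib-compressed encoding of the data."""
    return len(zlib.compress(data, 9))

ordered = b"ab" * 4096        # highly regular: admits a short description
random_ish = os.urandom(8192)  # incompressible: description ~ its own length

print(complexity_proxy(ordered))     # small
print(complexity_proxy(random_ish))  # close to 8192
```

The regular string collapses to a few dozen bytes while the random one barely shrinks at all, which is the sense in which unpredictability and complexity coincide.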

There is a great book, though quite old now, by Wendell Garner, who was a
professor at Johns Hopkins in the 60s, called "Uncertainty and Structure as
Psychological Concepts." He explores how far uncertainty, complexity, and
information can take us in understanding a wide range of psychological
phenomena and mysteries.

>Free Will, anyone?

The quantum dice?
Jim Chinnis  Warrenton, Virginia, USA  jchinnis at

More information about the Neur-sci mailing list