Toward a Science of Consciousness 1998
patrick at gryphon.psych.ox.ac.uk
Mon Apr 27 08:40:35 EST 1998
In article <6i218h$s8c$1 at news.fsu.edu> jac at ibms48.scri.fsu.edu (Jim Carr) writes:
>jac at ibms48.scri.fsu.edu (Jim Carr) writes:
>| tonmaas at xs4all.nl (Ton Maas) writes:
>| } According to neurophysiologists Varela & Maturana consciousness is
>| } restricted to autopoietic systems - which by definition produce their own
>| } organization by an evolutionary process not unlike "tinkering". Seems like
>| } the conscious computer will have to invent itself from scrap in order to
>| } ever attain consciousness :-)
>| patrick at gryphon.psych.ox.ac.uk (Patrick Juola) writes:
>| >Unfortunately, this definition is immediately and trivially incorrect.
>| >Individual humans do not evolve; evolution is a process restricted to
>| >populations. An individual human (which I assume is conscious) is
>| >largely a copy of prior humans -- and so a sufficiently detailed copy
>| >of a human organism should also be conscious, by a similar process.
>| Largely, but part of that "copy" is a brain that is a work in progress
>| whose structure is not wholly dictated by genetics. Although not
>| evolution in the strict sense of evolutionary biology, the brain
>| does change and adapt. A 'sufficiently detailed copy' would have to
>| include those rules that allow the 'tinkering' the previous poster
>| noted was important.
>patrick at gryphon.psych.ox.ac.uk (Patrick Juola) writes:
>>But that's part of the "fine enough detail" described above.
> It was not clear to me if you were agreeing or disagreeing with
> the previous thoughts. It seemed like you were doing both.
Oh, I'm disagreeing, vehemently. We *have* evolving computers, we
*have* computers (or at least multi-cpu networks) with greater storage
capacity and comparable complexity to the human brain, and that's
not produced anything that appears in any way conscious. The
argument that "evolving systems," "tinker-able systems," or any
such are required is just another attempt to insert a
ghost-in-the-machine through a process that the writer doesn't
understand, even though many other scientists may understand it
quite well.
There's no particular difference between "an evolved system" and
"a copy of an evolved system" -- so the idea that a machine will
need to invent itself in order to become conscious is, bluntly,
beyond ridicule, the more so as humans are presumed conscious and
*don't* invent themselves. And if the key is simply structural
adaptability, then there's nothing particularly new or interesting
in the idea, especially as it's been tried and found wanting.
One might as well suggest that energy dissipation is the key to
consciousness. Certainly every conscious system that we are aware
of does so. However, designing computers to make sure that they
use power doesn't seem to make them any more intelligent.