death of the mind.
wwolfkir at sympatico.ca
Tue Jul 27 19:34:32 EST 2004
Allan C Cybulskie wrote:
> Conscious thinking is a very large and critical part of the process called
> "intelligence" and, in fact, "awareness". Which is what the AI person, the
> philosopher of mind, and the psychologist are, in fact, extremely interested
I've snipped the rest of your comment because IMO this one is the one
that demonstrates most clearly the difference between your p.o.v. and mine.
I see no evidence whatsoever that conscious thinking is a critical part
of the process we call intelligence. If you do, you are limiting
"thinking" to "reasoning". I don't, for the same reasons that I don't
limit "communication" to "language". You also appear to assume that
reasoning happens before insight or understanding, but the evidence IMO
shows that it's the other way round: first we glimpse the truth, then we
find reasons for its being true.
Secondly, you are conflating "intelligence" and "awareness" via that
dangerous phrase "and, in fact," which you place between them.
(I'm guilty of taking risks with that phrase, too :-)) I think that's a
major mistake. For one thing, "awareness" comes in many kinds - a cat is
aware of its mirror image, but it's not self-aware as humans are, who
relate the mirror-image to "me." (This happens around 18-24 months;
chimps display the same behaviour, so presumably they are self-aware
also.) Moreover, I don't think we can (as yet) affirm that mammalian (or
vertebrate?) awareness is the only kind, since a bait-worm's attempts to
escape the hook impaling it look like awareness of pain to some people.
Furthermore, a system doesn't need to give signs of consciousness to
demonstrate intelligent behaviour. "Intelligent systems" are intelligent
- that is, given certain inputs, they produce the same sorts of outputs
as an intelligent human would (sometimes more intelligent than most
humans would...) If you want "creativity" as well as "problem solving"
as an aspect of intelligence, I can specify in general terms what a
"creative program" should look like, if, for example, you want it to
write poetry. (Verse would be harder, I think.) And in any case, before
one imputes "intelligence" to any system, human, animal, or electronic,
one had better rule out the Clever Hans explanation.
IOW, I judge you to be intelligent, aware, self-aware, creative, etc
based on your behaviour. A machine that exhibits the same sort of
behaviour would have to be judged intelligent, aware, self-aware,
creative, etc, on the same grounds.
BTW, "private behaviour" is the behaviour that's accessible to me by
introspection. When I report on it, I engage in public behaviour. There
is no guarantee whatsoever that my report will be truthful, accurate,
or complete. (The witness's affirmation to offer "the truth, the whole
truth, and nothing but the truth" expresses at best a pious hope, at
worst a more or less deliberate fraud.)
The reason that my reports of my private behavior are unreliable is that
introspection is itself a behavior, and one over which I have little, if
any, conscious control. If you "decide" to remember, say, what you had
for breakfast this morning, _something in your environment_ triggered
that "decision to remember." It's as likely that you were not aware of the
trigger as that you noticed it -- more likely, IMO. And I bet that, as
you read about remembering what you had for breakfast, you started to
remember it.
It's interesting that folk-psychology recognises that "thought" is
largely uncontrollable - there are terms like "reverie", for example,
and phrases like "It occurred to me...", etc. The fact that these IMO
reasonably accurate insights into "mind" are inconsistent with other
folk-psychological notions, such as "I can think what I want", should
not surprise us.