In article <8t8ke2$rve$1 at nnrp1.deja.com>,
mentifex at scn.org wrote:
> In article <8t73rm$jin$1 at nnrp1.deja.com>,
> simonh_hibbs at my-deja.com wrote:
> > I've been thinking about how we might construct an artificial
> > intelligence, and the architecture of a mind in general, and
> > the following question occurred to me.
> Now that people are actively coding artificial minds based on
> neuroscientific principles, the dull imaginations of the SF
> writers are losing out to the Grand Challenge project of AI.
> http://www.geocities.com/mentifex/js-mind.html -- AI emerges.
> > Does intelligence require consciousness?
> http://www.geocities.com/mentifex/conscius.html is a PD AI
> treatment of consciousness as an emergent epiphenomenon.
I had a look through the web site. There were a lot of
words there, but I'm afraid I couldn't make sense of much
of it. It was all very stream-of-consciousness (funnily enough).
> IMHO intelligence does not so much *require* consciousness
> as inescapably give rise to, or *cause* consciousness as
> a natural by-product of knowledge about self and the world:
> "Cogito ergo sum" -- the immortal words of Renatius Cartesius.
> > Is it necessary that an artificial intelligence must be
> > self-aware, or rather have an experience of being which
> > is not a result of mere sensory evidence of its existence?
> > Any thoughts?
> It is hard to imagine any other self-awareness than the
> sensory one. One wonders what alternative you may suggest.
> Even the internal consciousness of a dream is sensory.
I'm talking about awareness of one's own mental processes
in action. I don't see how equating memory retrieval with
sensory stimulation is useful, especially in architectures
in which the mechanism of memory storage may be the same as
the mechanism for reasoning.
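To make that last point concrete: a Hopfield-style associative network is one well-known architecture (my choice of illustration, not anything from the cited site) in which the same weight matrix is both the memory store and the machinery that does the recall. Here is a minimal sketch -- the patterns, network size, and update loop are all arbitrary assumptions for the demo:

```python
import numpy as np

# Two orthogonal 8-unit patterns to memorise (arbitrary demo data).
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])

# "Storage": Hebbian outer-product rule writes both patterns into
# a single weight matrix W (self-connections zeroed).
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

# "Reasoning"/retrieval: start from a corrupted cue and let the
# network relax by repeatedly thresholding W @ state. Note that
# recall reads the very same W that storage wrote -- storage and
# retrieval are one mechanism, not two.
cue = patterns[0].copy()
cue[0] = -cue[0]            # flip one bit to corrupt the cue
state = cue
for _ in range(5):
    state = np.where(W @ state >= 0, 1, -1)

print(state.tolist())        # relaxes back to patterns[0]
```

In a setup like this there is no separate "memory lookup" step that could be experienced as a sensory event: recall and computation are the same relaxation process, which is why equating memory retrieval with sensory stimulation seems unhelpful here.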
Sent via Deja.com http://www.deja.com/
Before you buy.