
the homunculus

Dustin Voss agent at siu.edu
Tue Apr 13 18:10:20 EST 1999

In article <370e94ca.0 at ns2.wsg.net>, "Ray Scanlon" <rscanlon at wsg.net> wrote:

>The homunculus is the little man (green?) who sits in the middle of the
>head, watching a TV set, and punching buttons. We are too sophisticated to
>believe in such a little man, instead we have the soul (mind, intellect,
>self, that which stands behind the brain, that which is other). The soul
>(mind) selects from the data proffered by the brain, manipulates the data,
>reaches a decision, and forwards the result to the brain for execution.
>Do we really need such an entity?

The brain must have filters of some sort.  The filters are distributed,
but they are there.  Additionally, no one can dispute that we do make
decisions.  The decision might simply be one impulse that momentarily
gains ascendancy, but it is a decision nonetheless.
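One way to picture "one impulse that momentarily gains ascendancy" is a simple winner-take-all selection among competing signals. A toy sketch, not a claim about real neurons; all names and numbers here are invented:

```python
# Toy winner-take-all "decision": several impulses compete, and whichever
# has the highest momentary activation wins.  Purely illustrative.

def decide(impulses):
    """Return the impulse with the highest momentary activation."""
    return max(impulses, key=impulses.get)

impulses = {"reach for coffee": 0.4, "keep typing": 0.7, "check clock": 0.2}
print(decide(impulses))  # -> keep typing
```

The "decision" is nothing but the comparison itself; no extra deciding entity appears anywhere in the code.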

A homunculus is simply a convenient mental package to put that stuff in. 
Though I'll grant that it is misleading.

>It seems that many, who profess materialism but are actually dualists, need
>one. They need something or someone to do the "thinking". The notion of a
>brain composed of neurons, each doing its own little thing, being a
>structure that can remember, associate, think, and decide is unsettling.
>All those who ask, "Can the machine think?" are in need of a soul (mind) to
>be associated with the machine.

Those who ask "Can a machine think?" can only mean "Can it think well
enough?" given what we--and even they--know about cognition (I use the
term, but refuse to get into a definition discussion.  Y'all know what I
mean.)

>All those who claim that emotion is necessary for an advance in AI are
>invoking an homunculus.

Not necessarily.  In people, emotion directs our actions: "me hungry, me
eat"  "me lonely, me need people 'cause lonely is bad".  A machine that
thought would also require goals.  In a special-purpose thinking machine,
the goals could be inherent in its cognitive processes, but a
general-purpose thinker (an AI) would need some motivating system that can
be separated from specific goals.  Emotion would serve that purpose quite
well.

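The separation between a motivating system and the goals it drives can be sketched in a few lines. Everything below (the drive names, levels, and action table) is a made-up illustration of the idea, not a proposal for how to build it:

```python
# Sketch: generic emotion-like "drives" rank urgency without knowing
# anything about particular goals; a separate table maps the loudest
# drive to a concrete action.  All names and numbers are illustrative.

class Drive:
    def __init__(self, name, level):
        self.name = name
        self.level = level  # 0.0 (satisfied) .. 1.0 (urgent)

def most_urgent(drives):
    """Return the drive with the highest urgency level."""
    return max(drives, key=lambda d: d.level)

# The motivating system above is goal-agnostic; only this mapping is
# goal-specific, so the two can be swapped out independently.
actions = {"hunger": "seek food", "loneliness": "seek company"}

drives = [Drive("hunger", 0.8), Drive("loneliness", 0.3)]
print(actions[most_urgent(drives).name])  # -> seek food
```

The point is the decoupling: a general-purpose thinker could keep the same urgency machinery while the goal table changes.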
-- Agent
