
Giving a computer program free will

Arthur T. Murray uj797 at victoria.tc.ca
Mon Apr 26 17:37:00 EST 1999


Re: Giving a computer program free will.
Or: Moderate and be damned! (As in: Publish and be damned.)
Or: Anatomy of a Mentifex post.  (As told to comp.ai readers.)
Or: I know that AI is possible because I have solved it myself.
    Details forthcoming.  (As in: Matthew L. Ginsberg's post.)

R Jones (jonesrob at esumail.emporia.edu) wrote on Mon, 26 Apr 1999:

> I don't really believe in free will.  By that I mean that I
> reject the ideas of blame and credit.
                      ^^^^^ ^^^ ^^^^^^
Just like Friedrich Nietzsche:
         "Was wir tun, wird nie verstanden,
          sondern immer nur gelobt und getadelt."
which means:
         "What we do is never understood,
          but always only praised and blamed."

But that quote from Nietzsche is enough philosophy; let's get
into some serious AI and neuroscience theory of free will.

>                                       For my take on free will
> see comp.ai.philosophy 10 Dec. 1997. (also see Minsky's
> Society of Mind).  Still, I'd like to give my computer programs
> the limited kind of free will that I do believe in.  How would
> I do this?

First build the artificial mind for autonomous mobile robots at the
PDAI project:  http://www.geocities.com/Athens/Agora/7256/acm.html
[Attn: comp.ai voters!  A Mentifex post ALWAYS includes an AI link.]

>               When we humans exercise "free will" it seems that
> what we are doing is deciding between what's best for us in the
> short term (if, say, we live only another year) versus what's
> best for us in the long term (if, say, we live decades longer).

I'm not going to live forever, but I expect my AI progeny to live
for ever-increasing spans of time, eventually coterminous with U1
-- my just-now-invented neologism (a sign of aberration) for the
universe as we know it.  [Neologisms spice up an Internet post.]

> This could be automated as a system which negotiates temporal
> utility discounts in a utility-based agent.  Reducing the
> discount factor would be favoring things that make us happy if
> we live a short life whereas increasing the discount factor
> would be gambling on a long life.
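
In rough Python (my own throwaway sketch of that discount-factor
idea, with made-up names -- not anyone's actual agent code):

  # Toy temporal discounting: gamma near 0 favors the short term,
  # gamma near 1 gambles on the long life.
  def discounted_utility(rewards, gamma):
      """Sum projected rewards weighted by gamma**t."""
      return sum((gamma ** t) * r for t, r in enumerate(rewards))

  def choose_action(options, gamma):
      """Pick the option whose reward stream scores highest under gamma."""
      return max(options, key=lambda name: discounted_utility(options[name], gamma))

  options = {
      "short_term": [10, 1, 1, 1, 1],   # quick gratification
      "long_term":  [0, 0, 5, 10, 20],  # slow payoff
  }
  print(choose_action(options, gamma=0.5))   # -> short_term
  print(choose_action(options, gamma=0.95))  # -> long_term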

Gambling?!  Here in Comp.River.City?!  But to moderate comp.ai
would be an enormous gamble.  And why make David Kinny work so hard?

The comp.ai forum is the most important newsgroup on the Internet.
Here we mortals design our AI successor species as prophesied by
Vernor Vinge:  http://www.ugcs.caltech.edu/~phoenix/vinge/vinge-sing.html

If you moderate us, do we not self-silence our hundred blossoms?
If you blacklist us, do we not mute our voices rising in dissent?

Read my Lisp, Dr. McCarthy:  Mentifex will not post to a moderated
comp.ai newsgroup because his legitimate and on-topic posts never
make it through into sci.psychology.consciousness (moderated).

>    It might also be that free will in humans is composed of a
> number of processes of which this is just one.

On-topic Mentifex free will theory:  Our brain-mind declares many
far-flung instances of *verisimilitude*, or what we believe to be
true, as we make a nearly-free-will decision.  Any stopping to
consider more factors puts a brake on the will.  Any sudden insight
of a compelling factor tips our free-enough will into action
(a toy sketch follows the diagram):
   
  Hearing    Vision    Concepts Volition Emotion   Motor Output
 /iiiiiii\  /!i!i!i!\                             /YYYYYYYYYYYY\
| ||||||| || ||||||| |  B H   E                  | |||||||||||| |
| ||||||| || | ___ | |  + +   +                  | |||||||||||| |
| ||||||| ||  /   \  |  +----/ \                 | |S|||||||||| |
| ||||||| || (bears)-|--+ + (eat)                | |H|||||||||| |
| ||||||| ||  \___/  |  + +  \_/                 | |A|||||||||| |
| ||||||| ||         |  + +---+             __   | |K|||||||||| |
| || |||| ||         |  +------------------/  \  | |E|||R|||||| |
| ||e-----||---------|--------+    ____   (fear)-|--*|||U|||K|| |
| |a||||| ||   ___   |  + +   +   /    \---\__/  | |||||N|||I|| |
| |||t||| ||  /   \  |  + +---+  / de-  \--------|------*|||L|| |
| ||| ||| || (honey)-|----+   + (  ci-   )       | |||||||||L|| |
| ||||||| ||  \___/  |  +--------\ sion /--------|----------*|| |
| ||||||| ||         |  + +   +   \____/         | |||||||||||| |
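
In toy Python (again my own illustrative sketch, not PDAI code),
the "tipping" in the diagram above can be read as factor weights
accumulating toward an action threshold: pausing to consider more
factors raises the threshold (the brake), while one compelling
insight can push the running total over the line.

  def will_tips(factors, threshold=1.0, brake_per_pause=0.2):
      """factors: list of (weight, paused_to_consider) pairs."""
      total = 0.0
      for weight, paused_to_consider in factors:
          if paused_to_consider:
              threshold += brake_per_pause  # deliberation brakes the will
          total += weight
          if total >= threshold:
              return True  # a compelling factor tips the will into action
      return False

  # Two weak factors plus a pause for deliberation, then one compelling insight:
  print(will_tips([(0.3, False), (0.2, True), (0.9, False)]))  # -> True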


