
A Cyborg Bill of Rights

Rob and Stef rdunca19 at idt.net
Thu Jul 23 15:30:44 EST 1998

Peter Hesketh wrote:
> In article <35B79053.4B4E at idt.net>, Rob and Stef <rdunca19 at idt.net>
> writes
> >yup.  And your point is...?  If we reduce our humanity to having it on
> >par with creations of science, whats the point of existing? Serious.
> Seriously, humanity, like all life, is a creation of science.

Nice play on words. 8*) Humanity exists simply because it does not,
"NOT," exist.  It isn't a creation; it's (self-contemplation) a
byproduct and a necessary component of survival.

An AI will be nothing more than a machine (or whatever) which has been
intentionally created by man.  This is not to say that it won't have
true and very real cognisance. They most certainly will; it's a
guarantee.  But so what? Who cares?  It cannot have any emotion
whatsoever (outside of programs or other neat tricks), and that would
preclude it acting on its "own free will". AKA expensive
piece-o-junk. Emotions are in direct conflict with sentience.  So if
"it" doesn't *care*, why would we?  If it does *care*, then it is not
acting of its own free will and is being influenced by emotions or
emotion-like programming.  It is a catch-22.

>  I see no
> difference except in degree, of the consciousness of a machine or a
> person.

To be honest I think our degree of "self awareness" will be exceeded by
machines.
>  As a member of the top tribe on this planet (homo sapiens) I
> would want to protect people against extermination by other concious
> beings, and therefore would restrict the rights of non-biological
> entities, but only from selfish self-preservation reasons, not because I
> felt superior.
> --
> Regards, Peter Hesketh  Monmouthshire UK
> Twenty reasons why chocolate is better than sex: number 13
> "You don't get hairs in your mouth with chocolate."

A complex interplay of emotion and self-awareness is going on in your
mind.  Emotion removes from you your ability to behave as you want.  If
I asked a girl to walk down the street naked, she will say she doesn't
"want" to, therefore she won't.  In reality, she has no self-determined
free will.  She excuses her inability to walk down the street naked
with "not wanting to".  Emotion and societal conditioning have overcome
her free will and sentience. At other times she shows free will and
sentience (such as when contemplating the effects of her behavior on
other beings).

Why does and will she have more rights than an AI?  Because she can
feel, she can have emotions, she can cry. This is what would remove
sentience and free will from a man-made construct.  It would require
"programming" that would essentially remove "free will" from any
machine (or whatever).


More information about the Neur-sci mailing list
