Technological Singularity

James Sharman james at exaflop.demon.co.uk
Mon Jun 8 07:42:08 EST 1998


>There's a good parable -- I think it comes from Sir Peter Medawar, but
>I can't recall exactly -- about what proto-primates would have come
>up with if asked to design their "super-primate."  He suggested that
>they would have been interested in making it stronger, more agile,
>gifted with better teeth, and so forth.  The idea of making it weak,
>clumsy, hairless, but capable of language would probably not even have
>crossed their minds.  I see a similar problem with your projections;
>knowing that the "mind" is the mark of man -- which we've known since
>Aristotle -- it's easy enough to project that a superman must have a
>supermind (and of course, super-strength, super-stamina, X-ray vision,
>the ability to fly, and a Kryptonite allergy).  But that doesn't mean
>*either* that anything with a supermind will eventually become a superman
>(and replace us) -- evolution happens in baby steps, nor does it even
>mean that the evolutionary replacement for H. sap. sap. will be our
>projected superman.  Perhaps our evolutionary replacement will be
>the subcaste of humans too dumb to be able to figure out how all our
>lethal toys like phasers and warp drives work, and therefore will be
>able to survive the coming war....


Your whole argument is of course excellent. My argument has largely been based
on the principle that the superman is by definition more apt at survival than
the ordinary man; otherwise, to me, he is just some guy who appears superior
but actually isn't.

Moving back to the technological singularity problem, we can apply a bit of
cold hard logic to it.  We could well sit here on this newsgroup discussing
various potential examples of the technological singularity while one we
haven't considered creeps up and catches us unawares.  In fact, if such an
event occurs that results in the eventual destruction or replacement of the
human race, I expect one of the primary causes to be that we didn't see it
coming.

Many of the first researchers into X-rays and radiation died of cancer
because they didn't stop to consider that these unseen rays might actually
be damaging them in some way.  This is, in some ways, a small-scale localised
technological singularity.

I remember a conversation I had quite recently with a colleague.  He said
that he saw the potential in particle accelerators to destroy the planet;
the argument was based on the principle that some of the time they are just
splitting odd particles to see what happens, and one day they may split
something that releases incredible energy.  My first reaction was that this
was incredibly unlikely, but on reflection it is perhaps more likely than you
would think.  The true issue at this end of the debate is that people like
to mess with things they don't truly understand.  However, this line of
thought is also dangerous: the last thing I would want is to stop progress
for fear of the unknown, but a little caution is never a bad thing.

However, despite the arguments against it, there is a fundamental proof that
a technological singularity is possible: mankind now has it in its power (in
the form of nuclear weapons) to destroy all human life on the planet.  There
are also other emerging technologies that may give rise to a similar ability,
so it becomes increasingly difficult to argue that it is impossible that
we may destroy ourselves by accident one day.

J,









PS: please note that in this thread I have been using the term 'man' a lot;
this is not intended to be sexist in any way.
