Technological Singularity

James Sharman james at exaflop.demon.co.uk
Mon Jun 8 04:45:04 EST 1998

John Piccirillo wrote in message <357857B7.5244 at sigmatechinc.com>...
>Bill Moyer wrote:
>>   Arthur is referring to what Vernor Vinge dubbed the "Technological
>> Singularity", which can be generally described as the point in time
>> at which a technological innovation either renders mankind incapable
>of controlling their environment, or at least exerts an irresistible
>> force on humanity.
>    Looks like a little fear mongering is at work here.  Doesn't this
>sound very similar to the "Future Shock" schlock that came out several
>years ago?  Personally, my view is that there will always be Luddites.
>As one who grew up post WWII with the threat of nuclear annihilation and
>civil defense drills in primary school, my view is the future may be
>even worse than you imagine but not in the way you imagine --> full
>steam ahead.

To a degree,  but I would not compare myself and the others to Luddites. We
are not morons smashing machines out of a lack of understanding.  I myself
am simply saying that as a technology becomes more and more powerful we
should put a little thought into how we apply it.  I for one am all for
technology,  I can't get enough of it.  One of the problems with much of
this is that technology has traditionally been fuelled by the military, and
so the destructive uses of much of it are being explored on purpose.  For
example,  there is a reasonable chance that the first truly intelligent
A.I.s will be created for the military and only then will the technology be
applied to the domestic market.  So what I'm asking is that once they have
created that military A.I., they spend five minutes taking the
killer instinct out of it before they put it in my microwave.


More information about the Neur-sci mailing list
