>> * The grey goo scenario -- microtechnology run amok, microscopic
>> von Neumann machines pulling everything apart to make more von
>> Neumann machines until the entire Earth's surface is converted,
>>Should be pretty simple to create machines that have an enormous
>appetite for microscopic VN machines (and nothing else).
The problem of nanotech run amok is a very interesting one. Remember that
self-replicating nanotechnology grows exponentially for as long as resources
are available, so even if the 'hunter killer' machines you describe as a
solution were themselves self-replicating nanomachines, they would likely
never catch up with the out-of-control replicators destroying the world.
Given the exponential nature of self-replicating growth, it might also take
only a matter of days before the Earth was consumed, which is hardly enough
time for a well thought out reaction.
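The catch-up problem can be illustrated with a toy doubling model in Python
(all numbers here are invented for illustration): if both populations double
at the same rate, the rogue replicators' head start becomes a constant
multiplicative lead that the hunter-killers can never close.

```python
# Toy doubling model (all numbers invented): a rogue replicator seeded at
# t = 0 versus hunter-killers released later with a large seed stock.

def population(initial, steps):
    """Population after `steps` doublings, assuming unlimited resources."""
    return initial * 2 ** steps

ROGUE_SEED = 1           # one rogue replicator at t = 0
HK_SEED = 1_000_000      # hunter-killers deployed with a million-unit stock
RELEASE_DELAY = 30       # time steps before the response is deployed

for t in range(RELEASE_DELAY, RELEASE_DELAY + 5):
    rogue = population(ROGUE_SEED, t)
    hk = population(HK_SEED, t - RELEASE_DELAY)
    print(t, rogue, hk, rogue / hk)   # the ratio stays constant
```

With equal doubling times the ratio rogue/hk never shrinks; the hunters
only win if they replicate strictly faster, or are seeded before the rogue
population's lead becomes astronomically large.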
>> * The outbreak scenario -- genetically engineered bacteria or
>> virii getting loose into the environment and killing everyone,
>>This assumes that a genetically engineered bacterium/virus is far
>superior to the worst-case naturally evolved bacteria/viruses.
>Constructing "killer" bacteria/viruses has several tradeoffs: if it kills
>too fast, it doesn't spread well (just leaving small dead spots); if it
>kills slowly, it must hide for a long time, which makes it difficult to
>move to new victims (see AIDS).
Alternatively, the altered virus could masquerade as a particularly virulent
form of the common cold and spread across the globe without anyone paying
attention to it; then, after a preset time (or number of generations), it
would become the fast killer.
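The 'sleeper' strategy can be sketched as a toy simulation (the reproduction
number and trigger generation are invented figures): the pathogen looks mild
while it multiplies, and by the time the kill switch fires the newly infected
population is already enormous.

```python
# Toy model (all figures invented) of a pathogen that spreads like a mild
# cold for a fixed number of transmission generations, then turns lethal.

def infected_by_generation(r0, trigger_gen):
    """New infections per generation; the kill switch fires at trigger_gen."""
    history = []
    new_cases = 1
    for gen in range(trigger_gen + 1):
        lethal = gen >= trigger_gen
        history.append((gen, new_cases, lethal))
        new_cases *= r0          # each case infects r0 others while 'mild'
    return history

for gen, cases, lethal in infected_by_generation(r0=3, trigger_gen=10):
    print(gen, cases, "lethal" if lethal else "mild")
```

With an R0 of 3 and a trigger at generation 10, tens of thousands of new
cases appear in the generation the switch fires, long after containment was
plausible.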
>>> * The Homo Superior scenario -- creating genetically engineered
>> human beings with superior abilities against whom "normal" human
>> kind cannot compete,
>>We already have quite different abilities. I see no reason why a
>superior human would kill "inferior" humans.
It is not necessary for the superior to kill the inferior. The basic
premise of natural selection tells us that the superior will eventually
replace the inferior. Even an incredibly small increase in the probability
of survival of the superior humans will (given enough generations)
eventually result in the replacement of 'homo inferior'.
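This replacement argument can be illustrated with a standard discrete
selection model (the 1% survival edge and the starting fraction are invented
figures): even a tiny fitness advantage compounds over generations until the
'superior' type dominates the population.

```python
# Toy model of gradual replacement under a small survival advantage
# (the selection coefficient and starting fraction are invented figures).

def superior_fraction(s, generations, start=0.001):
    """Fraction of the 'superior' type after `generations` rounds of
    selection, where its relative fitness is (1 + s)."""
    p = start
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
    return p

# Even a 1% survival edge drives near-total replacement within a few
# thousand generations, starting from one person in a thousand.
for gens in (0, 100, 1000, 2000):
    print(gens, superior_fraction(0.01, gens))
```

No killing is required in this model; differential survival alone is enough,
which is the point of the natural-selection argument above.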
>>> * The Borg scenario -- mating human beings with cybernetic systems
>> creating an elite class of humanity against whom "normal" human
>> kind cannot compete (note -- it can be argued that this has
>> already happened to an extent; a college student without a home
>> computer is at a disadvantage when competing against a college
>> student with a home computer, and an engineer with access to a
>> well-equipped workstation can max out any intelligence test),
>>This isn't a threat. Humans who can read and write are superior to
>"normal" humans (who can't read and write - that's how they are born).
>However, many humans can "assimilate" (learn, get computers,
>computer implants). The Borg scenario started with the use of the
>first flintstone tools.
Generally, the 'Borg' scenario rests on the same evolutionary basis as the
super-human scenario; in the long term, however, I would question the
improved survivability of a mechanically enhanced human (a whole other
issue).
>>> * The Frankenstein scenario -- creating a superintelligent AI
>> entity whose cognitive capabilities are as beyond ours as ours
>> are beyond an animal's; this scenario usually assumes that the
>> AI is capable of self-direction.
>>If you look around the world, the Frankenstein scenario is already
>there. The superintelligent being is the human race itself, and it
>deliberately destroys the resources of most other beings, often so
>shortsighted, that it destroys its own resources.
Well, perhaps we are already involved in a long, slow 'Technological
Singularity'; however, similar arguments could be applied to the 'Gaia
theory' to suggest that the Earth is engaged in a long, slow process of
cleansing itself of humanity.