Technological Singularity

Bernd Paysan bernd.paysan at
Fri Jun 5 03:29:23 EST 1998

Bill Moyer wrote:
>   Some examples of scenarios which would constitute the Singularity:
>   * The grey goo scenario -- microtechnology run amok, microscopic
>     Von Neumann machines pulling everything apart to make more Von
>     Neumann machines until the entire Earth's surface is converted,

It should be pretty simple to create machines that have an enormous
appetite for microscopic Von Neumann machines (and nothing else).
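As an illustrative aside (not in the original post): the idea that
goo-eating machines could keep self-replicators in check is just
predator-prey dynamics. A toy Lotka-Volterra sketch, with all rates
invented for illustration:

```python
# Toy Lotka-Volterra model (illustration only; all parameters invented):
# "goo" self-replicates, "eaters" consume goo and die off without it.

def simulate(goo, eaters, steps=2000, dt=0.01):
    a, b, c, d = 1.0, 0.02, 1.0, 0.01   # goo growth, predation, eater death, conversion
    peak = goo
    for _ in range(steps):
        dg = (a - b * eaters) * goo * dt      # goo replicates, is eaten
        de = (d * goo - c) * eaters * dt      # eaters grow by eating goo
        goo, eaters = goo + dg, eaters + de
        peak = max(peak, goo)
    return peak

unchecked = simulate(goo=100.0, eaters=0.0)   # no predators: exponential growth
checked = simulate(goo=100.0, eaters=10.0)    # predator machines present

# With predators the goo population oscillates but stays bounded,
# instead of growing without limit.
assert checked < unchecked
```

In this toy run the unchecked goo explodes exponentially while the
checked population merely oscillates -- which is the whole point of
building the eaters.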

>   * The outbreak scenario -- genetically engineered bacteria or
>     viruses getting loose into the environment and killing everyone,

This assumes that a genetically engineered bacterium or virus would be
far superior to the worst-case naturally evolved ones. Constructing
"killer" bacteria or viruses involves several tradeoffs: if it kills
too fast, it doesn't spread well (it just leaves small dead spots); if
it kills slowly, it must hide for a long time, which makes it difficult
to move to new victims (see AIDS).
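That tradeoff can be made concrete with a toy SIR-style model (my
sketch, not from the post; all parameters are made up): a pathogen that
removes its hosts quickly infects far fewer people in total than one
that lets hosts stay infectious longer.

```python
# Toy discrete-time SIR-with-removal model (illustration only).
# "kill_rate" is the per-day rate at which infected hosts stop
# transmitting (death or recovery); "beta" is the transmission rate.

def epidemic(beta, kill_rate, days=200, n=10_000, i0=10):
    """Return the total number of hosts ever infected."""
    s, i = n - i0, float(i0)
    total_infected = float(i0)
    for _ in range(days):
        new_infections = beta * s * i / n   # susceptibles meeting infectious hosts
        removed = kill_rate * i             # hosts removed before spreading further
        s -= new_infections
        i += new_infections - removed
        total_infected += new_infections
    return total_infected

fast_killer = epidemic(beta=0.3, kill_rate=0.9)  # kills hosts almost at once
slow_killer = epidemic(beta=0.3, kill_rate=0.1)  # hosts infectious much longer

# The fast killer burns out: it removes its carriers faster than
# they can infect others, leaving only a small dead spot.
assert fast_killer < slow_killer
```

With identical transmissibility, the fast killer's outbreak fizzles
after a handful of cases, while the slow killer sweeps through most of
the population -- the "small dead spots" versus "hide for a long time"
tradeoff in numbers.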

>   * The Homo Superior scenario -- creating genetically engineered
>     human beings with superior abilities against whom "normal" human
>     kind cannot compete,

We already have quite different abilities. I see no reason why a
superior human would kill "inferior" humans.

>   * The Borg scenario -- mating human beings with cybernetic systems
>     creating an elite class of humanity against whom "normal" human
>     kind cannot compete (note -- it can be argued that this has
>     already happened to an extent; a college student without a home
>     computer is at a disadvantage when competing against a college
>     student with a home computer, and an engineer with access to a
>     well-equipped workstation can max out any intelligence test),

This isn't a threat. Humans who can read and write are superior to
"normal" humans (who can't read and write - that's how everyone is
born). However, many humans can "assimilate" (learn, get computers,
computer implants). The Borg scenario started with the use of the
first flint tools.

>   * The Frankenstein scenario -- creating a superintelligent AI
>     entity whose cognitive capabilities are as beyond ours as ours
>     are beyond an animal's; this scenario usually assumes that the
>     AI is capable of self-direction.

If you look around the world, the Frankenstein scenario is already
here. The superintelligent being is the human race itself, and it
deliberately destroys the resources of most other beings, often so
shortsightedly that it destroys its own resources as well.

Bernd Paysan
"Late answers are wrong answers!"

More information about the Neur-sci mailing list