Towards Massively Parallel Supercomputer AI

Mentifex mentifex at
Mon Sep 4 11:13:24 EST 2000

The mind-grid diagram below is now on-line:

  /^^^^^^^^^^^\ Mind-grid Arrays{ } in Robot PDAI /^^^^^^^^^^^\
 /visual memory\                   _________     /  auditory   \
|      /--------|---------\       / ENGLISH \   |   memory      |
|      |  recog-|nition   |       \_________/---|-------------\ |
|   ___|___     |         | flush-vector|       |   ________  | |
|  /image  \    |     ____V_        ____V__     |  /        \ | |
| / percept \   |    /psi{ }\------/ en{ } \----|-/ ear{ }   \| |
| \ engrams /---|---/concepts\----/ lexicon \---|-\ phonemes /  |
|  \_______/    |   \________/    \_________/   |  \________/   |

The diagram above shows each brain-mind information path as if a
single fiber were carrying the information, whereas in theory the
information must flow in massive parallelism along many such fibers.

Because a front of consciousness advances down the time-dimension
of the brain, filling in the available memory locations as provided
genetically in the tabula-rasa mindgrid, learning can occur as
massively multiple association-tags gradually shift their origins
or their destinations.  That is to say, to learn a new idea is to
revisit old ideas and slightly or drastically modify them
(by changing the very associations that make up the idea).
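The idea above -- that to learn is to revisit old ideas and shift the association-tags that constitute them -- can be sketched in code. This is a minimal, hypothetical illustration; the names Tag, Concept, and relearn are assumptions for exposition and are not taken from the actual Mind.Forth or Mind.Java sources.

```python
class Tag:
    """An association-tag linking an origin concept to a destination."""
    def __init__(self, origin, destination):
        self.origin = origin
        self.destination = destination

class Concept:
    def __init__(self, name):
        self.name = name
        self.tags = []  # the very associations that make up the idea

    def associate(self, other):
        self.tags.append(Tag(self, other))

    def relearn(self, old_dest, new_dest):
        """Learning: revisit old tags and shift their destinations."""
        for tag in self.tags:
            if tag.destination is old_dest:
                tag.destination = new_dest

# An early concept "dog" first points at any four-legged animal...
dog = Concept("dog")
animal = Concept("four-legged-animal")
canine = Concept("canine")
dog.associate(animal)
# ...and learning shifts that association to a narrower concept.
dog.relearn(animal, canine)
print(dog.tags[0].destination.name)  # -> canine
```

Note that no tag is deleted and re-created; the existing association is modified in place, matching the claim that learning slightly or drastically modifies old ideas rather than writing new ones from scratch.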

Because an associative-tag fiber shown in the above diagram
represents massively parallel fibers, all carrying essentially
the same information -- such as the recognition of a percept now being
associated with a concept -- these fibers by their sheer
redundancy produce extreme reliability in a brain-mind.
Early versions of AI are not yet massively parallel but are
nevertheless quite reliable because there are no brain cells
dying out at haphazard times or in haphazard locations.
Therefore, whereas a wetware brain-mind actually needs massive
parallelism or it may not function reliably, an AI software
early in this Y2K millennium, not yet performing mission-
critical earth-stewardship duties, may fake the parallelism.
It may, for instance, terminate a search after only one result
in cases where a wetware consciousness would inescapably gather
massively multiple search-results and would therefore blithely
continue associating along a never-less-than-MPP pathway
(where "MPP" is the common acronym for "massively parallel processing").
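The contrast between faked and genuine parallelism can be sketched as two search routines over the same associative memory. The function names and the tuple representation of engrams are illustrative assumptions, not part of any Mentifex release.

```python
# A toy associative memory of (concept, association) engram pairs.
memory = [
    ("bird", "flies"), ("bird", "sings"),
    ("fish", "swims"), ("bird", "nests"),
]

def serial_search(cue):
    """Early, non-MPP AI: terminate the search after only one result."""
    for concept, assoc in memory:
        if concept == cue:
            return [assoc]
    return []

def parallel_search(cue):
    """Wetware-style: inescapably gather massively multiple results."""
    return [assoc for concept, assoc in memory if concept == cue]

print(serial_search("bird"))    # -> ['flies']
print(parallel_search("bird"))  # -> ['flies', 'sings', 'nests']
```

Both routines are correct for a single lookup; the difference only matters downstream, where the serial mind has one association to continue along and the parallel mind has many.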

For instructional purposes, each massively parallel channel can
be shown to flow not only like a single fiber across the mindgrid
but, more dramatically, like a sparsely instantiated ribbon cable
which connects orthogonally (L-wise or F-wise) with other neuronal
ribbon-cables.  As long as each ribbon-cable connects with another
ribbon-cable, massive parallelism is preserved throughout the AI.
Any breakdown in the massive parallelism would indicate a serious
flaw in the basic design for artificial intelligence, so the
"ribbon-cable test" is valuable both to create and to teach the AI.
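The "ribbon-cable test" amounts to a connectivity check: parallelism is preserved only if every channel joins at least one other channel. The following sketch assumes a simple adjacency-map representation of the mindgrid channels, which is an expository choice rather than the actual design.

```python
def ribbon_cable_test(connections):
    """connections maps each channel to the channels it joins.
    Any channel joined to nothing breaks the massive parallelism
    and indicates a serious flaw in the basic design."""
    return all(len(joined) > 0 for joined in connections.values())

# Channels loosely modeled on the diagram: visual memory, psi
# concepts, the English lexicon, and auditory phonemes.
mindgrid = {
    "visual": ["psi"],
    "psi":    ["visual", "en"],
    "en":     ["psi", "ear"],
    "ear":    ["en"],
}
print(ribbon_cable_test(mindgrid))   # -> True

broken = dict(mindgrid, visual=[])   # disconnect one ribbon cable
print(ribbon_cable_test(broken))     # -> False
```

The test is deliberately crude: it checks only local attachment, not end-to-end reachability, which suffices for the teaching purpose described above.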

-- : AI Theory : Mind.Forth : Visual Basic Mind.VB : Mind.Java (Jala)


More information about the Neur-sci mailing list