Rickert on embedded computation (was re: science of consciousness.)

Oliver Sparrow ohgs at chatham.demon.co.uk
Thu Apr 30 09:32:11 EST 1998

Let me bore on, disnetiquettishly following up my own post.

Is the computability of a structure the same thing as saying that this
structure is open to logical expression? The terms tend to be used
interchangeably. Secondly, is the capacity to express the
inter-relationships within a system in either of these terms the same thing
as a full description of that system? I suggest that the answers are
"sometimes" and "no". If this is the case, then symbolic AI is a non-runner
as a candidate to reproduce awareness, but the potential for the emulation
of natural intelligence within an engineered framework is in no way
vitiated by this. As most attacks on the potential for AI are based on
discussions of computability underpinned by (and therefore reducible to)
logical operations, this form of escape would invalidate all of them.

A system - queuing at the checkout counter - may be exactly simulable with
respect to a standard interface (as with my Niagara viewer, earlier.) That
is, the system is computable, to this limited degree: I can photocopy it,
sort of. When I do this, then I (programmer-as-deity) set about creating
effects which I know about or have imagined. I create a set of structured
relationships, at the end of each train of which lies an assumption which
is generated in ways which do not reside within the software. Far from this
being the system-rendered-computable, all of the difficult bits are granted
ex ante by my godlike overview on the issues of queues. The simulation
takes what I know (enshrined in mathematics or just plunked in as
happenstance) and does stuff, which ends up with other stuff happening on a
screen which (once again) is interpreted off site. This is hardly
"computable": to make the issue truly computable, all that resides in
my head about queue use (and all that resides in the head of whoever looks
at the output) would need to be embedded in the code ex ante. Tall order.

Presume a giant simulation of a society in all of its aspects (including
inhibitions which invisibly prevent over-much self-scrutiny by the aware
inhabitants). Such a simulation might evolve the social ritual of the
queue in the same way that real societies achieve the same end. The
grounds for the simulation of the queue would then be embedded in the
software: not in what is coded, but in what the system created for itself.
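A toy sketch of the contrast (all names and rules my own invention): no queue
is coded anywhere below. Agents follow one local rule, move a step toward the
till unless the space ahead is taken, and the line is what the system creates
for itself.

```python
def step(positions, till=0):
    """One tick: each agent edges toward the till if the next cell is free."""
    occupied = set(positions)
    new = []
    for p in sorted(positions):          # agents nearer the till move first
        target = p - 1
        if p > till and target not in occupied:
            occupied.discard(p)
            occupied.add(target)
            new.append(target)
        else:
            new.append(p)                # blocked by the agent ahead (or at till)
    return new

agents = [12, 7, 19, 4, 9]               # scattered shoppers, no order imposed
for _ in range(30):
    agents = step(agents)
print(sorted(agents))                    # prints [0, 1, 2, 3, 4]: a queue, uncoded
```

The word "queue" appears only in the comments; the packed line at the till is
an outcome of the local rule, not of anything the programmer named.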

Even so, the coding would have all sorts of assumptions built in, but about
societies, not queues. Even taller order. Very well, walk really tall.
Begin with primordial gas (n.b. assumptions) and let life, civilisation and
the whole damned thing evolve, D. Adams proprietor. The queue simulation
would then be (almost) completely self-validating, depending almost
entirely on what had already been established within the software (of which
the original code is little more than an OS and a remote start
configuration). Ask a question about queues and the system would have the
answer latent within it. (Mind you, those queueing might be purple
frog-things awaiting ceremonial group egg fertilisation, and the utility of
the exercise might be scant, but that's the deity business all over.)

Knowledge is expressed entirely in terms of other knowledge: some elemental
(Ow! Yum!) and some declarative/symbolic. It is probably true that most
elemental knowledge is somehow filtered and prioritised: certainly vision
and other forms of perception are so managed. If everything is made of its
embedding, its context, then this perhaps points to an architecture in
which hierarchies of elemental percepts create primitives; primitives
create a grammar in which entities are located and stored; structures are
generated by feed-forward from dynamically linked entities within dedicated
task managers; and priorities are set by top-down balancing agencies that
operate between and over the task managers. All of these bits would be made from
other bits, in different parts of the system. Nothing would be entirely
fundamental. How the relationships were instantiated - how processed, how
'handled' - would be irrelevant to these relationships, which would be the
primary architecture and effectively independent of the plumbing.
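A hypothetical sketch of that layering in Python, every name and rule invented
for illustration: each level is made only of the level below, and the wiring
between levels, not the "plumbing" of any particular function, carries the
architecture. Swap any callable for another implementation and the
relationships stand.

```python
from typing import Callable, Dict, List

def percepts(raw: List[float]) -> List[float]:
    """Elemental percepts: raw input, filtered and prioritised (here: a threshold)."""
    return [x for x in raw if x > 0.5]

def primitives(ps: List[float]) -> Dict[str, float]:
    """Hierarchies of percepts create primitives (here: crude summary features)."""
    return {"count": float(len(ps)), "peak": max(ps, default=0.0)}

class TaskManager:
    """A dedicated task manager: feeds a structure forward from linked entities."""
    def __init__(self, name: str, rule: Callable[[Dict[str, float]], float]):
        self.name, self.rule = name, rule

    def propose(self, entities: Dict[str, float]) -> float:
        return self.rule(entities)

def balance(proposals: Dict[str, float]) -> str:
    """Top-down balancing agency: sets priority between and over the managers."""
    return max(proposals, key=proposals.get)

# Nothing here is fundamental: each layer is made from the one below it.
managers = [TaskManager("forage", lambda e: e["count"]),
            TaskManager("flee",   lambda e: e["peak"])]
entities = primitives(percepts([0.2, 0.9, 0.7, 0.1]))
print(balance({m.name: m.propose(entities) for m in managers}))  # prints forage
```

How each stage is "handled", as list comprehension, lambda, or class, is
irrelevant to the relationships between stages, which is the point at issue.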


Oliver Sparrow
