You rang? Lo, from the depths of my heart I shall answer you.
Oliver Sparrow, ohgs at chatham.demon.co.uk, dipped his ink on 8 Feb 2K:
> The following is a description of an architecture that has unusual
> properties. I claim nothing, but would be interested in comments.
Oyez, oyez, oyez: the discussant is Dr Oliver Sparrow, Director of
the Chatham House Forum of the Royal Institute of International
Affairs, London, UK -- http://www.riaa.org/Che/chf/forum.html
> One of the most difficult concepts to instantiate in engineered
> structures is that of synopsis: 'seeing' the whole. There are two
> apparent issues: how is the synopsis to be presented (embedded in
> what?) and how is it to be somehow perceived. Both are hard problems.
> Both stem from traditional ways of looking at the issue: the theatre
> of the mind presents a matinee to a homunculus audience.
For a Chicago Great Books-style synopticon of what OH Graphics & Sound
(a la the Apple IIGS) is trying to say, we formulate a synopsis thusly.
Dr Sparrow seems to say that the structure of the mind first forms
itself sufficiently to take in percepts, and is then either warped or
at least further formed (informed?) by those very percepts, in a
growth-process whereby the emerging mind remains for a while
structurally unaltered, until dyspepsia (indigestion) of strange
new percepts forces a structural shift in the underpinnings of
the mind. (Gee, poor Dr Einstein -- his mind must have been
stretched and restructured to the bending and breaking point.)
Anyhow, Dr Sparrow seems to be formulating a mental perestroika.
> Let us, for the moment, keep the two poles of this description as
> temporarily useful, but put on ice the questions which they beg.
> Let us ask a question about one of these straw poles. What is it
> that happens when something is 'fused' into a representation?
Let us skip over some text available in the origin of the thread:
> So much, so scrutable; but suppose that is not what happens in nature?
> There is, we note, a new variable in the neighbourhood: strange.
Mntfx: Why wait? UPRH = Universal Philosophical Red Herring.
> Difficult, non-folk concepts. However, please put UPRHs
> (see footnote) back in their cupboards, at least for now.
> [...] The principal components in a learning system will
> change with circumstances, but also depend strongly on
> what has been learned. [...]
> Any one such N space can be fed by and contribute to others.
> One sees how a filtration hierarchy can be created, with
> successive abstraction occurring by layer. Further, feedback
> and feed forward can sharpen the nature of the space and the
> treatment of what is represented in it.
So the mind grows, structurally, by accretion: what you see
determines *how* you see.
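The quoted filtration hierarchy can be sketched in a few lines of
latter-day Python (not Mind.Forth); the pair-pooling abstraction, the
broadcast expectation, and the feedback gain are all our own
illustrative inventions, not anything Dr Sparrow specifies:

```python
# A minimal sketch of a layered "filtration hierarchy": each layer
# abstracts its input by pooling adjacent values, and top-down feedback
# nudges the lower layer toward what the layer above expects.
# The pooling rule and the gain of 0.5 are illustrative assumptions.

def abstract(signal):
    """Feed-forward step: pool adjacent pairs into one coarser value."""
    return [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]

def sharpen(signal, expectation, gain=0.5):
    """Feedback step: pull each value toward the layer-above expectation."""
    return [s + gain * (e - s) for s, e in zip(signal, expectation)]

percepts = [1.0, 3.0, 2.0, 4.0, 8.0, 6.0, 5.0, 7.0]
layer1 = abstract(percepts)   # [2.0, 3.0, 7.0, 6.0]
layer2 = abstract(layer1)     # [2.5, 6.5]

# Feedback: broadcast layer2's expectation down and sharpen layer1.
expectation = [v for v in layer2 for _ in range(2)]  # [2.5, 2.5, 6.5, 6.5]
layer1 = sharpen(layer1, expectation)
print(layer1)
```

Each feed-forward pass coarsens the percepts by one level of
abstraction; the feedback pass then draws the lower layer halfway
toward its parent's expectation, "sharpening the nature of the space."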
> [...] The drive is, therefore, to accommodate the current data to
> match the ways of handling data that have already been established,
> and the need to change those ways is resisted unless the data
> stubbornly refuse to bend. [...]
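As a hedged illustration of accommodate-before-restructure -- in the
spirit of an adaptive-resonance vigilance test, though Dr Sparrow
names no such mechanism -- the tolerance, learning rate, and
one-dimensional categories below are invented for the sketch:

```python
# A minimal sketch of "accommodate unless the data stubbornly refuse":
# each percept is bent into the nearest existing category; only when no
# category fits within a tolerance does the system restructure itself
# by adding a new category. Threshold and update rule are assumptions.

def perceive(categories, percept, tolerance=1.5, rate=0.25):
    if categories:
        nearest = min(categories, key=lambda c: abs(c - percept))
        if abs(nearest - percept) <= tolerance:
            # Accommodate: nudge the old structure, keep its shape.
            i = categories.index(nearest)
            categories[i] = nearest + rate * (percept - nearest)
            return categories
    # The datum stubbornly refuses to bend: restructure.
    categories.append(float(percept))
    return categories

cats = []
for p in [1.0, 1.2, 5.0, 5.4, 1.1]:
    perceive(cats, p)
print(cats)  # two categories emerge, one near 1, one near 5
```

Most percepts are bent to fit the established ways of handling data;
only the outlier at 5.0 forces the resisted structural change.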
> What is a quale, in phenomenal terms? Semantically, by the term
> we mean a unitary percept that does not admit to dissection by
> the conscious mind. Brown, hunger, the smell of treacle.
> [...] (If you have actually starved for a long period - as I
> can personally testify - it is remarkable how starkly
> this happens in daily life.) [...]
> Well, then, what have we here? The theatre is gone and with it,
> the homunculi. We have a hugely complex set of nested, self-
> referring, mapping, error minimizing structures closely coupled
> to ripple through necessary change and to bend to their ends data.
> We have folded into this a range of cross-linking systems that
> use the structure so created to initiate communications between
> remote columns in the hierarchies that have eventuated.
> Think of bridges across a Manhattan of the mind [....]
> Why does this 'feel like' awareness? I have no idea.
> I do not even know how to pose the question. If the
> architecture is even vaguely correct, however, then
> the best way to find out may be to build such a thing.
Commentators: Please do not comment on these excerpts alone,
but go back to the full original text as posted by Dr Sparrow.
Afterthought: Mind.Forth AI is at first a rigid structure into
which we must handcraft each extension, e.g., a particular
Chomskyan transformational-grammar syntax tree. Whether still in
Mind.Forth or in a derivative Mind.Java, etc., the AI will
eventually generalize its own structures and thus become capable
of learning new components of itself.
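To make the handcrafted rigid structure concrete, here is a toy
phrase-structure rule of the kind one might hardwire before the AI
learns to generalize; the lexicon and the single S -> NP VP rule are
our own invention, not actual Mind.Forth code:

```python
# A toy illustration of "handcrafting each extension": one rigid,
# hand-coded phrase-structure rule (S -> NP VP), of the kind a
# Mind.Forth-style program would carry as a fixed syntax tree.
# The lexicon and the rule are invented for illustration only.

LEXICON = {"robots": "NP", "dream": "VP", "cats": "NP", "sleep": "VP"}

def parse(sentence):
    """Accept a two-word sentence only if it matches the rigid S -> NP VP rule."""
    words = sentence.split()
    tags = [LEXICON.get(w) for w in words]
    if tags == ["NP", "VP"]:
        return ("S", ("NP", words[0]), ("VP", words[1]))
    return None

print(parse("robots dream"))  # ('S', ('NP', 'robots'), ('VP', 'dream'))
print(parse("dream robots"))  # None: the rigid rule cannot bend
```

Every new construction must be hand-coded into such a structure until
the AI can generalize the rule-making process itself.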