Arthur T. Murray
uj797 at victoria.tc.ca
Thu Jun 6 11:56:20 EST 2002
A pre-existing Artificial Mind needs a natural-language parser:
/^^^^^^^^^^^\ _____ /^^^^^^^^^^^\
/visual memory\ T T T /New- \ / audSTM() \
| _______ | | | | _____ (Concept)--|-------------\ |
| /image \ | | | | /Old- \ \_____/ | audRecog() | |
| / percept \---|-----+ | (Concept)----|-------|----------\ | |
| \ engram / | a|C|f| \_____/-----|-------|-------\ | | |
| \_______/ | b|O|i| |______| | c | | | |
| | s|N|b| /Parser()\ | a | | | |
| | t|C|e| \________/ | t | | | |
| | r|E|r| ________|______ | s-/ | | |
| | a|P|s| / \ | e | | |
| | c|T| | ( Instantiate() ) | a | | |
| _______ | t| | | \_______________/ | t-----/ | |
| /fresh \ | |_|_| / _____ _____ | f | |
| / image \ | / \/ / En \ / En \ | i | |
| \ engram /---|--\ Psi /-/ Nouns \--/ Verbs \ | s | |
| \_______/ | \___/ \_______/ \_______/ | h-------/ |
This paper is a reminder of things to think about while coding
http://www.scn.org/~mentifex/parser.html -- a Parser module
for the Robot AI Mind.
7.2 PRIOR ART
Although Web links on the parser.html Doc page may provide
information about traditional parsers, let there be no
prejudice in favor of the way it's always been done.
7.3 DESIGN CONSIDERATIONS
7.3.1 AUDITORY RECOGNITION
http://www.scn.org/~mentifex/audrecog.html Auditory Recognition
may retrieve information about the previous parsing of a lexical
item. There ought to be an override mechanism that lets a known
concept be used as a part of speech different from its most
recent prior usage.
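The override idea above can be sketched as follows. This is an illustrative assumption, not the actual audRecog() code; the LEXICON table and the recognize() function are invented names standing in for the Mind's auditory-memory lookup.

```python
# Hypothetical sketch: a lookup normally returns the part of speech
# recorded for a word's most recent prior usage, but a caller may
# override that default, and the override becomes the new prior usage.
LEXICON = {"work": "noun"}   # most recent prior usage of "work"

def recognize(word, override_pos=None):
    """Return the part of speech for a known word.

    override_pos lets a known concept be used as a part of speech
    different from its most recent prior usage.
    """
    if override_pos is not None:
        LEXICON[word] = override_pos   # record the new usage
        return override_pos
    return LEXICON.get(word)

print(recognize("work"))            # prior usage was noun
print(recognize("work", "verb"))    # override: treat "work" as a verb
print(recognize("work"))            # verb is now the most recent usage
```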
7.3.2 FLAGS AND TAGS
No limit should be set on the number of kinds of flags
which a quasi-fiber may display.
Defaults may be used as an aid in coding the Parser module.
For instance, if we assume that a typical sentence will
have a verb, we may have a default requirement that one
word or another must be declared as a verb -- although
another guideline may suggest that the initial word in
a sentence is typically not the verb.
By the doctrine of defaults, the whole Parser module may
be seen as a kind of "snare-net" or thicket of default-tests.
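The two default guidelines above (a sentence typically has a verb; the initial word is typically not the verb) can be sketched as a tiny "snare-net" of default-tests. The KNOWN dictionary and the tagging scheme here are invented for illustration; the real Parser module is not reproduced.

```python
# A minimal sketch of the "snare-net" or thicket of default-tests.
KNOWN = {"robots": "noun", "minds": "noun"}  # words with prior tags

def parse(words):
    tags = [KNOWN.get(w) for w in words]
    # Default-test 1: a typical sentence must contain a verb.
    if "verb" not in tags:
        # Default-test 2: the initial word is typically not the verb,
        # so declare the first untagged non-initial word to be the verb.
        for i in range(1, len(words)):
            if tags[i] is None:
                tags[i] = "verb"
                break
    return tags

print(parse(["robots", "need", "minds"]))  # "need" snared as the verb
```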
7.3.4 BRUTE FORCE TECHNIQUES
AI coders have the option of applying brute force in the
Parser module so as to make it more powerful, but a simple and
elegant solution is more desirable.
7.4 BOOTSTRAP PROVISIONS
http://www.scn.org/~mentifex/enboot.html English Bootstrap
may ease burdens on the Parser module by means of a "stare
decisis" reliance on previously decided parsing problems.
7.4.1.1 MOST FREQUENT WORDS
Since there are lists available for the most frequent words
of various natural languages, it makes sense, where possible
and where convenient, to favor the bootstrap-inclusion of
higher-frequency words over lower-frequency words.
Such a policy of bootstrapping higher-frequency words may
become obsolete or "moot" as the bootstrap approaches saturation
with essentially all the words comprising a full dictionary of
the target language.
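The frequency-favoring policy can be sketched as a simple ranked selection. The frequency counts below are invented; a real word-frequency list for the target language would supply them.

```python
# Sketch of favoring higher-frequency words for bootstrap inclusion.
FREQ = {"the": 1000000, "of": 500000, "and": 450000, "zymurgy": 3}

def bootstrap_selection(freq, capacity):
    """Pick the `capacity` highest-frequency words for the bootstrap."""
    ranked = sorted(freq, key=freq.get, reverse=True)
    return ranked[:capacity]

print(bootstrap_selection(FREQ, 3))  # rare "zymurgy" is left out
```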
7.4.1.2 "WORK" WORDS
Gradually all prepositions for a given language and all irregular
verb forms may be instantiated within a bootstrap module so that
the AI Parser module may easily recognize and parse such special words.
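Instantiating such "work" words might look like the sketch below. The word lists and the (part-of-speech, lemma) concept tuples are illustrative assumptions, not the Mind's actual Instantiate() code.

```python
# Sketch of pre-instantiating prepositions and irregular verb forms
# in a bootstrap so the Parser recognizes them directly.
PREPOSITIONS = ["in", "on", "under", "with"]
IRREGULAR_VERBS = {"went": "go", "saw": "see", "was": "be"}

def instantiate_bootstrap():
    concepts = {}
    for prep in PREPOSITIONS:
        concepts[prep] = ("preposition", prep)
    for form, lemma in IRREGULAR_VERBS.items():
        concepts[form] = ("verb", lemma)  # parse form back to its lemma
    return concepts

concepts = instantiate_bootstrap()
print(concepts["under"])   # recognized immediately as a preposition
print(concepts["went"])    # recognized as a form of the verb "go"
```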
7.4.2 SYNTAX STRUCTURE EXAMPLES
7.5 COORDINATION WITH OTHER MIND MODULES
As each new functionality is added, the Parser module
must support each linguistic structure available to the
http://www.scn.org/~mentifex/english.html thought module.
7.5.1 SUBJECT-VERB-OBJECT SVO SYNTAX
http://www.scn.org/~mentifex/svo.html describes SVO syntax.
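The simplest case the Parser must coordinate with is positional SVO role assignment for a three-word sentence, sketched below. The function name and return structure are invented for illustration, not taken from svo.html.

```python
# Minimal sketch of subject-verb-object (SVO) role assignment
# for a three-word English sentence.
def svo(words):
    """Assign SVO roles positionally to a three-word sentence."""
    subject, verb, obj = words
    return {"subject": subject, "verb": verb, "object": obj}

print(svo(["robots", "need", "minds"]))
```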
More information about the Neur-sci mailing list