Let's say that you are processing ASCII verbal input to a robot AI
and you need to make sure that all the parts of speech are assigned
to their proper nodes on a Chomskyan transformational grammar tree.
Would the following technique work? Instead of forcing the AI
software to "bet everything" on a single chosen assumption that the
incoming sentence follows a particular transformational pattern, we
could radically simplify the parsing process by letting ALL POSSIBLE
transformational structures try to absorb the sentence, and the
criterion for successful parsing would simply be that the incoming
sentence traverses an entire structure in one pass without aborting:
"If the shoe fits, wear it," so to speak.
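A minimal sketch of that "if the shoe fits" idea, assuming each
transformational structure is reduced to a flat sequence of
part-of-speech slots (the pattern names and tags here are my own
illustrative assumptions, not anything taken from Mind.forth):

```python
# Hypothetical sketch: every candidate structure tries to absorb the
# sentence; a parse succeeds only if the tag sequence traverses the
# entire pattern in one pass without aborting.

PATTERNS = {
    "declarative":   ["NOUN", "VERB", "NOUN"],
    "prepositional": ["NOUN", "VERB", "PREP", "DET", "NOUN"],
}

def traverses(tags, pattern):
    """True if the tag sequence fills every slot of the pattern."""
    return len(tags) == len(pattern) and all(
        t == slot for t, slot in zip(tags, pattern))

def parse_all(tags):
    """Let ALL patterns try the sentence; return the ones that fit."""
    return [name for name, pat in PATTERNS.items()
            if traverses(tags, pat)]

# "Robots process input" -> NOUN VERB NOUN
print(parse_all(["NOUN", "VERB", "NOUN"]))  # ['declarative']
```

Running every pattern instead of committing to one up front trades a
little extra matching work for never having to back out of a wrong
initial guess.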
If the incoming sentence pattern were to traverse two widely
different sentence structures successfully, AMBIGUITY would be
recognized, e.g.: "Time flies like an arrow." Then other factors
in the Forthmind would adopt one interpretation and discard others.
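The ambiguity check could be sketched like this, again under my own
assumed lexicon and pattern set (words may carry multiple candidate
tags, and a sentence counts as ambiguous when more than one structure
absorbs it):

```python
from itertools import product

# Hypothetical sketch: ambiguity is recognized when the same sentence
# traverses two widely different structures. The lexicon and patterns
# are illustrative assumptions, not Mind.forth's actual data.

LEXICON = {
    "time":  ["NOUN", "VERB"],
    "flies": ["VERB", "NOUN"],
    "like":  ["PREP", "VERB"],
    "an":    ["DET"],
    "arrow": ["NOUN"],
}

PATTERNS = {
    # "Time / flies / like an arrow"  (time moves quickly)
    "N-V-PP":   ["NOUN", "VERB", "PREP", "DET", "NOUN"],
    # "Time-flies / like / an arrow" (certain flies are fond of one)
    "N-N-V-NP": ["NOUN", "NOUN", "VERB", "DET", "NOUN"],
}

def readings(words):
    """Try every tagging of the sentence against every structure."""
    hits = set()
    for tags in product(*(LEXICON[w] for w in words)):
        for name, pattern in PATTERNS.items():
            if list(tags) == pattern:
                hits.add(name)
    return hits

found = readings("time flies like an arrow".split())
print(found)
print("AMBIGUOUS" if len(found) > 1 else "unambiguous")  # AMBIGUOUS
```

Once both readings survive the traversal test, some other mechanism,
as the post says, would have to adopt one interpretation and discard
the rest.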
<a href="http://www.scn.org/~mentifex/aisource.html">Mind.forth</a>.
<PRE>
/^^^^^^^^^^^\ Syntax Strings Together a Thought /^^^^^^^^^^^\
/visual memory\ ________ semantic / auditory \
| /--------|-------\ / syntax \ memory |episodic memory|
| | recog-|nition | \________/------------|-------------\ |
| ___|___ | | |flush-vector | _______ | |
| /image \ | __|__ / \ _______ | /stored \ | |
| / percept \ | / \/ \/ Verbs \------|--/ phonemes\| |
| \ engrams /---|---/ Nouns \ \_______/ | \ of words/ |
| \_______/ | \_______/-------------------|---\_______/ |
</PRE>