> > I know this has lost something in the pasting into this message,
> > but I don't think it really matters since it doesn't actually
> > mean anything. It's all very well to connect things together with
> > little lines on a piece of paper (or a Usenet group) but unless you
> > can define what a 'deep concept' is,
> A deep concept is one or more ganged fibers holding a concept of
> a phenomenon by virtue of holding all the associative tags which
> define and record and remember that phenomenon.
Hmm. Now you have to define ganged fibers, concept (still), phenomenon, and
associative tags. And I'm not too impressed by the definition of deep.
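
For what it's worth, the most concrete reading I can extract from that
definition is a plain tag-index structure. The sketch below (Python) is
purely my own guess at what "ganged fibers holding associative tags"
might amount to; none of the names, types, or fields come from Mentifex:

from dataclasses import dataclass, field

@dataclass
class Engram:
    """One experiential memory trace, e.g. a single remembered sighting."""
    description: str

@dataclass
class ConceptFiber:
    """Hypothetical 'ganged fiber': a node whose entire content is the
    set of associative tags it holds over to memory engrams."""
    name: str
    tags: list[Engram] = field(default_factory=list)

    def associate(self, engram: Engram) -> None:
        # "holding associative tags over to experiential memory engrams"
        self.tags.append(engram)

Even spelled out like this, the definition stays circular: the "concept"
is nothing but a labelled bag of pointers, which is exactly the
objection above.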
> Follow links from
> http://www.scn.org/~mentifex/ to read the Nolarbeit Theory Journal,
> which, although it is enshrined in some kind of monument to mad
> scientists at Carnegie Mellon University, is nevertheless the very
> instrument by which the above-diagrammed Standard AI Mind Model arose.
>
People who live in glass houses...
> > how that is translated into something a neuron can represent and why it
> > needs to be connected to 'percept engrams' you might as well not bother.
> Just by holding (gathering? focussing?) the associative tags over to
> experiential memory engrams, a long neuronal fiber concentrates the
> concept-ness of the engrams clustered by logic and by similarity.
> For example, just by holding associative tags over to "dog" memories,
> the neuronal fiber-gang for "dog" holds the concept of "dog."
>
Conceptness? Never mind....
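
If one insists on a concrete reading of the "dog" example, it reduces to
the claim that holding the tags just is holding the concept. A minimal,
self-contained sketch of that reading (every name here is mine, not
Mentifex's):

# Hypothetical reading: a "fiber-gang" is a keyed collection of tags.
concepts: dict[str, list[str]] = {}

def hold_tag(concept: str, engram: str) -> None:
    # Holding an associative tag over to one memory engram.
    concepts.setdefault(concept, []).append(engram)

hold_tag("dog", "memory: the neighbor's terrier barking")
hold_tag("dog", "memory: a picture-book illustration of a dog")

# On this reading, the "concept of dog" is nothing over and above
# concepts["dog"] -- which is why "concept-ness" explains nothing.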