In article <364AA236.DBCC3373 at pop3.concentric.net>,
kkollins at concentric.net wrote:
> the conjecture is False... there're c'zillions of calculations, all occurring
> in-parallel, all the time... convergence upon verbal symbols happens only
> relatively-rarely... but one can go into the "deep"-calculation place if one,
> first, Learns how to do that... still can't attach verbal symbols to
> everything in real-time, but everything's available for one's Observation...
> it's why I like to communicate via "pictures"... which =can= be "drawn" with
> verbal symbols... and which =can= be Explored, all their
> c'zillions-of-calculations-stuff intact, by others. ken collins
> Gary Jasdzewski wrote:
> > Recently I posted to this group a question about the reasons why cognitive
> > scientists (linguists, psychologists, etc.) should pay attention to the
> > neurosciences. I haven't received many replies, sad to say. However, I
> > did run across an idea called the '100 step constraint' in a marvelous
> > book called _Speaking Minds_. The idea is that the brain is not fast
> > enough to perform more than 100 computations in something like 300
> > milliseconds, and so any cognitive theory must take this into account. Is
> > anyone familiar with this idea? Is it well known in your field? What are
> > the numbers used to calculate it?
> >
> > --
> > gary jasdzewski
> > gary at siu.edu
> > http://omni.cc.purdue.edu/~garyjaz
The number of "computations" the brain completes per millisecond is
probably up in the trillions. It depends on what you count as a
computation, but the action potential of a single cell is certainly one
(even in slow-firing cells that is about 1 Hz, and figure there are
something like 10^11 neurons)... and probably even the action of a single
ion channel can be considered a computation. In fact, although transistors
are often compared to neurons, they are nowhere near as complicated; they
may actually be more similar to channels, of which each cell has many
thousands. That is, the computational power of a few hundred cells is
probably equal to that of a powerful microprocessor, even though neurons
are often (improperly) modeled as simple threshold detectors.

The main thing that really is worse in the brain is signal transmission
speed, which tops out at roughly 100-120 m/s (a couple hundred mph),
compared to electrical signals in wires, which travel at an appreciable
fraction of the speed of light. The advantage, of course, is that the
system is massively and totally parallel, in contrast to the serial
mechanisms employed in modern computers. The brain also uses a number of
analog mechanisms which are ignored by modelers but probably compose 99%
of the internal activity of neurons; a number of other mechanisms work
over durable and complex biochemical cascades for which no real electrical
homolog exists.
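To put rough numbers on that, here is a quick back-of-envelope sketch in
Python; every figure in it (neuron count, mean firing rate, channels per
cell, axon length) is an assumption for illustration, not a measurement:

    # Back-of-envelope estimate. Every number below is a rough
    # assumption for illustration, not a measurement.
    NEURONS = 1e11            # assumed order-of-magnitude neuron count
    MEAN_RATE_HZ = 1.0        # assumed conservative mean firing rate
    CHANNELS_PER_CELL = 1e4   # assumed ion channels per neuron

    # Counting each spike as one "computation":
    spikes_per_sec = NEURONS * MEAN_RATE_HZ
    # Counting each channel's contribution to a spike instead:
    channel_events_per_sec = spikes_per_sec * CHANNELS_PER_CELL
    # The book's "100 steps in ~300 ms", expressed per second:
    hundred_step_rate = 100 / 0.3

    print("spike-level estimate:   %.1e per s" % spikes_per_sec)
    print("channel-level estimate: %.1e per s" % channel_events_per_sec)
    print("100-step constraint:    %.1e per s" % hundred_step_rate)

    # Where the brain really does lose: conduction delay over a 10 cm
    # axon at ~100 m/s versus a wire carrying a signal at ~2/3 c.
    print("axon delay: %.1e s" % (0.1 / 100.0))
    print("wire delay: %.1e s" % (0.1 / 2e8))

Even on the conservative spike-only count, the estimate comes out more
than eight orders of magnitude above the 100-step figure.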
This aside, any cognitive theory should take into account the mechanism it
is running on. This is just a matter of turning the constructs used in
psychology into real, physically observable ones. If the constructs do
not have a material basis, then they do not exist, at least not in the
form in which psychologists describe them. So some psychological theories
can be tested this way; for example, one can pursue an "engram,"
"representation," or "association" as a real physical construct, as well
as many cognitive operations. But others (e.g., schema) are probably not
reducible to observable phenomena and need to be reformulated.
Naturally, you can try to design your theory based solely on input-output
operations (i.e., as a "black box"), but this didn't work for the
behaviorists, and the problem they faced was much simpler than the one
cognitive psychologists face, so there is no reason to expect it to work
here either.
Another thing to consider is domain-generality, which for the most part
does not exist in the brain. A simple cognitive learning theory would
posit one domain-general learning system irrespective of material. In
reality, the brain has multiple specialized systems (presumably shaped by
evolution) which are material-specific, sometimes independent, sometimes
interactive, and sometimes redundant (e.g., the hippocampal declarative
system, and multiple non-declarative systems such as the basal ganglia,
amygdala, & cerebellum). A plain cognitive theory has a lot of trouble
dealing with this. Because of these systems, there are some things the
brain does not like to do (for example, associate illness with tones or
lights), and other things it can do much better than a domain-general
system would (for example, associate illness with tastes or smells)....
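As a toy illustration (not a model of any real brain system, and with
made-up numbers), here is what that cue-specificity looks like if you give
a simple delta-rule learner a different salience per cue; a truly
domain-general learner would have to use one value for everything:

    # Toy delta-rule learner. Giving each cue its own salience is a
    # stand-in for material-specific systems; a strictly domain-general
    # learner would use a single rate for every cue. Numbers are arbitrary.
    def train(salience, trials=5, lam=1.0):
        v = 0.0  # associative strength
        for _ in range(trials):
            v += salience * (lam - v)  # V <- V + salience * (lambda - V)
        return v

    # Assumed saliences: taste pairs readily with illness, tone barely does.
    print("taste -> illness: %.3f" % train(0.5))
    print("tone  -> illness: %.3f" % train(0.02))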
In any case, my guess is that you could probably show empirically that the
brain of a sea slug is capable of roughly 300 computations per second
(about what the author of the book claims for humans), so I think he is
not even in the ballpark, even vaguely. Even fruit flies probably need to
do that many computations just to fly around, never mind when they are
doing something interesting :)
Just my 2c worth
Cheers,
Stephan Anagnostaras, PhD
UCLA Dept of Neurobiology