Bill Modlin <modlin at concentric.net> wrote a clear and fascinating post:
[ First, the title was excerpted from a post by Aaron Sloman. - Mentifex]
>Thomas Gigear:
>>Also, I just realized, isn't the brain somewhat digital on at least
>>one level? [SNIP]
>Gary Forbis:
>>I could be wrong about this. It seems to me too much is made of
>>the "digital" firing of neurons. There is another domain, time, in
>>which they are very analog. [SNIP]
>Kin Hoong:
>>Anybody recall the Perceptron? It was designed on the assumption that
>>neurons were digital devices... then somebody whose name escapes me
>>proved that they could not do all that much. Neurons are analogue
>>devices. I seem to recall that even if they fired, the amount of
>>chemicals released into the synapse is not always the same (dependent
>>on quite a few things), and even if it were, reception is not always
>>the same, depending on how much stimulation the receptor had just
>>received, etc. Sorry I can't remember the details, but they should be
>>available somewhere.
>+++++
>
>Two things: [ Now begins Modlin's treatise. - ATM/Mentifex ]
>
>First, about the supposed limitations of Perceptrons.
>
>Minsky and Papert showed that a SINGLE STAGE (or "layer") of
>perceptrons is limited to computing linearly-separable functions.
>
>It is a trivial result, not nearly worth the attention paid it.
>Unwarranted emphasis on the point retarded progress in neural
>modelling for 20 years, and even now people who quote it without
>understanding continue to propagate the damage it has done.
>
>The same limitation applies to ANY set of primitive functions, whether
>Perceptrons, arithmetic operations, or things like AND/OR/NOT boolean
>logic gates. In one stage you can only compute those primitive
>functions. To compute more functions you have to combine the outputs
>of the first steps with more of the primitives, producing a
>multi-stage computation.
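To see the point concretely, here is a small Python sketch (my own
illustration, not from Modlin's post): a brute-force search over a
grid of weights confirms that no single threshold unit reproduces
XOR, which is not linearly separable, while a two-stage network of
the very same units computes it.

    def unit(w1, w2, bias, x1, x2):
        # One threshold "perceptron": fires iff the weighted sum exceeds 0.
        return 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

    cases = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]   # XOR truth table

    # Stage one: no single unit on this weight grid (or, provably, on any
    # real weights at all) matches all four XOR cases.
    grid = [-2, -1, 0, 1, 2]
    single = any(all(unit(a, b, c, x, y) == t for x, y, t in cases)
                 for a in grid for b in grid for c in grid)
    print(single)   # False

    # Stage two: feed the outputs of two first-stage units (OR and NAND)
    # into a third unit (AND), and XOR falls out.
    def xor(x1, x2):
        h1 = unit(1, 1, 0, x1, x2)        # OR: fires unless both inputs are 0
        h2 = unit(-1, -1, 1.5, x1, x2)    # NAND: fires unless both inputs are 1
        return unit(1, 1, -1.5, h1, h2)   # AND of the two intermediate outputs
    print(all(xor(x, y) == t for x, y, t in cases))   # True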
>
>Adding more devices lets you compute more complex functions.
>Perceptrons with non-linear or thresholding outputs are boolean
>complete, so you can compute any computable function with a network of
>them, just as you can with NAND, or with AND/OR/NOT, or with many
>other sets of primitives.
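For instance (again my sketch, not Modlin's): one thresholded
perceptron realizes NAND, and NAND alone suffices to build NOT, AND,
and OR, hence any boolean function.

    def nand(x1, x2):
        # NAND as a single thresholded perceptron: fires unless both inputs are 1.
        return 1 if -x1 - x2 + 1.5 > 0 else 0

    def NOT(x):    return nand(x, x)
    def AND(x, y): return nand(nand(x, y), nand(x, y))
    def OR(x, y):  return nand(nand(x, x), nand(y, y))

    print(NOT(0), AND(1, 1), OR(0, 1))   # 1 1 1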
>
>Technically, one extra layer of intermediate functions between inputs
>and outputs is enough to compute anything. This canonical form is
>inefficient, and practical circuits use many stages.
You mean a little sandwich of three layers can compute *anything*?
(Would that be like the ones and zeros on a Turing machine tape?)
That canonical form must be reiterable, so that you may simply
concatenate whatever it is you want to compute -- is that how it
works?
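As best I can tell, yes, for boolean functions of a fixed set of
inputs: the canonical form is essentially a disjunctive normal form.
A sketch of my own, reusing threshold units as above: one hidden unit
per input pattern that should yield 1 (a "minterm" detector), plus an
output unit that ORs the hidden layer together. The catch is that the
hidden layer may need exponentially many units, which is the
inefficiency Modlin mentions.

    def canonical(truth_table):
        # truth_table: dict mapping input bit-tuples to 0 or 1.
        # Returns a two-stage threshold network computing that function.
        minterms = [bits for bits, out in truth_table.items() if out == 1]
        def f(*x):
            # A hidden unit for minterm m fires only on the exact pattern m:
            # weight +1 where m has a 1, weight -1 where it has a 0.
            hidden = [1 if sum((1 if m_i else -1) * x_i
                               for m_i, x_i in zip(m, x)) > sum(m) - 0.5 else 0
                      for m in minterms]
            return 1 if sum(hidden) > 0.5 else 0   # OR over the hidden layer
        return f

    xor = canonical({(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0})
    print([xor(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])   # [0, 1, 1, 0]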
>Second, about the analog/digital confusion:
>
>All devices, including the logic gates of a computer, are "analog" in
>basic construction. The class of digital devices is distinguished by
>having discontinuities in its mapping between inputs and outputs, so
>that the output switches states abruptly as the computed function
>crosses decision boundaries. Typical computer logic gates and neurons
>all share this property of having discrete decision boundaries, and so
>all are properly classified as digital devices.
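A toy contrast (mine, with illustrative numbers loosely modeled on
5-volt logic): both functions below are built from the same continuous
arithmetic, but the second has a decision boundary, and that
discontinuity is what makes it digital in Modlin's sense.

    def analog(v):
        # Continuous mapping: a small input change gives a small output change.
        return 0.8 * v

    def digital(v, threshold=2.5):
        # Discontinuous mapping: the output snaps between states at the boundary.
        return 5.0 if v > threshold else 0.0

    for v in (2.49, 2.50, 2.51):
        print(v, analog(v), digital(v))
    # The analog output barely moves; the digital output jumps from 0.0 to 5.0.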
During a six-hour face-to-face discussion with a netgod of
comp.ai.philosophy on Sun.7.Mar.1999, the absence of Bill Modlin from
the newsgroup came up, so I made sure to read this quoted post -- and
it turns out to be one of the most informative Usenet posts that I
have ever read. Thank you, Bill Modlin. - Arthur M.
>Dynamic digital circuitry depends on the timing of signals for its
>function.
>
>In a computer, time is broken into "clock cycles" and functions are
>computed based on signals presented together in a single cycle. For
>neural circuitry each cell provides its own asynchronous timing, but
>functions are still based on the coincidence of signals in some
>definite interval of time. Adjustment of the interval to suit
>conditions encountered is part of the learning process implemented by
>a cell, and this is an analog process.
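A toy model of that idea (my own sketch, not a claim about real
cells): a unit that fires only when enough input spikes coincide
within its integration window. Widening or narrowing the window is a
graded, analog adjustment, yet the firing decision remains
all-or-none.

    def fires(spike_times, window=5.0, needed=3):
        # True if `needed` spikes fall within any interval of length `window` (ms).
        ts = sorted(spike_times)
        return any(ts[i + needed - 1] - ts[i] <= window
                   for i in range(len(ts) - needed + 1))

    print(fires([1.0, 2.5, 4.0, 30.0]))                  # True: three spikes coincide
    print(fires([1.0, 10.0, 20.0, 30.0]))                # False: none close enough
    print(fires([1.0, 10.0, 20.0, 30.0], window=25.0))   # True after widening the window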
>
>The variable releases of chemicals you mention are the variable
>connection weights, and their variability is also important to
>learning. The weight adjustments are also reasonably considered
>analog.
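The classic perceptron learning rule shows the same split (a sketch of
mine, not Modlin's account of real synapses): the weight adjustments
are small continuous increments, while the unit's output stays a
discrete 0/1 decision.

    def train(samples, rate=0.1, epochs=50):
        w1, w2, bias = 0.0, 0.0, 0.0
        for _ in range(epochs):
            for (x1, x2), target in samples:
                out = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0   # digital output
                err = target - out
                w1 += rate * err * x1      # continuous ("analog") adjustments
                w2 += rate * err * x2
                bias += rate * err
        return w1, w2, bias

    # Learn AND, which is linearly separable, so a single unit suffices.
    w1, w2, b = train([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)])
    print([1 if w1 * x + w2 * y + b > 0 else 0
           for x, y in [(0, 0), (0, 1), (1, 0), (1, 1)]])   # [0, 0, 0, 1]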
>
>But all these analog processes feed into a decision surface with
>discontinuous decision boundaries for output, so the overall function
>is still properly called digital.
>
>Bill Modlin