In article <fa69ae35.0402110159.5a3eee8f at posting.google.com>, Eray
Ozkural exa <erayo at bilkent.edu.tr> writes
>"AlphaOmega2004" <OmegaZero2003 at yahoo.com> wrote in message
>news:<2b87008a9fd70789cb534988f584011a at news.teranews.com>...
>> I think the conceptual error these guys make is that one cannot explain the
>> psychological terminology by reference to neuropsychological concepts.
>> These are *all* bona fide *ways* of conceptualizing a space that is
>> multi-dimensional in the broadest sense. To think one can divorce the
>> psychological from the neuropsychological is nonsense.
>The problem with these authors is that they published the book in
>2003, while it seems to me it should have been published around the
>1930s or 1940s, when Wittgenstein was worshipped by many. Ah, still the
>church of Wittgenstein thrives I guess, and I pity that I'm not a
>priest and true believer (in God) from time to time. Anyway, you get
>the idea. If it weren't for his followers' absolute conviction in the
>truth of what Wittgenstein said, there wouldn't be such a
>book. That's an important point.
The important point is that there *are* serious *theoretical* or
conceptual confusions in "Cognitive Neuroscience". Much of neuroscience
is 'just' careful anatomy, biophysics, chemistry, histology, etc., and
even some of the behavioural work is clearly sound and useful science.
The problem is that in "Cognitive Neuroscience" some seem to use the
complexity of the domain almost as a blank cheque to write whatever they
want - and because it can't be tested, or begs all the questions being
asked in another lab (often just down the corridor!), no end of nonsense
is written and published. The "storage" point has been covered here, but
there are others.
You may well think you know what "memory" is. I suspect most people
posting to this group know how flip-flops, registers and RAM cells work.
They also probably know how Turing Machines, ALUs and modern CPUs
work. Given that, why might they be concerned about the usage of the
computer as a model or metaphor in neuroscience or behavioural science?
That's the point at which you need to tell *yourself* that you may not
understand.
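To be clear about what that technical sense amounts to, here is a
minimal sketch (my own toy illustration, in Python, nothing more) of
"memory" as the engineer uses the term: a value written to an address
is later read back, verbatim, from that same address.

class RAM:
    """A toy word-addressable store, as in a register file or RAM array."""
    def __init__(self, size):
        self.cells = [0] * size

    def write(self, address, value):
        # "Storage": the value itself is kept at the given address.
        self.cells[address] = value

    def read(self, address):
        # "Retrieval": exactly the same value comes back, unchanged.
        return self.cells[address]

ram = RAM(16)
ram.write(3, 42)
assert ram.read(3) == 42   # literal retrieval - nothing is reconstructed

Whether anything in the nervous system answers to *that* description
is precisely what is at issue.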
>Note also that the authors have written some serious non-sense. In
>particular, this one:
>> "It is wrong, *conceptually* wrong, to suppose that memory is always
>of the past, or to think that memories can be *stored* in the brain in
>the form of the strength of synaptic connections (Kandel, Squire,
>Bennett)."
I don't know what they are specifically talking about here, but it might
just be that what's sometimes presented as "retrieval" or "recollection"
of specific events is not, as reported, what has actually happened - at
least not in that configuration.
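To make the contrast concrete, here is another toy sketch of my own
(purely illustrative - a Hopfield-style associative net, not anything
taken from the book): the "remembered" pattern is not fetched from any
address at all, it is regenerated by dynamics running over the
connection weights, which is one reason the "storage" and "retrieval"
vocabulary gets contested once you look at how such models behave.

import numpy as np

rng = np.random.default_rng(0)
N = 64
pattern = rng.choice([-1, 1], size=N)      # the pattern to be "stored"

# Hebbian weights: the pattern now exists only as pairwise weight changes.
W = np.outer(pattern, pattern).astype(float) / N
np.fill_diagonal(W, 0.0)

# Cue the net with a degraded copy (a quarter of the bits flipped).
cue = pattern.copy()
flipped = rng.choice(N, size=N // 4, replace=False)
cue[flipped] *= -1

state = cue.astype(float)
for _ in range(10):                        # let the dynamics settle
    state = np.sign(W @ state)
    state[state == 0] = 1.0

print("pattern recovered:", np.array_equal(state.astype(int), pattern))

Nothing is "read back" there; a state is reconstructed, and with enough
noise, or enough patterns imposed on the same weights, what comes back
can be something that was never put in.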
>As a computer scientist, I know very well what "memory" is, and *that*
>our intuitive notion of "memory" coincides with the technical term of
>memory,
How do you know that? It is this sort of 'certainty' which easily breaks
down when you look into how the two notions are actually used. It may
just be that you're being so loose in your association that it seems
this way to you. These issues tend to become even more apparent when one
actually does practical work in these areas. One of the authors is an
*accomplished* neuroscientist and I suggest you have a look at one of
his earlier books:
http://www.isanweb.org/general/historyofsynapse.pdf
>*and* furthermore that it also corresponds to "memory" in
>neural associative memory models, etc. That is to say, there is *no*
>conceptual mistake. And I also direct your attention to the stupid
>blurb about "the past". Oh, memory is not of the past, great, then
>perhaps it is of the future? Trust me, this is non-sense. I agree with the
>authors in that a good philosopher must be able to distinguish sense
>from non-sense, but so must a scientist, and from both perspectives
>their statement is pure non-sense.
>
This, as I say, may simply reflect your limited knowledge of how the
term "memory" is used and misused by different people. I think one only
really starts to get a headache over the usage of the term when one
really gets down to dealing with the practicalities. In Computer
Science, like it or not, one tends to operate from the Intentional
Stance, even when one is working with CPU architectures. You are, in
effect, relying on how one naturally thinks when fabricating or working
with the technology - just as one does when one works with film, video,
TV etc. The actual components can be dealt with in physical, extensional
terms, but when considering how they are put together, you take the
Intensional Stance as an assumption - i.e. the technology is tailored to
fit - i.e. to sell!
>That's why I think they sounded more like village idiots than
>philosophers when they said such things. They say even sillier things
>about computationalism, as Pierre demonstrated; you can knock down
>their arguments with a single analogy.
You have this unfortunate propensity to disparage what you don't
understand. I've commented elsewhere on how this intolerance of what one
does not understand is just one element of a class of behaviour which,
whilst all too human, is odious none the less.
>Regards,
>--
>Eray Ozkural
--
David Longley