In article <417baab5 at dnews.tpgi.com.au>, John Hasenkam
<johnh at faraway.?.invalid> writes
>>"patty" <pattyNO at SPAMicyberspace.net> wrote in message
>news:0eNed.416604$mD.323949 at attbi_s02...>> John Hasenkam wrote:
>>>> > What concordance I
>> > have with behaviorism arose long before I encountered the Terrible
>Twins, it
>> > arose from reading research and finding many problems with the concepts
>of
>> > localisation, modularity, and the failure of many to recognise that
>across
>> > individuals and even within individuals how the brain handles stimulus A
>or
>> > whatever is highly variable. Strictly speaking there can be no
>understanding
>> > of brain function until we have a science of behavior. Even short
>changes in
>> > stimulus presentation can profoundly alter experimental results. We need
>to
>> > understand these types of dynamics much more precisely. Trying to
>understand
>> > how brains work independently of observing behavior is like trying to
>> > understand the function of aerodynamic surfaces on the moon.
>> >
>>>> Well we are creatures of habit. We can try to predict our behavior in
>> terms of the various ways that it changes as it responds to the
>> contingencies that it creates ... what are the independent variables ...
>> what are the dependent variables ... and how can these can be related by
>> generalities. But there is one aspect of intelligence that works
>> against such scientific predictions: The nature of intelligence is to
>> *break habits*. Yet that is the variability that your science is
>> required to eliminate from you equations. I think behaviorism needs to
>> deal with that fundamental paradox to make further progress.
>>>> patty
>>I have no idea whether behaviorism will be sufficient, I don't read
>behaviorists so I am in no position to comment. The nature of intelligence
>is to create new habits, and high intelligence often serves to break other
>peoples' habits. We need to think about:
>>Looking back over the past 40 or so years in the neuroscience
>and behavior domain, I have mixed feelings of
>satisfaction and disappointment. The disappointment
>comes in the lack of progress in understanding the way
>the hippocampus and the rest of the brain operate in
>both the larger and the smaller scale. We still do not
>really know the ways memories are created, stored, or
>retrieved. Knowledge of hippocampal functions must
>wait until a much more useful understanding of principles
>of neural organization at the whole brain level. At
>this point, I do not believe we have the conceptual tools
>necessary to approach the enormity of the problem.
>>Behavioral and Cognitive Neuroscience Reviews
>Volume 1 Number 2, June 2002
>Unsolved Mysteries: The Hippocampus
>Robert L. Isaacson
As you know John, I wasn't going to post this (although I did circulate
it privately about a month ago). But maybe it's time to post it now.
Although this post was partly prompted by a few of Alex's remarks on the
thalamus and "consciousness", it's mainly an oblique, rather general
commentary on some of the more eccentric posts and interests of others
who have things to say about "cognitive science", "cognitive
neuroscience", or "computational neuroscience" generally, in lieu of an
adequate grasp of "behavioural science". It's written as something of a
caricature and polemic, but as with all caricatures (and some polemics),
there's more than a grain of truth to it.
The first link below begins with a short extract from the introduction
section of a thesis written over twenty years ago. Today, more than half
a century after the popularization of the reticular formation and much
ado about arousal, activation, "consciousness", the EEG and much else
besides, most of the key issues and problems are still either glossed
over or missed altogether by most who purport to be talking about "the
brain" and "consciousness". Despite much expertise and experience in
biological sciences, many of those working in neuroscience do not
understand behaviour as well as they think they do (and certainly not as
well as they should given the critical role that it plays, and must play
within their discipline). This was true when I wrote the above outline
(and I was well placed to do so), and it's still true today. Sadly, even
back then the more reputable people working on LTP seemed more
sympathetic to the kind of speculative Hebbian Conceptual Nervous System
paraphernalia (which even many of the more empiricist researchers of
Hebb's day frowned upon) than to the behavioural technology that was
already available back then! Hopefully, this post goes a little way
towards explaining why people are attracted to Hebbian etc. metaphysics,
what the negative consequences are, and why, if they actually looked to
the Experimental Analysis of Behavior, they'd find where it all really
came from, but without the muddled metaphysics.
My experience is that most who claim to be talking about "the brain" and
"consciousness" today aren't really talking about this at all. Instead
they're rummaging in books or papers (sometimes around the brain itself
in vitro) *and* rummaging around in their quite separate suitcase of
metaphysical folk psychological presumptions, usually quite oblivious to
the fact that these two patterns of behaviour are not connected, and to
why there might be anything wrong with what they're assuming and
inferring. This, presumably, is why they take such great offence (or
think it odd) when others (who do work in behavioural science) suggest
that what they're saying really is unsubstantiated metaphysics and that
it can't be a productive, rational use of anyone's time.
<http://groups.google.com/groups?selm=812831857snz@longley.demon.co.uk>
<http://groups.google.com/groups?selm=tXUwKqPipsdAFw2Y@longley.demon.co.uk>
Over half a century ago, EEGs and deep electrode recordings were *the*
technologies in neurology and "brain research". About 95% or more of
that research would have been done using rats and other mammals, just as
it is today. Human studies, based on occasional clinical prods and
natural injuries, made a much smaller and (because of the nature of the
samples and the data) much less reliable contribution. A quarter of a
century ago many were still working primarily with relatively crude in
vivo recording and stimulating techniques. Today, fMRI is the new EEG,
but the old problems of interpretation (philosophy) haven't changed, so
in fact it really isn't much more than a new EEG with pictures. Half a
century ago, much was said about the Ascending
Reticular Activating System, its functional neuroanatomy and the
behavioural correlates of activity in specific and non-specific thalamic
projection systems. However, given that the thalamus comprises *the*
major set of deep afferent/sensory nuclei on the way to the cortex, I
guess that really isn't surprising. As it's at least two synapses in
from the receptors, the assumption was that it must have something to do
with the seat of the homunculus and its projection screen, the ever so
complex cerebral cortex!
What *should* be surprising, however, is how few make the appropriate
connections between the above structures and functions and what's
outlined at the beginning of the excerpt in the first link provided
above. Or then again, perhaps it isn't surprising: in my experience,
many people talking about the brain get away with the nonsense they come
out with largely because the research is so highly specialised and
arcane that most of their audience haven't got a clue whether those
talking about "the brain" really know what they're talking about, and,
if they do, how much. Their naive audience can't critically
("intelligently") evaluate what's being said to them because natural
"intelligence" is basically limited by irrational biases which come with
our folk psychological heuristics. This gives these popular authors
phenomenal degrees of freedom and licence to say whatever they like,
pretending what they say is scientific fact. Given this carte blanche,
or tabula rasa, such authors use their esoteric lexicon to construct
whatever metaphysics they wish, making it *appear* otherwise to those
who know no better. The methodology of "cognitive neuroscience" is not
all that different from that of the magician, except the latter tend to
know they're just in the entertainment business. When they don't, we
call them con-men or psychotics.
I've suggested that many in c.a.p need to take a long, careful look at
the sociological (if not geo-political) changes which have taken place
over the past few decades and consider just how radically old traditions
and mores have altered over that period. These contingencies have
dramatically changed academia, research, the professions and, last of
all and quite recently, the internet. I've also suggested that *most*
of what's peddled as "cognitive neuroscience" is actually little more
than a new genre of science fiction, and that, as such, it's merely the
latest medium onto which a seemingly ever-expanding, free-speaking,
self-obsessed "intelligentsia" projects its pernicious metaphysics.
This phrenological bump moulding and reading has always been around in
one form or another. It's just that in the past it was a naturally
limited or constrained, relatively benign manifestation and expression
of our species' sub-clinical psychoticism (except for those unlucky
enough to need to see a neurosurgeon in the days when psycho-surgery was
fashionable). Whilst more reputable researchers now accept that the
schizophrenias and affective disorders *are* just extreme variants in
the continuum of what we call "normal" behaviour (and "consciousness"),
this acceptance has actually started to present us with problems now
that we've also discovered ways of handling most of those disposed to
such "altered states" within the community rather than locking them away
as we once did for their own (and others') best interests. Whilst it is
now widely known that the prevalence of schizophrenia globally is about
1:100 and that the incidence of the affective disorders is higher, few
seem to appreciate the social or biological significance, implications
or effects of this, let alone the social consequences of other, less
debilitating, forms of behavioural diversity (e.g. the 1:20 children
whose learning disabilities have been renamed as ADD/ADHD).
<http://www.nimh.nih.gov/publicat/numbers.cfm>
<http://groups.google.com/groups?selm=41491c7d@newsfeed.netlojix.com>
Just as IQ has been renamed and, in some cultures, remeasured as
"Cognitive Ability Tests", no doubt the political agenda is that, by
rebadging old, somewhat sensitive notions, most folk won't notice the
sleight of hand (cf. intensional opacity).
This has all become an even bigger problem with academia opening its
doors progressively wider and wider over the same period, the aim
apparently being to enable up to half the population to benefit from
higher education. This inevitable redistribution in the "accreditation"
of skills or behaviours (as "multiple intelligences") is a curious
feature of this democratisation of education and our liberal (helpless?)
recognition of the extent of human diversity. The expansion of higher
education to accommodate this "creativity" (or "variability", seemingly
so characteristic of human nature) brings with it an inevitable change
to critical standards. "Intelligence" (aka "consciousness" to the naive)
can't shift or be shifted *that* way. The so-called "Flynn Effect",
often cited as evidence to the contrary, appears to be accounted for by
past anomalies at the *bottom* end of the distribution of "cognitive
ability" or IQ ('g') having been redressed through better nutrition and
other healthcare policies.
Politically correct tolerance of sub-clinical behavioural diversity and
the more disabling behavioural extremes ("disorders"), many of which,
like (predominantly male) delinquency, tend to peak in late adolescence
and early adulthood (after which incidence tends to decline though not
the prevalence), appears to have come about as a consequence of the
rather naive belief that "cognitive skills" must be good (they've been
valued in the past in a minority at the high end of the distribution),
and therefore everyone in a democratic society should have some or more
of them (whatever they are)! This is, alas, democracy gone mad, and
whilst some have pointed out the absurdity of all this, others have
leapt in with "revolutionary" counter-arguments. Not content with the
intensional notion of "intelligence" as a single reified factor (which
only has merit when treated in its technical, extensional sense, e.g.
'g'), these liberal-minded, politically correct academics seem to have
decided to partition "intelligence" into multiple "intelligences", in
the somewhat naive egalitarian belief that this must enable more folk to
excel in different "types" of it which have somehow been neglected
through the past obsession with the dispositions making up 'g'. In this
way more people can have "cognitive abilities" - a truly laudable and
socially desirable democratic "educational" ideal. The fact that this is
all just science fiction and a corruption/misunderstanding of what 'g'
actually comprises as a hierarchical construct produced by factor
analysis seems to be beside the point to these celebrities. The "idea"
sells books, gets attention and acolytes, and *that's* all that really
matters in a world where personal success is all.
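
As a purely illustrative aside of my own (nothing here is from the
literature cited in this post), the statistical sense in which 'g'
emerges from a battery of positively correlated tests can be sketched in
a few lines of Python. The correlation matrix is invented, and a first
principal component is used as crude shorthand for a proper hierarchical
factor analysis:

import numpy as np

# Hypothetical correlation matrix for four ability tests; all correlations
# positive (the "positive manifold" that factor analysis summarises).
R = np.array([
    [1.00, 0.55, 0.48, 0.42],
    [0.55, 1.00, 0.50, 0.45],
    [0.48, 0.50, 1.00, 0.40],
    [0.42, 0.45, 0.40, 1.00],
])

eigvals, eigvecs = np.linalg.eigh(R)   # eigenvalues in ascending order
g_val = eigvals[-1]                    # largest eigenvalue
g_vec = eigvecs[:, -1]
if g_vec.sum() < 0:                    # fix the arbitrary sign of the eigenvector
    g_vec = -g_vec
loadings = g_vec * np.sqrt(g_val)      # loadings on the first (general) factor

print("share of variance carried by the first factor:", round(g_val / R.shape[0], 2))
print("loadings of the four tests on that factor:", np.round(loadings, 2))
# One dominant factor carries more than half the total variance, and every
# test loads on it positively; that is the hierarchical, extensional
# construct being carved up when "multiple intelligences" are declared.

Run it and you get a single factor accounting for more than half the
variance, with all four (invented) tests loading on it positively. The
point is only that 'g' is a product of the correlational structure, not a
menu one is free to repartition by fiat.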
This intensional nonsense, and more like it elsewhere, is today being
reinforced through liberal arts programmes supported by popular science
media, all of which, coupled with the democratisation of education and
affirmative action policies, encourages the growth and further funding of
regressive "filler" services such as "cognitive science". These subjects
have burgeoned since the 1970s more as a response to democratic market
forces than as a legitimate function of any scientific or practical
merits which can be gauged by predictive utility, control over variables
or technological usefulness.
What needs to be focused on are the relational measures of "intelligence"
as classes of behaviour, with particular attention to the actual
*measurement scales* and how these need to move from their present, at
best ordinal, level to interval-level measures in the near future. In
animal models,
one aspect of this is the difference between absolute and relative
*rates* of response in the higher order discriminated operants studied
by behaviour analysts on the one hand and the behavioural geneticists on
the other. Here, there needs to be a closer examination of the research
programme consequences of focusing on absolute over relative rates. In
human work this is touched upon in the analysis of individual
differences in measures of Reaction Time and Inspection Time in the msec
range. Whilst it is not what most people usually expect to hear when
they ask what the "concept of intelligence" refers to, they might be
wise to look into what differentiated Herrnstein's approach to
behaviour analysis from that of Skinner, and the programmes of work
which followed (or, alas, didn't) as a logical consequence. This, in my
view, is where Tversky and Kahneman's work has its scientific roots.
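
To make the absolute/relative rate distinction concrete, here is a
minimal sketch of my own (the figures are invented, and a two-alternative
concurrent schedule is used simply because it is the textbook setting for
Herrnstein's matching relation between relative response and
reinforcement rates):

def relative_rate(a: float, b: float) -> float:
    """Proportion of the total allocated to the first alternative."""
    return a / (a + b)

# Invented data: absolute response rates (responses per minute) on two
# concurrently available operants, and the reinforcement rates obtained.
responses   = {"left": 42.0, "right": 14.0}
reinforcers = {"left": 3.0, "right": 1.0}

rel_b = relative_rate(responses["left"], responses["right"])
rel_r = relative_rate(reinforcers["left"], reinforcers["right"])

print(f"relative response rate (left):      {rel_b:.2f}")
print(f"relative reinforcement rate (left): {rel_r:.2f}")
# The two proportions match (0.75 here), and it is this *relative* measure
# that stays orderly while the absolute rates drift with motivation,
# deprivation and the like; which of the two a research programme takes
# as its datum is exactly the choice alluded to above.

Nothing here depends on the particular numbers; the sketch just shows
what is being ratioed when one talks about relative rather than absolute
rates of response.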
Instead of sound research, what we see today is the effect of market
forces in a free society where metaphysics sells books, fills lecture
theatres, creates employment, and has become politically de rigueur. It
persists because it is needed by lots of confused adolescents and young
adults who, despite some lip-service respect for *science* and
technology, appear unable to cope with the discipline, restraint and
austerity demanded by the extensional stance. Knowing this, publishers
and conference organizers profit from a demand for an interminable
traffic in cognitivist rhetoric and metaphysics. One can see these
forces at work everywhere today, and newsgroups are no exception. In
fact, here it's perhaps more obvious, as the behaviour is under less
social restraint: the authors think that they're writing in private.
It's an interest in virtual science, not science per se, ie the
problem with cognitive and computational neuroscience is that it's all
in the mind.
Most of what's written (here and elsewhere) is, I suspect, really just
an inconsequential expression of this curious disposition of ours and is
quite pointless. Despite self-promoting, exploitational ignoramuses
claiming otherwise, this politically correct, fashionable "theoretical"
nonsense hasn't actually *replaced* behavioural science and its
technology within neuroscience, the professions or anywhere else where
quantitative accountability actually matters. The young, the naive and
those caught up in the contingencies which control economic and social
security are consumed by this phenomenal waste of time and resources,
whilst, in reality, productive and accountable research and successful
practices in the applied professions continue to depend on the
technologies produced by behaviour analysis (although those with
personal interests at stake, ie charlatans and ignoramuses, arrogantly
ignore these facts as one might expect). These people don't know what's
actually done, or if they do, pretend otherwise so long as their
clients/audiences want something else.
Many of the academics and popularists promoting "cognitive science"
don't, can't or won't look very closely at what's actually done in the
procedures they purport to be referring to, so they don't, can't or
won't see how they plagiarise and misrepresent the facts of what's done
as a consequence. Nor do they seem to notice the extent to which the
celebrities (whom so many take as their role models) have *other people*
doing their bench work for them. This is yet another negative
consequence of the distributed responsibility which characterises these
multi-disciplinary teams, and the inevitably (un)natural selection of
loud-mouthed, arrogant front men and women who often get to lead them.
These people tend to excel more in unscrupulous personal politics and
technical *incompetence* than in the areas within which one might expect
scientists to excel, ie in remaining true to the empirical evidence and
not going beyond it (even if that doesn't win funding!). This is an
almost impossible demand on anyone in such positions, alas, so one can
see how these positions come to shape the very behaviour required to
survive in them.
The problem is that although many will recognise the degree of truth in
this caricature, most don't know what can be practically done to curtail
it or better manage it. Their lives are caught up in a complex web of
obligations which prevent them from doing anything unless they have
tenure or some other form of security, and by the time most have either,
the majority have long since given up on rocking the boat anyway; after
all, that's why they have tenure, and they've seen what happens to those
who don't play the game!
In recent decades there's been a lamentable burgeoning and commercial
exploitation of speculative metaphysics (masquerading as "creative
thinking" or "hypothesis testing") largely *because*, in my view, this
fills a *popular*, almost religious, vacuum. The biases, distortions and
other unfortunate consequences of epistemological anarchism used to be
kept under better restraint by empirical contingencies grounded in
prediction based on control over variables. But in "Cognitive Science"
these are weak to the point of being non-existent. Science today is
clearly influenced by contingencies or "market forces" which are
different from those which prevailed even thirty years ago. The older
contingencies, shaped and reinforced by selective pressures on 'g',
whilst divisive and elitist, are today being replaced by far more
cynically managed "contrived reinforcers" in pursuit of profit even if
there is no tangible product. It's all becoming more and more virtual.
Science has become commercialised (like everything else), and we're
seeing a return to the darker days of medieval mentalism which was only
temporarily restrained by the Positivist movement's reaction to the
cognitivist excesses of the beginning of the last century. The great
wars of the last century seemed to serve as purges in more ways than
one. The second link above includes a somewhat measured warning from the
last, and perhaps most influential, positivist of the last century. I
suggest those who have read this far give it some serious thought;
Skinner was even more explicit in expressing his concerns for our
future.
I've suggested that those reading c.a.p (at least) look sceptically at
the claims of "cognitive neuroscientists", and I've paraphrased and
(perhaps somewhat fancifully) elaborated on Skinner's suggestion that
rather than the "revolutionary movement" which it purports to be,
"cognitive science" is actually little more than a refuge for
practically unproductive, dispossessed mentalistic philosophers,
psychologists, linguists & other disaffected academics who had the
metaphysical wind taken out of their aspirations by a relatively small
(but disproportionately influential) group of non-mentalistic
philosophers, logicians and behaviour analysts pursuing an austere line
of research throughout the 30s, 40s, 50s and 60s, and which found solid
(and for some, very disturbing), empirical support from extensive
empirical research within mainstream "cognitive" psychology in the 70s,
80s and 90s. A Nobel was awarded to one of the *living* contributors in
2002 (the asterisking is an oblique reference not to the late Amos
Tversky, but to Richard Herrnstein). One can, with some caveats perhaps,
look to behaviour analysts like Rachlin for some useful contributions to
this line of research.
Making the 90s "the decade of the brain" steered research funding in
neuroscience's direction and set the bandwagon rolling. Coupled with the
commercial exploitation and hype of monoamine research by the
pharmaceutical industry (which also drives much of the research), it has
become even harder for "the new intelligentsia" to see the truth behind
these trends, and the extent to which "cognitive science" is little more
than commercial hype and adolescent creative writing well beyond
anyone's control. Ironically, calling the present decade "the decade of
behavior" just perpetuates this sorry charade.
<http://www.journals.apa.org/prevention/volume5/pre0050023a.html>
<http://www.kent.ac.uk/kcjc/PDFfiles/George%20Mair%20paper.pdf>
<http://www.longley.demon.co.uk/Sm-97apr.pdf>
<http://www.longley.demon.co.uk/Workj97.pdf>
These days, what can one really say to the consumers (and funding
agencies) other than 'caveat emptor' ? ;-)
--
David Longley
http://www.longley.demon.co.uk