Ripped from 'Nature', Vol 381, 9 May 1996, p. 97.
Though I doubt the necessity of investigating the complex
phenomenon of 'consciousness' in its own right (I regard it as a
process naturally emerging from the statespace evolution of
large-scale, evolution-artefacted NNs, _being_ the system's
evolution, in fact, and thus plead for a strong focus on the
bottom-up approach instead), the following quite fresh text
elucidates what's currently going on in _interdisciplinary_
neuroscience. As an aside, I guess this microessay wasn't
written by a 'wet' neuroscientist. Too little tech talk for that.
{{my comments in curly doublebrace}}
'gene
________________________________________________________________
Reductionists lay claim to the mind.
There are deep disagreements about how science should deal with
the problem of consciousness. But neuroscientists should soon be
in a position to set the agenda. The result is likely to be
rapid progress.
Although the object of their study is also the organ that
produces consciousness, neurobiologists are often reticent about
the topic, given their failure so far to reach any agreement
about how (if at all) this phenomenon should be studied. But
they have recently begun to raise their heads above the parapet,
to join equally rash physicists, philosophers, psychologists and
computer scientists. This community is united in accepting that
consciousness has something to do with the brain (and that
neurobiology might therefore be relevant), but in little else.
Perhaps the most basic question is whether existing physical
principles are sufficient to explain the brain. {{-- Occam's
razor??}} The boldest attempt to 'rescue' consciousness from
neurobiological reductionism comes famously from Roger Penrose.
{{the same Penrose who has such trouble believing in Darwinian
evolution?}}. He argues that the brain performs feats of
understanding that cannot be computed {{so he has tried, has
he?}} and so cannot be implemented by any system based on known
physical laws. Instead, he believes, explaining the brain will
require a new theory linking events at the quantum level {{why
not invoke bloody SUSY or singularity theory?}} with those in
the macroscopic world. Adherents to this view invoke key bits of
cellular infrastructure -- microtubules -- as the most likely
substrate for coherent quantum-level phenomena {{and if it is
not microtubules it is obviously vis vitalis}}, but most
neurobiologists are unimpressed. It remains to be demonstrated
that quantal processes might show unusual manifestations in
microtubules and it is difficult to see how they could influence
or be influenced by neurotransmitter release or electrical
depolarization of neuronal membranes, events with undoubted
relevance to mental processes. {{hear, hear}}
Most importantly, there is no direct evidence that current
physical concepts are inadequate to explain the brain, or that a
quantum-level theory would be any better. Penrose believes that
neurobiologists will indeed encounter phenomena that cannot be
accounted for by known principles, perhaps even in relatively
simple nervous systems. But, unless this happens,
neurobiologists might best treat his argument as an obstacle to
be side-stepped rather {{It is Penrose's problem if he wants to
play the jester's part. But people _listen_ to Penrose. He
cannot be left unopposed, even if this may mean shooting oneself
in the foot by granting him the status of an opponent}} than a source
of illumination. That certainly seemed to be a majority response
to his views at a multidisciplinary meeting on this topic
("Towards a science of consciousness") last month in Tucson,
Arizona, which provided a snapshot of current debates.
If the brain can be entirely explained in terms of known
physical processes, then perhaps consciousness has no
independent causal effects. How then was it favoured by natural
selection? Perhaps it wasn't. Consciousness may simply be an
epiphenomenon of certain arrangements of matter, or indeed of
all matter {{he cannot be serious, can he?}}: as one speaker at
the conference remarked, perhaps even an atom feels a spasm of
relief {{this means pushing the meaning of 'feeling' and
'relief' a bit far}} as an electron falls to a lower orbital.
Complex brains would presumably generate more complex forms of
consciousness as a by-product of their {{slightly, surely only
slightly}} more sophisticated computations. But this is not very
helpful in explaining why (say) the colour red feels the way it
does {{ask a Daltonist about that one}}, or in deducing
from the neurobiology of echolocation how it feels to be a bat,
{{frankly, this question is entirely meaningless. If you knew
what it would feel like to be a bat, then for all practical
purposes you _would_ be a bat. Fancy reasoning about
representational systems and processing one simply does not have
is wishful speculation at best}} or in deciding whether it is
unethical to unplug a computer {{not yet. Soon, though. ALife
and stuff}}. David Chalmers (whose book 'The Conscious Mind' is
reviewed on page 123) refers to this class of questions as 'the
hard problem'; unlike the (relatively) easy problem of
explaining how the brain generates behaviour, the experiential
aspects of our brain states seem to defy any conceivable
explanation in terms of known concepts.
But not everyone accepts this 'hard/soft' distinction; for one
thing, it tends to marginalize neuroscience by implying that it
can never hope to illuminate the 'hard' problem. Being
marginalized, however, does have the advantage of leaving
neuroscientists free to tackle the 'easy' problems untroubled by
philosophical angst. {{yeah... give 'em philosophers a crash
course in reality. Seem to need it, sometimes. The whole long
history of philosophy is riddled with glaring errors, based on
their apparent inability to realize that common knowledge and
logical reasoning, as opposed to the scientific method, are much
too weak to chart 'reality'}} While the full
experiential aspect of consciousness may be impossible to define
and to measure, an 'easy' definition is both feasible and
useful: to be conscious of something is to have a flexible
neuronal representation that can be used to drive many different
behavioural outputs. In an experimental situation, this can be
an arbitrarily specified button press or (for humans) a
statement that one is conscious. The subject's testimony can
then be correlated with the accompanying neural activity, either
by functional brain imaging {{one extremely interesting, quite
recent approach here is to record high-res spatiotemporal
patterns by magnetoencephalography (MEG), using a collage/mosaic
of multielement high-temperature superconducting detector arrays.
Data stream avalanche. By means of some quite simple maths, a
noninvasive hi-res image of spatiotemporal firing activity, an
extremely detailed signature/fingerprint, can be
constructed}} or (in monkeys trained to report their
perceptions) by recording the activities of individual neurons.
{{Frankly, I don't think an activity requiring the concerted
firing of MNeurons, at the least, is visible at the single-neuron
functional level}}.
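{{To make the 'easy' programme above concrete, here is a minimal
sketch, in Python on synthetic data, of what 'correlating the
subject's testimony with the accompanying neural activity' can
look like in practice. All names and numbers here (reported_seen,
spike_counts, 200 trials) are my own illustration, not anything
taken from the article.

import numpy as np

rng = np.random.default_rng(0)
n_trials = 200

# Synthetic experiment: 200 presentations of a near-threshold
# stimulus. The subject presses a button when the stimulus is
# consciously seen (1) and not otherwise (0); on every trial we
# also record a spike count from one neuron (or, equally, one
# MEG sensor's response amplitude).
reported_seen = rng.integers(0, 2, size=n_trials)
baseline = rng.poisson(10.0, size=n_trials)
extra = rng.poisson(4.0 * reported_seen)  # extra firing on 'seen' trials
spike_counts = baseline + extra

# Point-biserial correlation between the subject's testimony and
# the neural response: a crude 'neural correlate' of the percept.
r = np.corrcoef(reported_seen, spike_counts)[0, 1]

seen_mean = spike_counts[reported_seen == 1].mean()
unseen_mean = spike_counts[reported_seen == 0].mean()

print(f"report/firing correlation: r = {r:.2f}")
print(f"mean count, 'seen' trials:   {seen_mean:.1f}")
print(f"mean count, 'unseen' trials: {unseen_mean:.1f}")

The same logic carries over to imaging: replace the single spike
count by a vector of MEG sensor readings or voxel activations and
ask which of its components covary with the report.}}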
So far, most such studies have employed visual stimuli, but
other approaches, along with conditions such as sleep,
anaesthesia, and perhaps even trance, offer further
opportunities to search for neural correlates of conscious
states. The underlying assumption is that systematic
correlations are there to be found, and that conscious states
correspond not to some loosely defined global state of the
entire brain {{need not be too loosely defined, does it?}},
but much more precisely to the activities of specific
populations of neurons. {{ah, 'populations'. A perfectly
specified term. Has a dynamic range from 10 to 10^11 neurons.
Just about perfect.}}
Whatever the nature of these patterns of activity, credit is due
to Daniel Dennett {{him, again. omniubiquitous guy.}} for warning
against the 'Cartesian Theatre' fallacy that perceptual
representations converge on a single hypothetical 'perceiver'.
{{homunculus theory, still alive?!}}. Our current knowledge of
the brain gives no support to such a view; visual
representations, for instance, are distributed across many brain
regions. Similiarly, Dennet doubts the existance of any 'central
meaner' that decides what we mean to say and do. Instead, he
envisages that multiple neuronal representations are constantly
competing for control of both perception and action. And to
those who reply that 'it just doesn't feel like that', he
rightly warns against the assumption that a few minutes of
introspection are sufficient to test a theory of brain function.
{{the 'privileged observer' stuff. Alas, introspection does not
give one 'wheel' rights}}
Those who still think that their intuitions are a reliable
guide to how their minds work should reflect on the condition
known as anosognosia. This rare syndrome arises after damage to
the right parietal lobe, and leaves patients not only paralysed
on the left side of their bodies but also unable to acknowledge
the fact. When asked to perform tasks with the paralysed hand,
they confabulate, and many will also deny the paralysis of a
fellow patient. And as if this were not bizarre enough, the
false belief (though not the paralysis) can be temporarily
corrected by flushing cold water into the left ear. There is
reason to believe that anosognosics have suffered damage to a
belief 'mechanism' that normally updates our world-view in the
face of new information; it should now be possible to identify
the brain regions involved.
As neuroscience sheds light on perception, memory, emotion,
decision-making and belief, it will become increasingly
difficult to deny its central importance in understanding
consciousness. Many other approaches will come to seem
correspondingly irrelevant. But whether neuroscientists will be
able to make their discoveries 'feel' like an explanation for
what goes on inside our heads, whether we will ever be able to
reconcile their findings with our intuitions about our own
mental lives, seems less certain. To meet that challenge,
neuroscience may yet be grateful for the philosophers' help.
{{yeah, and go ask some psychoanalysts, also}}.