"Jeffrey Kazuo Yoshimi" <yoshimi at grad-ea-6.oac.uci.edu> wrote in message
news:Pine.OSF.4.30.0102051315330.11515-100000 at grad-ea-6.oac.uci.edu...
>
> Hi,
>
> I am a philosopher studying consciousness and the brain, which tends to be
> sort of annoying (the experimentalists do all this hard work and then
> arm-chair cowboys speculate about it and get it wrong in the process) but
> I appeal to your patience and ask some questions.
>
> Assume that for a brain to produce conscious experience, it is necessary
> that some set of conditions c1..cn obtain. Take away any ci from a
> consciousness-producing brain, and the brain in question no longer
> produces consciousness. Of course we could not prove that any such
> condition is necessary, but I'm not looking for proof, just empirical
> hypotheses about what such conditions might be.
>
> I would guess that some such conditions might be:
>
> c1: At least some number n of neurons must be firing (can there be a
> conscious brain where no neurons fired?)
>
> c2: These n neurons must fire above some threshold. (do m > n barely
> active neurons give rise to consciousness?)
>
> And so forth.
>
> Perhaps this way of thinking about the question is wrong. If so, why? If
> not, are there speculations on how many neurons are minimal for
> consciousness, and/or what threshold must be surpassed? Are there other
> hypothesized conditions ci? Perhaps glial cells do something that is
> essential, perhaps a medium which allows an electrical field surpassing
> the local field is essential (i.e. imagine a brain with firing neurons but
> silicon replacements for the glial cells or some sort of insulation
> surrounding all the neurons).
>
> So I descend into the philosopher's most dangerous luxury--thought
> experiment.
>
> If any of this stuff is answered in any existing text (I was thinking
> anesthesiology textbooks might be useful), or is speculated about in any
> way, or is completely wrongheaded, I'd like to hear about it.
>
> Thanks for your patience!
>
> Jeff Yoshimi
> UC Irvine
>
> PS: Another good question, to me. Could sufficient conditions s1..sn be
> specified such that any brain meeting them produces consciousness? This
> would be like a recipe for a consciousness-producing brain (is
> "consciousness-producing" an acceptable phrase?) which would encompass
> the necessary conditions above.
There is a fundamental assumption in your thinking that is incorrect:
that neurons "firing" above some "threshold" is even an appropriate
measure of brain activity. Action potentials are extremely useful if
you really have to send neural information over any large distance,
say 1 mm or more. But over shorter distances, synapses can function,
synaptic inputs can integrate to perform "information processing",
transmitter can be released, and all the plasticity you need can
occur, all without action potentials.
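To make that concrete, here is a toy numerical sketch (my own
illustration, with made-up parameters, not taken from any physiology
text): a leaky membrane that integrates weak synaptic input, never
reaches spike threshold, and still produces a graded output you could
read as, say, graded transmitter release.

import numpy as np

# Toy sketch, purely illustrative (arbitrary parameters): a leaky
# integrator driven by synaptic input that stays below spike threshold
# the whole time, yet still integrates its inputs and produces a
# graded "output" proportional to depolarization.

dt = 0.1           # time step (ms)
tau = 20.0         # membrane time constant (ms)
v_rest = -70.0     # resting potential (mV)
v_thresh = -50.0   # nominal spike threshold (mV); never reached below
v = v_rest

t = np.arange(0.0, 200.0, dt)
# weak oscillating synaptic drive (mV/ms), too small to reach threshold
syn_drive = 0.3 * (1.0 + np.sin(2.0 * np.pi * t / 50.0))

graded_output = []
for i_syn in syn_drive:
    # leaky integration of synaptic input; no action potential is fired
    v += dt * (-(v - v_rest) / tau + i_syn)
    # graded release: a monotonic function of depolarization above rest
    graded_output.append(max(0.0, v - v_rest))

print("peak depolarization: %.1f mV (threshold is %.1f mV above rest)"
      % (max(graded_output), v_thresh - v_rest))

With these numbers the depolarization peaks around 8 mV above rest,
well short of the 20 mV needed to spike, yet the "output" tracks the
input just fine.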
The "microcircuits" that form within, say, the cortex generally
involve thousands of synaptic interactions between thousands
of neurons within each cubic millimeter. It is not really clear
how relevant action potentials are to this situation.
That is a problem in your thinking about physiology. There are other
problems in your approach. If the brain truly is a cellular piece of
'machinery' that works according to physical laws and also serves
as the substrate for 'mind' and 'consciousness' and all that, then in
some sense the processes are algorithmic (albeit with substantial
random and time-varying components). So the "processes" associated
with consciousness could be simulated or replaced with mathematics
or with mechanical/electronic analogs. That is the basis of "artificial
intelligence". It may theoretically produce a form of something that
meets your definition of consciousness with resembling in the slightest
the machinery that really exists in a biological brain.
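As a crude illustration of that point (my own toy example, with
hand-chosen weights, nothing to do with real neural data), here is an
abstract threshold-unit network that computes a simple input-output
mapping, XOR, with none of the biology left in it:

import numpy as np

# Minimal sketch, again purely illustrative: the same input-output
# mapping can be computed by an abstract mathematical model that has
# none of the biological machinery: a tiny fixed-weight
# threshold-unit network computing XOR. Weights are hand-chosen.

def step(x):
    return (x > 0).astype(float)

W_hidden = np.array([[ 1.0,  1.0],   # OR-like unit
                     [-1.0, -1.0]])  # NAND-like unit
b_hidden = np.array([-0.5, 1.5])
w_out = np.array([1.0, 1.0])
b_out = -1.5

for inputs in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    h = step(W_hidden @ np.array(inputs, dtype=float) + b_hidden)
    y = step(w_out @ h + b_out)
    print(inputs, "->", int(y))

Whether anything like that, scaled up, would ever count as conscious is
exactly the question at issue.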
So is consciousness a particular form of brain activity, embodied in
specific cells that work with a specific biochemistry and biophysics,
or is it some form of elaborate information processing involving the
usual paradox-introducing notions of recursion and self-reference?