"F. Frank LeFever" wrote:
> This is the great MYSTERY, which may be too fundamental to be resolved:
> our private "experience". We can tell other people that we have it,
> and other people can tell us that they have it, but we can do so only
> by saying something like "it's the way you feel when you put your hand
> in cold water" (etc., etc.)--i.e. by reference to some external
> mutually observable event.
The Chinese Room, ex hypothesi, proffers the same sorts of referential
descriptions.
There are all sorts of possible Chinese Rooms. There are Chinese
Rooms that respond, absurdly, as though they were 13-year-old
Chinese ballerinas. There are Chinese Rooms that respond with highly
sophisticated philosophical explanations as to how a "mere" room could
actually comprehend Chinese. Rooms that make fervent pleas for you to
believe that they are really conscious, and rooms that are quite
indifferent to your beliefs. We might find ourselves having
different sorts of emotional responses to these different rooms.
Searle expresses his incredulity that a man who doesn't understand
Chinese, together with "bits of paper", could understand Chinese.
One might also be incredulous of the possibility that a CPU,
together with "bits of computer memory", could play grandmaster
chess, or that an empty skull, together with "bits of neural tissue",
could love its mother.
Folks like Searle suffer from a poverty of imagination, and trade
on that same poverty in their readers. And they play on our lack of
a proper ontology of the mental, together with anti-mechanistic
prejudices, to "prove" all sorts of things about "computers" and
such. But the enterprise is fundamentally dishonest; one can
employ a severely impoverished ontology of "understanding" to
"prove" that anything must lack it, from Searle's Chinese Rooms
to Searle himself (and for the latter, we have empirical evidence
to back it up).
> We cannot go beyond this to give them the "information" which would
> GIVE them the same experience, the same "sensation" that we have. It
> is a matter of faith that what I "feel" when my hand is in cold water
> is what YOU feel when YOUR hand is in cold water.
It is not a matter of faith; it is a matter of inference to the
best explanation. David Deutsch is discussed elsewhere in this
group; I highly recommend his _Fabric of Reality_, not for the physics,
but for his clear exposition of Popper's solution to "the problem
of induction".
> Knowing SOMETHING about what neural structures must be intact for
> people to "feel" (or SAY that they "feel"), we can doubt that any
> computer so far constructed "feels"; but for Searle and others to
> assert that nothing but a living brain can EVER "feel" or "be
> conscious" is in itself a leap of faith, which unfortunately is rarely
> made explicit.
Searle makes clear that he doesn't assert this. But he does repeatedly
state that he has proven that a machine that "feels" or "understands"
cannot do so by virtue of computation alone. The intellectual
poverty of the philosophical community is illustrated by the
fact that Searle isn't hounded out of the room whenever he makes
such a proof claim; just imagine if Andrew Wiles had gone around
claiming that he had proved Fermat's Last Theorem while that
proof was still unsettled. And yet philosophers seem to feel
no embarrassment about such overwhelming methodological ineptitude.
> Searle did ALLUDE to this problem of definition of
> "consciousness" early in his presentation to the ARNMD meeting last
> December, but I waited in vain for him to explicitly address this
> problem; he didn't that day.
Put Searle, von Daniken, and crop circle proponents on a panel
together, and I suspect that you would find a striking similarity
in their adherence to their beliefs and their modes of defense
of those beliefs.
--
<J Q B>