
It's primitive; it's dumb (MORE further) ( but not furthest?)

Jim Balter jqb at sandpiper.net
Sun Jul 11 06:40:06 EST 1999


Wolfgang Schwarz wrote:

> I somehow agree with you on that point. But certainly it's not that
> easy.
> Searle has some famous arguments on his side, e.g. the Chinese Room
> argument [1]:
> Briefly, imagine someone who understands no Chinese being confined in
> a room with a set of rules for systematically transforming strings of
> symbols to yield other strings of symbols. As it turns out, the input
> strings are Chinese questions, and the output strings Chinese answers.
> Nevertheless it would be odd to say that the person in the room
> understood any of the questions or answers.
> Therefore, rule-governed syntactic manipulation of symbols is not
> sufficient for understanding.
> 
> Anyway, I think the most plausible definitions of intelligence are
> functional definitions, and there is no a priori reason to doubt that
> some machine could perform the necessary functions. After all, the
> Chinese room (including the person in it) is an intelligent system,
> whatever else is missing.
> 
> As for consciousness, it seems that one just begs the question if one
> seeks a functional definition. The difference between a conscious
> system and a system that lacks consciousness is in the first place not
> that the former can perform actions which the latter can not, but that
> "there is anything it is like to be" the former system, but not the
> latter [2].

This is nonsensical blather and prejudice.  David Chalmers can do no better
in describing the difference between himself and his zombie counterpart
than to say that the zombie is "all dark inside".  Of course, being
"all dark inside" is being *like* something after all.  That Chalmers
cannot move past such metaphors to a more rigorous explanation of
what a zombie is like, without thereby having it be like something to be a
zombie, seems not to alarm him; it should alarm anyone not in the grip
of an ideology that his notion may be quite incoherent.

> This is of course far from a definition.
> 
> cu,
> 
> Wolfgang.
> 
> [1] John Searle: "Minds, Brains, and Programs", Behavioral and Brain
>     Sciences 3 (1980): 417-457
> [2] Thomas Nagel: "What is it like to be a bat?", Philosophical
>     Review 83 (1974):435-450
>     Ned Block: "On a confusion about a function of consciousness",
>     Behavioral and Brain Sciences 18 (1995):272-287
>     David Chalmers: "Facing up to the Problem of Consciousness",
>     Journal of Consciousness Studies 2 (1995):200-219

There are all sorts of arrested development.  Both Searle and Nagel have
been thoroughly refuted in the philosophical literature, yet they are
frequently cited as the final word.  The logic of Searle's Chinese Room
couldn't even pass a freshman course in logic; the proposition that
he sets out to contradict in his reductio ad absurdum -- "what ... if my
mind actually worked on the principles that [Schank's] theory says all
minds work on" -- is never even realized in the thought experiment.
In fact, Searle's argument hinges on the mind of the Searle homunculus
being explainable only within the standard folk-theoretic mental framework.
It is the room itself that operates, ex hypothesi, on Schank's principles.
Thus, when Searle insists, in his response to the Systems Reply, that the
room itself is not conscious, he has proven nothing; he is merely repeating
his prejudices.  The above-mentioned David Chalmers does a nice job of
dismissing Searle's amateurish confusion in his book _The Conscious Mind_.
Of course, it is only natural that the internal incoherence of the
doodoolists' conceptions will express itself as an external incoherence
amongst them.

--
<J Q B>




More information about the Neur-sci mailing list
