
Toward a Science of Consciousness 1998

Mdg Mdg at nospam.com
Sat Apr 25 06:17:44 EST 1998


Wim Van Dijck wrote in message <3541fe96.1927548 at news.rug.ac.be>...
>On 23 Apr 1998 17:00:42 -0500, rickert at cs.niu.edu (Neil Rickert)
>wrote:
>
>>modlin at concentric.net writes:
>>>In <353EFFF7.857AE508 at linkserve.com.ng>, Lyle Bateman <lbateman at linkserve.com.ng> writes:
>>
>>>>It's a matter of a lot more than just programming, I'd have to say.
>>>>Hardware design is critical here.  With the type of architecture
>>>>currently popular in the computer industry, consciousness will never
>>>>happen.
>>
>>I have to agree with Lyle here.
>Me too
>
>>>In the sense that you seem to mean it, your statement that hardware
>>>design is critical is wrong.
>>
>>And thus I disagree with Bill.
>Indeed
>
>>>Hardware design is important in a lot of practical ways.  A design must
>>>provide devices and channels for information to come into the system and
>>>out of it... sensors and effectors, in biological or robotic terms.
>>>Hardware design also determines how fast computations can proceed, and
>>>how much information can be stored and manipulated... all very important
>>>to the practicality of solving any particular computational problem.
>>
>>>But hardware design has absolutely nothing to do with the kinds of
>>>things that can be computed,
>>
>>To the extent that that is true, computation is irrelevant to
>>cognition.
>
>I once heard a quite strong argument during some introductory AI
>classes: computer hardware (neural nets not included) works with
>algorithms. Conscious minds, such as ours, use procedures (or whatever
>you want to call them) that are not algorithm-based. Computers CAN only
>use algorithms (at least nowadays), so based on this principle, a
>computer will never gain consciousness, no matter how big or fast it
>is.
>


What about simulation?  Using algorithms you can simulate the behavior of
the human brain, or of a neural net.  The sophistication is an emergent
quality, not apparent at the finer scale of processing.  The processing
involved would be immense, and the simulating system would have to be far
more sophisticated than the one being simulated (modeled? emulated?) to be
effective.  I once heard the phrase "any computer can emulate any other
computer", and I suspect it's true.  Once again, though, some emulations
would require far more processing power than the original computer being
emulated.
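
For what it's worth, here is a rough sketch of what "simulating a neural
net with ordinary algorithms" can look like in practice.  It's a toy leaky
integrate-and-fire model I made up purely for illustration (the unit
count, the constants, and the random input are all arbitrary, not taken
from any real brain); the point is only that a plain sequential loop on
conventional hardware can step neural-style dynamics forward in time.

    # Toy sketch: three leaky integrate-and-fire units driven by random
    # input, advanced one time step at a time by an ordinary algorithm.
    import random

    NUM_UNITS = 3
    DT = 1.0           # time step (arbitrary units)
    TAU = 10.0         # membrane time constant
    THRESHOLD = 1.0    # firing threshold
    STEPS = 100

    voltages = [0.0] * NUM_UNITS

    for step in range(STEPS):
        for i in range(NUM_UNITS):
            # leak toward zero, plus a bit of random input current
            input_current = random.uniform(0.0, 0.15)
            voltages[i] += DT * (-voltages[i] / TAU + input_current)
            if voltages[i] >= THRESHOLD:
                print(f"t={step * DT:5.1f}  unit {i} fires")
                voltages[i] = 0.0   # reset after a spike

Scaling a loop like that up to billions of units is a question of speed
and storage, not of what the algorithm can in principle compute, which is
exactly the distinction Bill drew above.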

This leads me to a question I've always wondered about: suppose we reach
the point where we can map a human brain in fine enough detail to
simulate its behavior on a computer.  Will the simulation be conscious?

Mdg




