
machine brains

Malcolm McMahon malcolm at pigsty.demon.co.uk
Sun Mar 7 18:15:26 EST 1999


On Sun, 07 Mar 1999 16:01:57 GMT, ZZZghull at stny.lrun.com (Jerry Hull)
wrote:

>On Sun, 07 Mar 1999 11:21:39 GMT, malcolm at pigsty.demon.co.uk (Malcolm McMahon)
>wrote:
>
>>
>>Oh, I can't agree with that. There's at least as much synthesis as
>>analysis going on in the brain. Complex ideas are made of simpler ideas.
>>When we examine the scene before our eyes what are we doing? We're
>>recognising elements of that scene and composing a model of our
>>surroundings in terms of objects we're familiar with. Exactly the same
>>with hearing speech or reading a paragraph. Perhaps the "elementary
>>particle" of thought is the firing of a neuron.
>
>I'm using a technical meaning of 'analytic' and 'synthetic' that you may be
>unfamiliar with.  The relation between a term and its definition is a logical,
>ANALYTIC relation; the relation between a cause and its effect is NOT ANALYTIC
>(fundamentally, for reasons detailed by Hume), but empirical, contingent,
>SYNTHETIC.
>

I'm using analytical and synthetic in what seem to me their common-sense
meanings. Analysis is investigating the components of a thing; synthesis
is constructing something from components.

>The only sense in which one thought can be "composed" of another thought, is
>to find that other thought hidden in it.  This kind of composition was
>distinguished by Kant as "analytic", as opposed to "synthetic".
>

No, I'm not talking about thoughts "hidden" in other thoughts. Consider a
plan as a particular class of idea. A complex plan will usually involve
sub-goals, with plans to achieve them. The plan, in effect, is
hierarchically constructed from sub-plans. I think all ideas are like
that, at least to some degree.
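The hierarchical picture above can be sketched as a toy data structure. This is purely illustrative; the `Plan` class and its names are hypothetical, not from any real planning system:

```python
# A toy sketch of a plan composed hierarchically of sub-plans.
# Names here (Plan, goal, subplans) are illustrative only.
class Plan:
    def __init__(self, goal, subplans=None):
        self.goal = goal
        self.subplans = subplans or []

    def flatten(self):
        """List every goal in the hierarchy, sub-goals before parents."""
        goals = []
        for sub in self.subplans:
            goals.extend(sub.flatten())
        goals.append(self.goal)
        return goals

trip = Plan("visit conference", [
    Plan("book travel", [Plan("buy ticket"), Plan("reserve hotel")]),
    Plan("prepare talk"),
])
print(trip.flatten())
# ['buy ticket', 'reserve hotel', 'book travel', 'prepare talk', 'visit conference']
```

The point of the sketch is only that the complex plan is nothing over and above its sub-plans plus the way they are composed.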

>
>>>Consciousness pretty much DEFINES what 'mental' is customarily taken to mean.
>>>That is why it is so perverse to suggest that the two should be separate.
>>
>>No, I don't agree that mental processing is either necessary or
>>sufficient for consciousness to be present. A computer vision system or
>>an expert system, I would argue, perform functions we would characterise
>>as mental if an animal was doing them, yet we don't imagine that such
>>systems have consciousness.
>
>You are fighting with your own idiosyncrasies of language.  I know of no one
>who regards "mental" as the name of a certain kind of processing, as OPPOSED
>to consciousness.  Since you are using words in a peculiar way, it's hard to
>know exactly what you believe.  I AGREE that there can be isomorphisms between
>what a computer algorithm is doing and what the brain is doing, when it e.g.
>processes vision.  But such algorithms are certainly not part of the MEANING
>of 'mental' or 'vision' for me and I suspect most speakers of the English
>language.  You are using the word 'mental' to describe things that are
>conventionally understood to be 'PHYSICAL', i.e., what is going on in the
>brain, &c.
>

It seems to me that the distinction between mental and brain functions
is too sharply drawn here, and that mental process is just another way of
looking at brain function. It still seems to me that you're calling the
same processes mental when they are observed by consciousness that you
would not call mental when they weren't.

>>Because thought is what we (and especially philosophers) think we excel
>>at, we tend to exaggerate its significance. We'd like to believe that
>>consciousness in things exists in direct proportion to their
>>resemblance to us.
>
>Who believes that?  & if so, so what?
>

I feel that that belief lies, to a degree, behind the belief that
consciousness is just meta-logic or some kind of special reasoning
process.

>>>>But a particular mind is dictated by (or instantiated as) more than the
>>>>electrical state of the brain. It's written into the actual
>>>>microstructure. To make a new instance of a mind you'd have to actually
>>>>physically duplicate, or fully model, the physical brain (or at least
>>>>crucial parts of it).
>>>
>>>When you find out how this is done, please let the rest of us know.
>>
>>Oh, it's easy enough to see how it might be done in principle. It's just
>>beyond our present technology. I'd give it 50-100 years.
>>
>>You'd inject vast numbers of nano-machines into the cerebrospinal
>>fluid. These nanites would spread throughout the CNS, following and
>>tagging neural processes, examining the state of pre-synaptic membranes
>>in order to measure the "gain" on the different synapses. Each
>>nano-machine would record a few neurons in detail, together with
>>information about other nano-machines and their tags encountered. When
>>they'd been in for long enough you'd "lure" them out of the body with
>>some kind of marker chemical. Between them they'd carry an almost
>>complete picture of the engram.
>
>You assume you know how the brain stores information; I believe this has not
>yet been settled.

It's not settled, but I think that the systematic alteration of
properties of the pre-synaptic vesicles and membrane, together with the
actual rewiring of neural processes, is a pretty convincing, and widely
popular, candidate. Such changes have only actually been measured in very
simple animals but, as far as I know, everything tested so far is
consistent with that picture.
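The idea that altering synaptic "gains" can store information is the same one behind artificial neural networks. Here is a minimal Hebbian-style sketch, under the simplifying assumption that a synapse is just a single weight that strengthens when pre- and post-synaptic activity coincide (real synaptic plasticity is far richer than this):

```python
# Minimal Hebbian sketch: coincident activity strengthens the "gain"
# (weight) of the synapse between two units, and the resulting weight
# matrix can later complete a partial pattern. Illustrative only.
import numpy as np

n = 4
weights = np.zeros((n, n))                 # synaptic "gains" between n units
pattern = np.array([1.0, 0.0, 1.0, 0.0])   # activity pattern to store

# Hebbian update: dw[i, j] = eta * post[i] * pre[j]
eta = 0.5
weights += eta * np.outer(pattern, pattern)
np.fill_diagonal(weights, 0.0)             # no self-connections

# Recall: present a partial cue; the stored weights fill in the rest.
cue = np.array([1.0, 0.0, 0.0, 0.0])
recalled = ((weights @ cue + cue) > 0).astype(float)
print(recalled)  # [1. 0. 1. 0.] -- the full stored pattern
```

The cue activates only one unit, yet the full pattern comes back, because the information lives in the pattern of connection strengths rather than in any momentary electrical state, which is the point at issue in the surrounding discussion.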

BTW, I read only this week that, contrary to previous belief, whole new
neurons do grow in mature central nervous systems (at least in rats),
though most of them die almost immediately. But then most of the nerve
cells that form during morphogenesis die. It's as if the CNS has its own
private evolutionary process going on. The point, though, is that this
makes the picture of a nervous system storing information essentially by
physically adding new circuits even more credible.



>  Certainly there must be SOMETHING in the brain, &c. that
>contains ALL the information involved in any given thought, but since we have
>not yet nailed that down, it's hardly simply a TECHNICAL problem.
>

If you can actually copy the brain then you can preserve these thoughts
and memories without having the least idea how they are organised.

>>We already model very simple neural networks on ordinary computers or
>>specialised hardware.
>
>But we don't know how well NNs emulate real neurons, or whether they capture
>ALL neural behavior relevant to consciousness.  The problem is sooo easy when
>you just assume it has been solved.

No, we don't know for sure. There are the heterodox ideas of Penrose, for
example, that some kind of quantum process involving individual
electrons may be going on. It's even possible you might copy a brain and
have it apparently fully functional but consciousness would not be
present, though perception, reaction and even ideation might go on as
normal.



