In <6hrkup$qrm@ux.cs.niu.edu>, rickert@cs.niu.edu (Neil Rickert) writes:
>modlin@concentric.net writes:
>>In <6hqseq$q8n@ux.cs.niu.edu>, rickert@cs.niu.edu (Neil Rickert) writes:
>>>>[modlin] What is your definition of computation?
>>>A computation is a set of causal operations which take place in the
>>>world, and which have a certain kind of mathematical description.
>>>>[modlin] Under your rules, is a Turing machine capable of computation?
>>>No. It is capable of formal computation, but not of computation,
>>>where formal computation is a mathematical idealization of
>>>computation.
>>Interesting. Your conflation of "computation" with notions of
>>physically realized causality is something I've not encountered
>>before... none of the classic works on computability uses it that way,
>>and indeed I can't think of a single author who would balk at saying
>>that a Turing machine computes.
>I don't think I am conflating anything. Many big corporations have
>been purchasing expensive computers for decades, because of the causal
>operations that they perform.
(Before responding, let me offer an apology for saying I wouldn't talk
to you about this any more. I was frustrated. But now after a few
hours' sleep I wish I hadn't said it. <g>)
Anyway. The "causal operations" which make computers worth money to
a business are outside the computers themselves. A corporation would
not care if the internal mechanisms of the computer used entirely
different sets of causal operations to accomplish its computations, or
even if it worked by some mystical acausal magic... so long as the
external results have a proper functional relationship to things of
interest to them.
That's where I see your words as conflating ideas which would be more
usefully separated: you seem to me to be trying to use the external
mapping of computational results to motivate arguments about the
internal mechanisms of computation. The two are related only
indirectly, through practical engineering issues of performance.
>I would use 'computability' in the same way as the works you refer
>to. You have to remember that mathematics ain't real life.
>Mathematicians work with idealized models of real life. The Turing
>machine is an idealized model of computation. The mathematical
>theory of computation is a theory of this idealized model, just as
>the mathematics of the real numbers is about an idealization of the
>decimal measurements we make.
>In a context of talking about the mathematical theory of computation,
>I would also have no problem with saying that a Turing machine
>computes. But in that case the context reminds us that we are
>actually talking about a mathematical idealization of computation.
But when I as a programmer try to design a procedure to accomplish some
useful computation, I'm looking at it as a mathematical idealization. I'm
certainly not concerned with the causal details of NAND gates etched in
silicon. I work in terms of an idealized machine, the abstract virtual
"computing device" defined by a language such as C, and I seldom care
how that abstraction might be physically realized.
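For instance, a trivial routine like the one below (a made-up
illustration of my own, not anything from our earlier exchange) is
specified entirely by the C language's abstract semantics; nothing in
it refers to gates, voltages, or any other physical realization:

    /* Sum the first n natural numbers. What this computes is fixed
       by the C language definition, not by any particular piece of
       hardware that might happen to run it. */
    unsigned long sum_to(unsigned long n)
    {
        unsigned long total = 0;
        unsigned long i;

        for (i = 1; i <= n; i++)
            total += i;
        return total;
    }

Whether that loop runs on silicon, on relays, or is traced by hand with
pencil and paper makes no difference to what it computes.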
The normal working environment for anyone concerned with computation is
an "idealized model of real life", linked only at its periphery to
anything actually real. The word "computation" is never used in the
sense you are contrasting with the mathematical one, in reference to
some process intimately wedded to its particular physical realization.
Which is what I meant by the following:
>> Computation is an abstraction,
>>inherently distinct from the engineering practicalities of a device
>>which might instantiate it.
You reacted to the above statement by saying it is incompatible with
the notion that cognition is computation, and by doing some handwaving
about its implying a homunculus or substance dualism. I'll try to
respond to those remarks in a separate posting. For the moment, I'm
more interested in whether or not we can get past our disagreement about
the relationship between computation and causality, without worrying
about its implications for cognition.
Your turn.
Bill Modlin