In article <B16B7FDB96681170D3@0.0.0.0> tonmaas@xs4all.nl (Ton Maas) writes:
>In article <6i21sj$2j$1@news.ox.ac.uk>,
>patrick@gryphon.psych.ox.ac.uk (Patrick Juola) wrote:
>>Oh, I'm disagreeing, vehemently. We *have* evolving computers, we
>>*have* computers (or at least multi-cpu networks) with greater storage
>>capacity and comparable complexity to the human brain, and that's
>>not produced anything that appears in any way conscious. The
>>argument that "evolving systems," "tinker-able systems" or any
>>such is just another attempt to insert a ghost-in-the-machine through
>>a process that the writer doesn't understand, even though many other
>>scientists may understand quite well.
>Well, I'm not convinced we have "evolving" computers which comply with
>Varela & Maturana's definition of autopoiesis. After all, in man-made,
>digital computers the distinction between hardware and software is clean
>cut, but in biological organisms this is a much more complex matter. Modern
>day emphasis on genes as informational carriers of the "blueprint" of
>organisms, for instance, is still largely unaware of the implications of
>research done by D'Arcy Thompson (who showed that evolution is actually
>saturated with patterns that are highly constant between species - such as
>a nose above a mouth and an eye and an ear on either side - and which are
>analogously coded rather than digitally) and the late William Bateson (who
>removed limbs from embryos and grafted them back on in opposite positions,
>where they grew into their "proper" shape), which have rather far-reaching
>implications for the relation between genes and actual organisms. IMHO we
>are approaching the limits of an "old" paradigm, with its notion of
>information as storeable and quantifiable rather than defined in-process
>(as "a difference which makes a difference").
I'm afraid, Mr. Maas, that you're at least twenty years behind the
coal face. The distinction between software and hardware isn't particularly
clean-cut and hasn't been since the first emulator was written in the
Sixties; most modern computers are now designed first in simulation and
built only when many designs have been tested and abandoned in software
alone. Furthermore, even the simulation itself usually relies heavily
on randomized tinkering -- check out the literature on "superoptimization"
in GCC as a particularly understandable example. In these applications,
the computer was simply asked to run random bits of machine code, and
the shortest/fastest that performed particular (useful) operations were
"remembered" by the superoptimizer and incorporated into later
compilations. (Rather like on the old x86s, where the fastest way to
clear a register was to XOR it with itself rather than MOV a literal
zero into it -- x86 has no dedicated CLR instruction.)
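For the curious, here's a minimal sketch of the idea in Python. The
toy instruction set and single-register machine are invented purely
for illustration -- real superoptimizers (Massalin's original, GNU
superopt) search actual machine instruction sets -- but the
search-and-remember loop is the same: try short instruction
sequences, keep the cheapest one that passes every test.

import itertools
import random

# Toy machine: one 8-bit register, transformed by tiny "instructions".
INSTRUCTIONS = {
    "CLR":      lambda r: 0,
    "INC":      lambda r: (r + 1) & 0xFF,
    "XOR_SELF": lambda r: r ^ r,
    "SHL":      lambda r: (r << 1) & 0xFF,
}

def run(program, reg):
    # Execute a sequence of instruction names on an initial value.
    for op in program:
        reg = INSTRUCTIONS[op](reg)
    return reg

def superoptimize(target, tests, max_len=3):
    # Search programs in order of increasing length, so the first
    # match found is also the shortest -- the one that gets
    # "remembered" for later compilations.
    for length in range(1, max_len + 1):
        for program in itertools.product(INSTRUCTIONS, repeat=length):
            if all(run(program, r) == target(r) for r in tests):
                return program
    return None

# Find the shortest way to zero the register.  Both CLR and XOR_SELF
# qualify; a cost model that charged CLR more (as encoding a literal
# zero costs more on x86) would make the XOR trick the clear winner.
tests = [random.randrange(256) for _ in range(16)]
print(superoptimize(lambda r: 0, tests))

Note that, like this sketch, the real superoptimizers test candidates
against sample inputs and then verify the survivors, since proving
every random candidate correct up front would be far too slow.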
Similarly, the role of non-genetic channels of information transmission
is a major area of research, although it hasn't gotten the funding or
press of the Human Genome Project for rather obvious political reasons
-- the HGP being the 90's equivalent of the Apollo project.
But this is largely arguing over irrelevancies. The *relevant* question,
which still hasn't been addressed, is why the origin of an organism
should be relevant to its consciousness -- and in particular why a
"tinkerable" organism should be conscious when an exact-duplicate-by-design
isn't. At which point, "ghost in the machine" is the kindest way I can
put it.
-kitten