F. Frank LeFever <flefever at ix.netcom.com> wrote in message
news:7mq3vk$l6q at dfw-ixnews10.ix.netcom.com...
> Here's that broken record again--me, asking for definitions!
> "Intentionality" = ??
Please, not that. Anything but that. Oh please. Please.
I don't think I can put it into words. Webster defines "intentional" as
"b : having external reference." I'll go with that for now.
> My off-the-top-of-my-head response: very early
> in my grad studies, there was an ambitious attempt to deal with what
> PERHAPS you "intend" here, by--Pribram, Galanter, and Miller?
> Ironically, (if I recall correctly) they were trying to account for
> "purposeful" behavior in living organisms (without invoking
> teleological explanations of "goal" seeking) in terms of concepts from
> cybernetics... That is, using the "intentionality" of computers with
> real-world interfaces as a model for "intentionality" of living
> organisms.
> (n.b.: I am leaping to the conclusion that you mean all kinds of
> goal-directed behavior, not just the intent to convey meaning)
um... I guess.
Does the system of a thermostat and heater satisfy your definition of
"goal-directed behavior," such that it can intend to keep the temperature
in a given range?
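To make the question concrete: the thermostat-and-heater system is just a negative-feedback loop, and it is easy to state mechanically. A minimal sketch (the numbers, function names, and heating/cooling dynamics here are all made up for illustration):

```python
# Hypothetical sketch of a thermostat-and-heater feedback loop. The
# system "seeks" the setpoint only in the cybernetic sense: it reacts
# to deviations, with no inner representation of a goal.

def run_thermostat(temp, setpoint=20.0, band=1.0, steps=50):
    """Simulate a room whose heater switches on below the dead band
    and off above it; returns the temperature trace."""
    heater_on = False
    history = []
    for _ in range(steps):
        if temp < setpoint - band:
            heater_on = True       # too cold: switch heater on
        elif temp > setpoint + band:
            heater_on = False      # too warm: switch heater off
        # crude dynamics: heater warms the room, otherwise it cools
        temp += 0.5 if heater_on else -0.3
        history.append(temp)
    return history

trace = run_thermostat(temp=10.0)
```

Run long enough, the trace rises to the setpoint and then oscillates inside the dead band -- "goal-directed" behavior, if that definition is satisfied by pure feedback.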
I'll beg the question.
Consider the brain in a vat. Suppose it sends signals to the simulation of
a body that we can interpret as saying, "God, I could really go for a ham
sandwich right now." Even though we can interpret it that way, in reality
it doesn't want a ham sandwich, but rather to be stimulated in a way
consistent with having a ham sandwich, if it had the body we are simulating.
As I said, this begs the question, since it assumes a brain in a vat can
still have intentions and desires.