Reasoning v. other AI stuff
L.Fine at lycos.co.uk
Sun May 4 00:33:08 EST 2003
erayo at bilkent.edu.tr (Eray Ozkural exa) wrote in message news:<fa69ae35.0305031218.21ecd970 at posting.google.com>...
> L.Fine at lycos.co.uk (Acme Debugging) wrote in message news:<35fae540.0305011251.188e4c3a at posting.google.com>...
[restoring edited content]
>>>> Thanks for the references. Actually, I'm not looking for any
>>>> system, just saying that logic used in most real-world reasoning
>>>> needs to incorporate probability. I think different logic(s) will
>>>> be needed for different types of questions. I would favor just a few
>>>> simple probability calculations in a first-step reasoner, though it
>>>> might run out of applications pretty fast.
>>>Note that multi-valued logic is equivalent in power to classical
>>>logic. That _naturally_ includes fuzzy logic.
>>Understand this part.
>>>So the answer to your
>>>question must necessarily lie elsewhere....
>> Don't understand this part. Could you repeat the question?
> I think incorporating probabilities into a reasoner is not sufficient
> by itself,
"Sufficient" to what? My narrow problem definition or ... the
provocative statement I made in my opening post about waiting for the
reasoner to answer our AI questions (which I've redundantly stated was
not serious) ... or the definition you seemed to make earlier in the
thread of answering any possible question?
You seem to have inferred a question I didn't ask in the earlier post,
and now make this ambiguous comment. Are you poking fun at good ol'
simple-minded Larry? If so, it is well-deserved, as I have poked a
little fun at you too.
I've avoided much investment in the first-step reasoner idea, and
avoided specifics purposely to promote open-ended criticism. The
multi-valued/fuzzy/qualitative, etc. logics, and whatever my schemes
are insufficient for, may yet have relevance to my narrow problem
definition. Perhaps you would glance over my post just sent to George
Dance if you have the time and inclination, as I would value your
opinion.
> but I don't know what the "ultimate reasoner" is :) It's
> probably "common-sense reasoner"!
Of course you know I never set out to produce an "ultimate reasoner."
If I had, it would go far beyond "common sense." But I think you may
again be poking a little fun.