Jure Sah wrote in message <38358DE4.614D2429 at guest.arnes.si>...
>Completely agree with you!
>However, those zillions of operations seem just a little too much.
>Let me tell you how many operations a computer makes while it
>thinks of, say, a number.
>And I think the DATA problem is already solved; it's just that
>nobody "wants" to use it.
>Let's say you've got a picture (a bitmap): can you change that
>picture into a sound directly? Well, can you? Don't you think those
>big brains of yours can handle such an easy problem? Well, a
>computer can turn it immediately! It just needs one instruction:
>Rename Pic.BMP to Pic.Wav (this tactic reuses the raw data shared
>by both formats).
>(A human) would say that the noise has no connection to the
>picture, but a computer would have no problem changing it back into
>a picture again...
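The rename trick the quoted post describes can be sketched in a few
lines of Python: take raw bytes (here a made-up stand-in for the
pixel data of a hypothetical Pic.BMP, since no real file is given),
wrap them in a WAV header, and read them back out unchanged. This is
only an illustration of reinterpreting raw data, not a claim about
how any particular viewer or player would handle it.

```python
import wave

# Hypothetical pixel data standing in for the contents of Pic.BMP
# (in the real trick you would read the bytes from the file itself).
pixel_bytes = bytes(range(256)) * 4

# Wrap the raw bytes in a minimal WAV container: 8-bit mono, 8 kHz.
with wave.open("Pic.wav", "wb") as w:
    w.setnchannels(1)      # mono
    w.setsampwidth(1)      # 8-bit samples: one byte per "pixel"
    w.setframerate(8000)   # arbitrary playback rate
    w.writeframes(pixel_bytes)

# Reading the frames back recovers the identical bytes -- the
# "picture" survives the round trip through the audio container.
with wave.open("Pic.wav", "rb") as w:
    recovered = w.readframes(w.getnframes())

print(recovered == pixel_bytes)  # → True
```

Played back, those bytes are just noise to a human ear, but nothing
about the data itself was lost, which is the quoted post's point.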
Computers are really fantastic machines. I have four of them and,
except for weekends, I spend more time with them than with "humans".
It's almost unbelievable how many operations they execute in the
seconds it takes us to type a simple text such as this one. They
are fast and precise.
Unfortunately, when it's time to live in an uncertain and vague
world such as ours, that precision works against computers.
Digitized photographs of two different apples may have very few
points in common. Even the most sophisticated computer vision
programs of today can recognize these apples with 100% accuracy
only under very special conditions (background, illumination,
etc.). However, any 3-year-old child can identify the apples in
less than a second, even in poor light or with partially occluded
images. Perhaps the greatest flaw of today's computers is just
that: they are too precise.
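A toy example makes the "very few points in common" claim concrete.
The synthetic brightness pattern below is an assumption of mine,
standing in for two photographs of the same scene where the second
is shifted right by a single pixel; an exact, pixel-by-pixel
comparison (all a "too precise" matcher can do) finds almost
nothing shared.

```python
# Two "photographs" of the same scene; the second is shifted right
# by one pixel -- imperceptible to a person, fatal to exact matching.
WIDTH, HEIGHT = 32, 32

def brightness(x, y):
    # Deterministic pattern standing in for real image content.
    return (3 * x + 7 * y) % 256

image_a = [[brightness(x, y) for x in range(WIDTH)]
           for y in range(HEIGHT)]
image_b = [[brightness(x - 1, y) for x in range(WIDTH)]
           for y in range(HEIGHT)]

# Count pixels that agree exactly between the two images.
matches = sum(image_a[y][x] == image_b[y][x]
              for y in range(HEIGHT) for x in range(WIDTH))
print(f"identical pixels: {matches} of {WIDTH * HEIGHT}")
```

With this pattern not a single pixel matches exactly, even though a
child would call the two images the same picture.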