Ray Scanlon wrote:
> Michael Edelman wrote in message <36C9BFFF.50AF3A56 at mich.com>...
> >Ray Scanlon wrote:
> >> What do we wish to simulate and at what level? Are we talking, for
> >> example, of the brain of a rat? This is a mammalian brain; it has all the
> >> parts of a man's brain, just not so many neurons. What is the unit of our
> >> simulation? The neuron?
> >> The neuron is a large assortment of molecules, I have no idea how many. I
> >> would be happy if someone would give me a guess or a citation as to how
> >> many are in an average neuron. Some would say that until we know the story
> >> of every molecule we cannot simulate a neuron. This is hogwash.
> >We can simulate a neuron, but we need to know more about how neurons
> >interconnect. At one time we thought all neuronal connections were
> >axon-to-dendrite. Then we found axo-axonal, electrical, even other kinds of
> >connections. Without an understanding of all those mechanisms we can't
> >completely model a single neuron.
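As a toy illustration of why those extra connection types matter, here is a minimal leaky integrate-and-fire sketch (entirely illustrative; the model and every parameter are my assumptions, not anyone's published neuron model) in which chemical synapses inject current only on presynaptic spikes, while an electrical gap junction continuously pulls the membrane toward a neighbor's potential. A model with only the chemical term behaves measurably differently from one with both.

```python
# Toy leaky integrate-and-fire neuron with two input mechanisms:
# chemical synapses (discrete drive on presynaptic spikes) and an
# electrical gap junction (continuous coupling to a neighbor's voltage).
# All parameters are illustrative assumptions, not measured values.

def step(v, spikes_in, v_neighbor, dt=1.0,
         tau=20.0, w_chem=2.0, g_gap=0.05, v_thresh=15.0):
    """Advance membrane potential v by one time step; return (v, fired)."""
    leak = -v / tau                    # passive decay toward rest (0 mV)
    chem = w_chem * spikes_in          # chemical synaptic drive
    gap = g_gap * (v_neighbor - v)     # gap junction pulls toward neighbor
    v = v + dt * (leak + chem + gap)
    if v >= v_thresh:                  # threshold crossing -> spike
        return 0.0, True               # reset after firing
    return v, False

# Drive the cell with 3 presynaptic spikes per step while a depolarized
# neighbor (v = 10 mV) also pulls it up through the gap junction.
v = 0.0
for t in range(10):
    v, fired = step(v, spikes_in=3, v_neighbor=10.0)
    if fired:
        print(f"spike at step {t}")
```

Leaving out the `gap` term (or the hormonal and ephaptic effects mentioned below, which this sketch ignores entirely) changes the firing pattern, which is the point: an incomplete list of mechanisms gives an incomplete model.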
> Yes! And you can add ephaptic and hormonal effects. I say we must work with
> what we have. If you insist on the life history of every molecule you are a
> latter-day Luddite.
That's rather a straw man argument. I do not insist on the role of every
molecule, but I do insist on a model that takes into account the function of
every known mechanism.
Earlier I cited the failure of artificial hearts built on models of the heart
as a pump, models that failed to take into account the feedback mechanisms that
regulate heart function. An incomplete model of the brain will not yield a
complete model of its function.
Your model may work for the mechanistic model of mind you propose, one that has
no place for consciousness, but it may not be complete enough to model aspects
of mind that many of us think are central to the brain's purpose.
> >There's a danger here in defining the question down to the point where it's
> >simple, but uninteresting. Supposing you assemble 200 million of your
> >neurons into a mass and start generating S-R pairs. Do you have a mind, or
> >an interesting programable array?
> You misunderstand me. First we are to understand how the brain works, then
> we are to speak of designing a thinking machine (if we wish). Note that we
> talk only of design, no one should be so foolish as to think of actually
> implementing such a machine.
Why not? You've got perhaps 2x10^11 cells in the brain, give or take a factor
of 2x10^2, which is certainly not a number inconceivable to realize in
hardware, given that today's chips have something like 6x10^6 discrete
devices, and fine-grained processors are being designed with 10^4 or 10^5
processing elements, each of which can model another 10^6 or more virtual
elements. So we're not that far off.
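That arithmetic can be checked in a few lines (a sketch using only the rough order-of-magnitude figures quoted above; none of these numbers are measurements, and it assumes one virtual element suffices per brain cell):

```python
# Back-of-envelope check of the scale argument, using the rough
# figures from the text (all assumptions, not measurements).
brain_cells = 2e11            # ~2x10^11 cells in the brain

elements_per_processor = 1e4  # fine-grained processor: ~10^4 elements
virtual_per_element = 1e6     # each element models ~10^6 virtual elements

# Virtual elements one such processor could model, and how many
# processors it would take to cover every cell in the brain.
virtual_per_processor = elements_per_processor * virtual_per_element
processors_needed = brain_cells / virtual_per_processor

print(f"virtual elements per processor: {virtual_per_processor:.0e}")
print(f"processors needed: {processors_needed:.0f}")
```

On these figures a single processor covers 10^10 virtual elements, so on the order of tens of such processors would suffice, which is the sense in which the count itself is not the obstacle.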
What I'm objecting to here is your conception of the brain as a device with a
very predictable, top-down sort of structure. Of course my central issue here
is your rejection of mind, putting you solidly in the behaviorist/positivist
school. You're seeking to build (metaphorically, lest you think I'm tying this
to hardware) a brain that to me is just an automaton, with a rather large parts
count. What do you hope to explain with such a model? What is the purpose of
your model, and how does it differ from an ordinary computer, apart from the
parts count?
> >It's an incomplete picture. There's a lot more going on than simple
> >axonal/dendritic transmission.
> Lord, yes! But not so much as to justify a throwing up of hands and a
> retreat to religion.
We seem to have an irreconcilable division here. Every time I argue for the
brain as something more than a deterministic automaton, or argue for the
consideration of commonly accepted phenomena such as self-awareness, you accuse
me of being a Luddite or of making an appeal to metaphysics.
That's silly. We're all self-aware. You aren't an automaton. Who am I debating
with? What are dreams?
> Explanations of the soul (mind) belong to religion. I say that theology
> should be put to one side until the brain is explained. When this is done
> one may turn to religion and talk about the soul (mind).
To equate soul and mind is to claim questions about mind are metaphysical ones,
but they're not. One can investigate the nature of consciousness through
controlled and repeatable experiments. You can't do that with the soul. *That's*
metaphysics. Unless you accept the reality and reliability of self-report, you
can't even do the kind of research program you're talking about.
What's your data? Suppose you've got a ton of really good single-unit records
for every neuron in the brain, and you can trace activity all the way from the
retina to the cerebral cortex and every path along the way. What have you got?
Nothing, unless there's a correlation to subjective experience. You have a
machine, and machines, as far as we know, don't ask questions about the nature
of other machines. And that means you're not a machine, either.
> >> The brain
> >> is simple and its activities are simple. We are overwhelmed by the number
> >> of neurons, by the number of nuclei, by the number of tracts, but if we
> >> lump the trees together we shall see the forest.
> >Of course, I think the forest is Mind.
> Good enough! But save it until we get the brain explained.
You'll never explain the brain without explaining mind. Can you describe the
function of a computer in the absence of any software?
Here's a little gedanken experiment: Suppose no software exists, but computers
do. (Never mind why this would be the case). We have all these computers, and
they're all fully described, with every possible state mapped and documented.
Have you explained the computer? No.
If you could come up with a complete diagram of one brain, and a table of all
possible states of that brain, would you have an explanation of that brain? No.
Michael Edelman http://www.mich.com/~mje
Telescope guide: http://www.mich.com/~mje/scope.html
Folding Kayaks: http://www.mich.com/~mje/kayak.html