
Nonhuman empathy

Matt Jones jonesmat at ohsu.edu
Fri Feb 12 13:18:22 EST 1999

In article <36C4373F.A466D92D at mich.com>, Michael Edelman (mje at mich.com) wrote:
>The difference between the human and the animal case is that in the human
>case, you may have been wrong in assigning specific states to the actors,
>but you can safely assume that being human they share with you certain
>states. Humans can be sad, happy, etc. and will have similar external
>correlates to these states.

You say I can "safely assume that being human they share...certain
states", and I agree. But I would go further and say that being animals
with a lot of shared biochemistry and evolutionary history, we all share
certain states. 

>The fallacy committed by the turtle owner was not just in assuming the
>existence of certain states in his turtle, but more importantly, assuming
>that the turtle was experiencing a certain state based on a behavior that
>would correlate with that state in humans.

This is not a fallacy, any more than assuming that a human is in a
certain state based on external correlates. We don't have ANY clue as to
another human's internal state other than what we see externally.
Different humans have different external manifestations of internal
states that they would verbally identify with the same word. For example,
when "angry", some people jump up and down and scream, others merely
wrinkle their brow, and still others display few if any external cues.
These external displays can be modulated intentionally (i.e., poker
face). So it is in fact a fallacy to lump all these cues together as if
there were a simple external "anger" display in humans. There isn't. Once
we admit that we need to have some leeway in correlating human internal
states with external ones, then we must admit to a similar leeway when
translating across species. 

>> Maybe I'm in the minority, as someone else suggested, but I think it's
>> preposterous (and dangerous) to suggest that animals (dogs and cats, for
>> example) don't feel pain or anger or jealousy.
>I don't think anyone would deny that cats and dogs and turtles feel pain-
>they certainly show aversions to stimuli we find painful, and we can probably
>be safe in calling that pain.
>But there are organisms with extremely simple CNS- a few hundred neurons-
>that also show aversion to certain stimuli, and yet we'd be hard pressed to
>say they have the phenomenological experience of pain.

Yes, there is a spectrum of nervous system sophistication. A spectrum,
not discrete categories. This is a consequence of us having evolved from
simpler ancestors.

>As for something like jealousy- now you're in an area that's much harder to
>support. Do cats get jealous? I'm not sure how you could divine that.

Do people get jealous? I'm not sure how you could divine that. Oh, I
know. You look at their behavior and infer an internal state. When my cat
suddenly starts pissing on the furniture the day after I bring home my
newborn daughter, I attribute it to jealousy.  I could be wrong, but I
can also be wrong in making such attributions with other humans. So
there's nothing inherently fallacious about TRYING to make such an
attribution.

>> Are we somehow chemically
>> privileged that we have the machinery for those emotions, but cats don't
>> (the answer is no).
>You must differentiate between the machinery and the emotional state- as with
>the simple organisms mentioned earlier. A jellyfish shows aversion to stimuli
>associated with harm to the organism, but it's hard to imagine a jellyfish
>having the phenomenological experience of pain in the same sense as a human.

Well, yeah, I don't think that jellyfish have the "phenomenological
experience" (whatever that means) of pain in the same sense as a human.
But I also don't think that all humans share the same experience of pain.
People are said to have high or low pain thresholds; they respond
differently to similar stimuli. I'd say it's hard to imagine defining any
single "phenomenological experience" that applies to humans equally.

>> Or is our brain organized in such a way that we can
>> somehow generate emotions through a feat of supreme mental computation,
>> whereas creatures with less cognitive and logical horsepower can't
>> (answer: no).
>I think you need to provide a better argument than a flat-out assertion that
>the capacity of an organism to experience complex emotional states is
>independent of the complexity of the brain.  Does a sea snail feel jealousy?
>Does an earthworm experience existential angst?

First, let me reiterate that I'm not saying animals and humans experience
EXACTLY the same mental states (that would be a stupid thing to say). I
also don't think that any two humans experience EXACTLY the same internal
states (equally stupid, in my opinion). So, yes, of course there are
differences between us and them. But at the bottom of it all is plenty of
shared genetics, biochemistry, physiology and behavior. We can study
"aversion responses" in snails and in neuroscientists alike. There is a
core of shared experiences (at least at a very low level) because we are
all products of the same basic environment (i.e., carbon life forms,
convert glucose to energy, undergo sexual reproduction, the really basic
stuff is the same).

Next, I haven't assigned any levels of complexity to emotional states as
you have. In what way, exactly, is jealousy more complex than pain? In
what way is existential angst more complex than fear?  We humans can
spout a lot of complicated verbiage to describe jealousy or angst, but
that verbiage IS NOT the emotion, it is stuff that we say about the
emotion when we have time to sit down and think about it and write it all
down and publish it. While the emotion actually has hold of us, we are
not thinking "Hmm, I have this funny feeling because I realize that I'm
ultimately free to determine my own fate and the meaning of my existence.
This is angst I'm feeling." Instead, we think "Oh shit. I hope I don't
screw up."  That is, if we think in a human language at all when we
experience emotion, which is arguable.

>If complexity is not an issue, what, then, is the function of the brain with
>respect to consciousness? Are emotional states independent of consciousness, or
>are you perhaps arguing for a hard dualism, in which consciousness is a
>separate entity?

No, exactly the opposite. I'm a lumper, not a splitter. I'm arguing that
you are defining emotion too narrowly and in too anthropocentric (pardon
my gender bias) terms. I didn't say that complexity is not an issue, but
I do not think it is a hard and fast requirement either (there probably
is some minimum level of complexity required for emotional states, but
that level is far far below the complexity of the human or turtle nervous
system). It seems to me that you are pushing a dualist viewpoint if you
wish to separate human behavior from that of the rest of the animal
kingdom. Do you think that only human beings have consciousness? I don't.

This is getting lively. I'm enjoying it.

Matt Jones

