
Nonhuman empathy

Michael Edelman mje at mich.com
Fri Feb 12 09:14:23 EST 1999



Matt Jones wrote:

> [snip]

> Imagine that you're walking in the park and you come upon someone holding
> a gun on another person. You quickly reach a decision as to what's going
> on, and you would probably say that you can empathize with the person
> being held at gunpoint (they feel scared, angry, are trying to decide
> whether to run or hand over their wallet, etc...). At this point few (I
> think) people would deny that you had sufficient information to assign
> mental states. OK, then just as you're about to run and get the cops, you
> see that nearby there is a camera crew filming the scene. Now you realize
> that you've stumbled onto a movie set, and no one is in any danger, and
> maybe you feel kind of foolish for getting so excited.
>
> Here's the question: did any empathy take place? You assigned mental
> states to other people, in a way that was perfectly reasonable. You put
> yourself in their position. But it turns out you were completely wrong,
> and you were really acting on insufficient information.  But who cares?
> It would be ridiculous to come upon a scene like that and immediately
> conclude "Oh, this must be a movie set." Just like it is ridiculous to
> see your turtle jump for cover and conclude "Oh, there must be some
> explanation for that that doesn't involve being frightened or seeking
> protection from a perceived threat."
>
> I guess my point is that empathy is in the eye of the beholder. You
> assign certain mental states to your turtle because those are the states
> you think you would have if you were the turtle. That's what empathy is,
> and that's how we empathize with other human beings too. I don't really
> hold with this "anthropomorphic fallacy" idea, because it's easy to show
> that we're committing that "fallacy" even when we assign mental states to
> other humans (even the term is rife with generalizations: why isn't it a
> gyno-morphic fallacy?).
>

The difference between the human case and the animal case is that in the human
case, you may have been wrong in assigning specific states to the actors, but
you can safely assume that, being human, they share certain states with you.
Humans can be sad, happy, etc., and will show similar external correlates of
these states.

The fallacy committed by the turtle owner was not just in attributing certain
states to his turtle, but more importantly, in assuming that the turtle was
experiencing a particular state based on a behavior that would correlate with
that state in humans.

> Maybe I'm in the minority, as someone else suggested, but I think it's
> preposterous (and dangerous) to suggest that animals (dogs and cats, for
> example) don't feel pain or anger or jealousy.

I don't think anyone would deny that cats and dogs and turtles feel pain:
they certainly show aversions to stimuli we find painful, and we can probably
be safe in calling that pain.

But there are organisms with extremely simple nervous systems (a few hundred
neurons) that also show aversion to certain stimuli, and yet we'd be hard
pressed to say they have the phenomenological experience of pain.

As for something like jealousy, now you're in an area that's much harder to
support. Do cats get jealous? I'm not sure how you could divine that.

> Are we somehow chemically
> privileged that we have the machinery for those emotions, but cats don't
> (the answer is no).

You must differentiate between the machinery and the emotional state, as with
the simple organisms mentioned earlier. A jellyfish shows aversion to stimuli
associated with harm to the organism, but it's hard to imagine a jellyfish
having the phenomenological experience of pain in the same sense as a human.

> Or is our brain organized in such a way that we can
> somehow generate emotions through a feat of supreme mental computation,
> whereas creatures with less cognitive and logical horsepower can't
> (answer: no).

I think you need to provide a better argument than a flat-out assertion that
the capacity of an organism to experience complex emotional states is
independent of the complexity of the brain. Does a sea snail feel jealousy?
Does an earthworm experience existential angst?

If complexity is not an issue, what, then, is the function of the brain with
respect to consciousness? Are emotional states independent of consciousness,
or are you perhaps arguing for a hard dualism, in which consciousness is a
separate entity?

-- mike

http://www.mich.com/~mje




More information about the Neur-sci mailing list
