[Neuroscience] Mind out for the new battlefield

Allen L. Barker via neur-sci@net.bio.net (Allen.L.Barker at gmail.com)
Fri Nov 24 00:28:41 EST 2006



Mind out for the new battlefield
http://www.theaustralian.news.com.au/story/0,20867,20797780-12332,00.html
Jonathan D. Moreno
November 22, 2006

NEUROSCIENCE has almost surely grown faster than any other
interdisciplinary area in the past decade. The Society for
Neuroscience is host to one of the biggest science meetings in
the world, drawing about 40,000 attendees from disciplines
including neurology, psychology, computer science, radiology and
psychiatry, as well as my own field of bioethics.

My fascination with the ethics of neuroscience research is rooted
in two distinct experiences. First, I wrote a book called Undue
Risk: Secret State Experiments on Humans, about the history and
ethics of human experiments conducted for national security
purposes. My work on that book - in addition to my role as a
staff member for a presidential advisory commission on radiation
experiments on humans - sensitised me to the complex relationship
between science, ethics and national defence needs.

Understanding that science is essential for US military
superiority, every presidential administration since World War II
has provided federal largesse to support research, creating a
relationship between academe and government that has served the
country well.

Second, as I followed developments in neuroscience through the
popular press and scientific publications, I noticed that many of
the most exciting experiments were supported by national security
agencies, such as the Defense Advanced Research Projects Agency (DARPA).
Yet press coverage of neuroscience experiments usually mentions
the source of funds only in passing.

I wondered what security agencies' financial support might say
about their interest in the long-term contributions of
neuroscience to national security. Though that question seemed to
me to be the 800-pound gorilla in the neuroscience lab, to my
amazement, no one else, including the neuroscientists, appeared
to be asking it in any systematic way.

In one important sense, that is perfectly understandable.
Scientists focus on their particular research aims, not on the
long-range interests of financial supporters. What seems to an
investigator to be a very limited research question can be seen
by a security agency as part of a larger pattern. And, of course,
even scientific advances that do not stem from military-sponsored
research can be adapted for security purposes later.

Not all the neuroscientists I spoke to were enthusiastic about
discussing such issues on the record, to put it mildly. Their
reluctance only served to confirm my sense that those matters
were of more than passing interest, and led to my new book, Mind
Wars: Brain Research and National Defence.

Fortunately for my project, not all scientists or former agency
employees were unwilling to talk to me, and much information is a
matter of public record. I came to appreciate the way that DARPA,
in particular, does business, for it is a science agency, not a
spy agency, and the vast majority of its work is done in concert
with scholars. Thus, although many of the studies that raise
interesting ethical and social questions are sponsored by DARPA,
that does not imply that the agency should not be supporting
them, or that the research should not be done.

Of course scientists in any field are understandably reluctant to
make comments that could jeopardise future financial support for
their work, but there is a special sensitivity - in some cases,
almost paranoia - about the suggestion that scientific research
is leading to mind reading or
mind control. That sensitivity is partly left over from the early
days of the Cold War, when US government officials suspected that
treasonous statements by prisoners of war in North Korea were the
result of brainwashing. Twenty years later, we learned that the
CIA and the US army had themselves engaged in mental manipulation
experiments that included the use of hallucinogens such as
LSD.

People read each other's minds all the time, sometimes
unconsciously and relying on various cues such as body language,
and our minds are controlled in countless ways, from natural
stimuli such as odours to pop-up web ads. But most of us get
nervous when we imagine that some distant authority could have
access to what we like to think of as private thoughts, or that
some deliberate and fairly precise means can be used to alter our
cognition or behaviour in accord with someone else's strategic
purpose.

I don't know if any of the contemporary research projects that I
discuss in my new book qualify as mind reading or mind control,
but some of them seem pretty close. Certain brain-scanning
techniques, especially functional magnetic resonance imaging,
have stimulated a huge amount of research attempting to correlate
neural activity with specific tasks or experiences.

In one famous and contentious study, negative automatic responses
by white research subjects to photographs of black faces were
correlated with activity in the amygdala, a brain region that
processes emotional responses to stimuli. Or take the example of
"prisoner's
dilemma" experiments, in which both subjects benefit when they
co-operate with each other. When the subjects do well,
neurotransmitters activate pleasure centres in the brain. Some
neuroscientists claim that fMRI can already show when subjects
are thinking of a certain number, when they are lying, or what
their sexual orientation is, and that the technique will make
even more refined and precise analyses possible in the years ahead.
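To make the payoff structure of such experiments concrete, here is
a minimal Python sketch of a prisoner's dilemma payoff matrix. The
numbers are the conventional textbook values, not those used in any
particular neuroimaging study.

# Textbook prisoner's dilemma payoffs, given as
# (row player, column player). These values are illustrative
# defaults, not taken from any specific study.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation: both do well
    ("cooperate", "defect"):    (0, 5),  # sucker's payoff vs. temptation
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # mutual defection: both do poorly
}

for (a, b), (pa, pb) in PAYOFFS.items():
    print(f"{a:>9} / {b:<9} -> joint payoff {pa + pb}")

Mutual cooperation maximises the joint payoff even though each
player is individually tempted to defect, which is what makes the
game a useful probe of co-operative behaviour in the scanner.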

Other studies focus on replacing old-fashioned lie detectors with
systems based on neuroscience. The hope is that the new
techniques would not only be more reliable, but that they could
replace torture and other physically aggressive means of
interrogating terror suspects and enemy operatives.

It's not hard to see additional security implications of such
technical capabilities. For instance, a spy agency could measure
the neurotransmitter secretions of candidates for special
missions, to see how they react to stress. Military personnel in
information-rich environments, such as cockpits, could have their
brain functions monitored for information overload, and officers
behind the front lines could modify the flow of data accordingly,
using devices now being developed to provide real-time remote
brain imaging.

Other direct interventions to enhance soldiers' capabilities
could come in many forms, including new generations of
neuropharmaceuticals, implants, and neural stimulation. New
anti-sleep agents such as modafinil (which, under the brand name
Provigil, some students may already have discovered) are
replacing old-fashioned amphetamines among fighter pilots as well
as globe-trotting business executives. DARPA's "peak soldier
performance" program aims to improve metabolism on demand so a
soldier could operate at a high level for three to five days
without needing sleep or calories, except perhaps high-nutrition
pills.

DARPA is also interested in increasing the "bandwidth" of
soldiers' brains. One idea is to develop something called a brain
prosthesis, a chip that - if it could be made to work - would
restore mental functioning in people who have epilepsy or have
had strokes. But experts disagree about whether such a device,
intended to treat a medical condition, could also improve normal
mental functioning.

Or perhaps extra copies of genes that code for certain neural
receptor sites could be introduced into the brain to improve
learning skills; that has been done in mice, in the lab of Joe Z.
Tsien of Boston University. Electrical stimulation has been used
with some success as an adjunct to standard rehabilitation
techniques for stroke victims; could it improve cognitive
functions in healthy individuals?

Intelligence and endurance are not the only traits that make a
good soldier. Another is the ability to manage fear. In an
interesting experiment, Gleb Shumyatsky's research team at
Rutgers University in New Jersey found that mice bred to lack the
gene stathmin did not exhibit normal fear behaviour, such as
freezing in place, as often as normal mice did when exposed to
stimuli such as a mild shock.

Stathmin is expressed in the amygdala and is associated with
innate and learned fear. The mice without stathmin froze less
often because they had impaired learning capacity.

It is unlikely a particular gene in humans corresponds so
precisely to fear, given differences in the way mouse and human
genes are expressed. But given past hype about, say, the
so-called gay gene, it is easy to imagine an overly enthusiastic
official proposing to screen recruits for the "fear gene".

Where there is fear, there is often long-term trauma. Trauma
victims who had been given the beta blocker propranolol - which
is normally used to treat heart disease but which inhibits the
release of brain chemicals that consolidate long-term memories
with emotion - scored lower on a scale measuring post-traumatic
stress disorder than did members of a control group after a month
of psychological counselling. The difference was not
statistically significant, but another result is more important.
Three months later, none of the beta blocker recipients had
elevated physiological responses when asked to recall their
traumatic experiences, while 40 per cent of the control-group
members did.

Those results give hope to sufferers of post-traumatic stress
disorder. They also raise the question of whether the drug could
be given prophylactically, before a person enters what could be a
traumatising situation. How would we feel about preventing the
disorder in young soldiers going into battle: preventing a
lifetime of harrowing memories as well as the soldiers' capacity
to connect horrific experiences with negative emotions? Do we
really want guilt-free soldiers?

The hair on the back of readers' necks may be rising at the
prospects ahead of us and their ethical and legal implications.
How are we to ensure that interventions such as those to manage
guilt and fear would be confined to military operations against
truly dangerous adversaries and not more widely adopted, perhaps
by civil authorities or criminals?

The importance and difficulty of regulation become even more
apparent when we consider that many of the technologies would
be not only useful tools in military situations but also
promising advances in health care. The same sort of device that
would allow an officer to see if a pilot was receiving too much
information, say, could permit a nurse in a doctor's office to
check up on the welfare of a brain-injured patient at home. There
are also commercial possibilities, of course. For instance,
businesses are already intrigued by the possibilities of using
brain-imaging techniques to conduct market research.

Much of the history of bioethics might be read as a 40-year
conversation about the prospects for changing human nature
through startling developments in the life sciences. Bioethicists
have largely played down such concerns, noting the extent to
which we already deliberately change ourselves in all sorts of
low-tech ways, such as using sleep medication or taking French
lessons.

However, an alternative view has recently been getting more
attention. Its supporters - including Leon R. Kass, professor in
the Committee on Social Thought at the University of Chicago and
former chairman of the President's Council on Bioethics - contend
that practices such as new reproductive technologies, while
attractive and seemingly benign, have profound but unpredictable
societal implications. The debate about what types of enhancement
are permissible, given the risks, should be expanded to include
the sorts of interventions I have described, especially given the
clout of national-security funds and goals.

The defence implications of neuroscience also raise policy
questions about civil liberties, regulation and safety. We're
familiar with the role of atomic scientists in the control of
nuclear weapons, and more recently biologists have become key
players in planning for defence against bioterrorism. The same is
not yet true of neuroscientists, partly because the idea that
neuroscience could be involved in national security is only now
becoming clear, and partly because neuroscience is a complex
interdisciplinary field whose practitioners work in separate
silos. But the day is approaching when we will have to consider
those issues in a more systematic way.

Programs in neuroscience are springing up at colleges and
universities across the US. The programs should include
discussions of science policy, such as: How do the sources of
research funds affect the direction of science and social change?
Which uses of brain research are acceptable and which are not?
And what limits should society, perhaps acting through scientific
associations, place on the acceptable applications of neuroscience?

Whatever the future holds for neuroscience, it would be naive to
suppose that national-security organisations are not monitoring
developments in that field as they do in any other. We need a
public conversation about the role of brain research in defence.

The Chronicle of Higher Education

Jonathan D. Moreno is a professor of biomedical ethics and of
medical education and director of the Centre for Biomedical
Ethics at the University of Virginia. His latest book, Mind Wars:
Brain Research and National Defence, is being published this
month by Dana Press.



