Donchin's ERP work in the Times

backon at vms.huji.ac.il backon at vms.huji.ac.il
Tue Feb 23 14:13:13 EST 1993


In article <1993Feb10.022019.7846 at reed.edu>, zeke at reed.edu (Zeke Koch) writes:
> Does anyone know anything about Donchin's work on Computer-Human
> Interaction?  I saw the following article in the Times today.
>
> --------
> THE NEW YORK TIMES, TUESDAY, FEBRUARY 9, 1993
>
> Computers Are Starting to Take Humans' Wishes as Their Commands
>
> By ANDREW POLLACK
>
> ATSUGI, Japan
>
>   People now control computers with a keyboard, a
> mouse or, in some cases, with spoken commands. But at Japan's largest
> computer company, Fujitsu Ltd., and at several other laboratories
> around the world, researchers are developing ways to control a
> computer by merely thinking a command.
>
> A New York State Department of Health research team has developed a
> system that allows users, after some training, to move a cursor slowly
> up and down or side to side on a computer screen by mental action
> alone.  University of Illinois psychologists developed a way of
> allowing people to type, albeit at a rate of only 2.3 characters a
> minute, by spelling out words in their minds.
>
> And at the research laboratories of the Nippon Telegraph and Telephone
> Corporation, Japan's main telephone company, researchers have devised
> techniques to tell from brain waves, with a fair degree of accuracy,
> the direction a person will move a joystick. A similar project is
> under way at Graz University of Technology in Austria.
>
> "This is no parapsychological exercise," said Emanuel Donchin, a
> professor of psychology at the University of Illinois who led the
> development of the thought-controlled typewriter.  Rather, such
> mind-over-cursor techniques work by having computers analyze electric
> signals emitted by the brain as it works. The signals are collected by
> electroencephalography, or EEG, a technique that involves attaching
> electrodes to the scalp. It has long been used to diagnose brain
> disorders.
>
> Complete human brain-computer interaction is certainly decades away
> and might never move beyond science fiction. But in the next decade,
> practical if limited systems for helping severely handicapped people
> communicate or operate appliances are seen as feasible. A related
> technique in which electrical signals to the muscles are detected and
> analyzed is also being explored to help paralyzed people operate
> artificial arms or legs.
>
> "If we can use a computer without even uttering a sound, it would be
> easier," said Norio Fujimaki, one of the three researchers
> participating in an experimental program on thought-driven computers
> at Fujitsu's research laboratory in this city near Yokohama.
>
> Attempts to develop thought input for computers began in the 1970's
> with the "biocybernetics" program financed by the United States
> Defense Department. One goal was to enable a computer to determine the
> state of mind of a fighter pilot so it could better assist him in
> operating the plane, said Professor Donchin, who was involved in the
> work.



Well, now that the Cold War is over, you should be aware that Dr. Bechtereva
and her staff at the Leningrad Military Hospital (Russia) carried out
factor analysis of neuronal multiple-unit activity and actually succeeded
in decoding phonemes, thoughts, etc., from electrodes implanted in *volunteers*.
This work was done over 16 years ago and was highly classified.

Later work (not in the Soviet Union) enabled surface electrodes to decode what
the person *thought*.

For obvious reasons, I can't exactly elaborate on this.

Josh
backon at VMS.HUJI.AC.IL

>
> But the program was discontinued in the early 1980's, and since then
> work in this field, aimed mainly at medical uses, has been sporadic,
> hurt by shortages of financing and technical obstacles.  Research in
> this area often raises concerns about whether technology will be
> developed to read minds. But Professor Donchin and others say that
> most of the systems under development cannot eavesdrop on a person's
> thoughts.
>
> Indeed, for now and in the near future it is a major challenge to
> recognize from brain waves if a person means "yes" or "no," let alone
> to understand complex thoughts. That is because there is little
> understanding about the connection between any particular thought and
> the voltages emitted by brain cells.
>
> Moreover, any one signal may be drowned out by the signals from all
> the other brain activities going on at the same time.
>
> Don't Breathe, Please
>
> "It's difficult enough to have a speech recognition device, but there
> you know the language," said Erich Sutter, a senior scientist at the
> Smith-Kettlewell Eye Research Institute in San Francisco who developed
> a system using EEG that can tell where on a computer screen a person
> is looking. "With EEG signals, we really don't know the language the
> brain uses, and the brain may be doing all sorts of things unrelated
> to the thought you are trying to dig out."
>
> Consider the first efforts at thought input by Dr. Fujimaki of Fujitsu
> and his collaborator, Prof. Shinya Kuriki of Hokkaido University.
>
> A volunteer sitting in a chair would have 12 electrodes attached to
> his or her scalp. Because any movement, even blinking or looking at
> the scenery, would generate a brain signal 10 times larger than the
> one the researchers were trying to detect, subjects had their heads
> locked in one position with a special brace. They were told to stare
> at a black dot and to breathe, blink and swallow as little as
> possible.
>
> The subjects were told to say the sound "ah" in their mind, without
> actually voicing it, when they saw one color of flashing light, but
> not to say it when they saw another color. By averaging dozens of
> readings, Dr. Fujimaki could detect a difference in brain pattern when
> a person was mentally saying "ah."
>
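
The averaging step described here is ordinary evoked-response
averaging: the response of interest is tiny but time-locked to the
flash, while the background EEG is not, so averaging stimulus-aligned
trials cancels the background roughly as 1/sqrt(number of trials).
A rough sketch in Python (the array shapes and names are my
assumptions, not Fujitsu's):

    import numpy as np

    def average_evoked_response(trials):
        # trials: (n_trials, n_samples) EEG epochs, each aligned to
        # the flash. Background EEG is uncorrelated across trials and
        # shrinks under averaging; the time-locked response remains.
        return np.mean(trials, axis=0)

    # Compare the average over "silently say 'ah'" trials with the
    # average over "stay silent" trials to see the pattern difference.
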
> 'It's Far From Practical'
>
> But the need to take so many readings rules out the use of the
> technique for computer control. Ideally, a person would want to think
> the letter "a" only once and have it recognized. "In our experiment,
> 10 hours are required to communicate only one vowel," Dr. Fujimaki
> said. "It's far from practical communications."
>
> Other researchers have made more progress by using particular signals
> that are easier to detect and analyze.
>
> At the University of Illinois, Professor Donchin took advantage of
> what is known as the "oddball paradigm." When someone sees something
> that he or she has been waiting for but that occurs only rarely, the
> brain emits a detectable signal about three-tenths of a second later.
>
> To develop his brain-activated typewriter, Professor Donchin arranged
> the letters of the alphabet in rows and columns that were displayed on
> a computer screen. The rows and columns were flashed one by one in a
> random order. When either the row or the column containing the letter
> a person was thinking about flashed on the screen, the person's brain
> would emit the telltale signal. By knowing the row and column, the
> computer could then identify the proper letter.
>
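
The row/column trick turns letter selection into two detections: one
flash sequence pins down the row, another the column. A toy sketch of
that bookkeeping (the 6x6 grid and the detect_p300 classifier are my
stand-ins, not details from the article):

    import random

    GRID = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
            list("STUVWX"), list("YZ0123"), list("456789")]

    def spell_one_letter(detect_p300):
        # detect_p300(kind, index) -> True when the EEG shows the
        # "oddball" response ~300 ms after that row or column flashed.
        flashes = ([("row", i) for i in range(6)]
                   + [("col", j) for j in range(6)])
        random.shuffle(flashes)      # rows and columns flash in random order
        row = col = None
        for kind, idx in flashes:
            if detect_p300(kind, idx):
                if kind == "row":
                    row = idx
                else:
                    col = idx
        if row is None or col is None:
            return None              # no reliable detection this pass
        return GRID[row][col]

In practice a single pass is too noisy to trust, so the flashing is
repeated and the responses averaged, which is consistent with the
2.3-characters-a-minute rate quoted above.
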
> Disciplining the Brain
>
> At the New York State Department of Health's Wadsworth Center for
> Laboratories and Research in Albany, Dr. Jonathan R. Wolpaw and his
> colleagues get around the problem of having a computer try to guess
> what the brain is thinking. Their approach is to train the brain to
> emit signals that can be easily understood by a computer. "It's
> putting the task on the brain," Dr. Wolpaw said.
>
> Dr. Wolpaw's technique uses mu waves, which are rhythmic signals
> emitted by the brain's sensorimotor center when it is in idle mode.
> In Dr. Wolpaw's system, electrodes measure the amplitude of the mu
> waves and translate large amplitudes into an upward movement of the
> cursor and low amplitudes into a downward movement.
>
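
The translation step is simple once the mu amplitude is in hand: the
signal strength in the mu band over the sensorimotor area is mapped to
vertical cursor movement. A minimal sketch (the 8-12 Hz band edges,
baseline, and gain are my assumptions, not Dr. Wolpaw's parameters):

    import numpy as np

    MU_BAND = (8.0, 12.0)  # Hz; a typical mu-rhythm range (assumed)

    def mu_amplitude(epoch, fs):
        # Mean FFT magnitude in the mu band for one short EEG epoch
        # sampled at fs Hz over the sensorimotor area.
        spectrum = np.abs(np.fft.rfft(epoch))
        freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
        in_band = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
        return spectrum[in_band].mean()

    def cursor_step(epoch, fs, baseline, gain=1.0):
        # Large mu amplitude -> cursor moves up, small -> down,
        # per the mapping described in the article.
        return gain * (mu_amplitude(epoch, fs) - baseline)
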
> In one experiment, four of five subjects gradually learned to control
> their mu waves enough to move a cursor from the center of the screen
> to either the top or the bottom in about three seconds.
>
> Some subjects found that particular thoughts, say about weightlifting,
> would move the cursor down, while thoughts about relaxing moved the
> cursor up. After a while, such imagery was no longer needed, Dr.
> Wolpaw said.
>
> By using more detailed measurements of the mu rhythms, Dr. Wolpaw's
> team has recently succeeded in enabling people to move the cursor side
> to side as well as up or down. But people still cannot bring the
> cursor to a particular point and stop, a level of control needed to
> develop the mental equivalent of a computer's mouse.
>
> Akira Hiraiwa and his colleagues at Nippon Telegraph and Telephone
> have taken advantage of the fact that the brain emits certain voltages
> before an action is taken. They developed a pattern-matching
> computer known as a neural network that could tell the difference
> between signals corresponding to a left and right movement of a
> joystick. But it was difficult to have the system work fast enough to
> make the prediction before the movement occurred, although researchers
> in Austria, using a similar technique, say they can do this.
>
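
Stripped of the neural network, the NTT task is a two-class pattern
matcher: given a window of EEG recorded just before movement, predict
left versus right. A bare-bones stand-in (a single logistic unit
rather than NTT's actual network; every name here is hypothetical):

    import numpy as np

    def train_left_right(epochs, labels, lr=0.01, steps=1000):
        # epochs: (n_trials, n_features) flattened pre-movement EEG
        # windows; labels: array of 0 (left) and 1 (right).
        w, b = np.zeros(epochs.shape[1]), 0.0
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-(epochs @ w + b)))  # sigmoid
            grad = p - labels                            # logistic-loss gradient
            w -= lr * (epochs.T @ grad) / len(labels)
            b -= lr * grad.mean()
        return w, b

    def predict_direction(epoch, w, b):
        return "right" if epoch @ w + b > 0 else "left"

As the article notes, the hard part is not the classification itself
but doing it fast enough to beat the movement.
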
> Even for paralyzed people, brain control right now is still
> impractical, compared with other techniques that have been developed
> to allow people to control computers by eye movements or breath.
>
> In recent years, techniques have been developed that provide better
> images of the working brain than EEG does. Positron emission
> tomography and fast magnetic resonance imaging have provided pictures
> of the brain as it performs a function like recalling a word.
>
> Dr. Fujimaki of Fujitsu hopes to use extremely sensitive
> superconducting sensors to read the faint magnetic waves emitted by
> the brain. Such magnetic mea

